# Source: python/swagger_client/api/matrix_api.py (byung90/graphhopper @ 443891657e20cd1393675b1d4a1b9f38c691cdff, Apache-2.0)
# coding: utf-8
"""
GraphHopper Directions API
    You use the GraphHopper Directions API to add route planning, navigation and route optimization to your software. E.g. the Routing API has turn instructions and elevation data, and the Route Optimization API solves your logistics problems and supports various constraints like time window and capacity restrictions. It is also possible to get all distances between all locations with our fast Matrix API.  # noqa: E501
OpenAPI spec version: 1.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from swagger_client.api_client import ApiClient
class MatrixApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def matrix_get(self, key, **kwargs): # noqa: E501
"""Matrix API # noqa: E501
        The Matrix API is part of the GraphHopper Directions API. With this API you can calculate many-to-many distances, times or routes much more efficiently than calling the Routing API multiple times. In the Routing API we support multiple points, so-called 'via points', which results in one route being calculated. The Matrix API calculates NxM routes, or more precisely NxM weights, distances or times, and is a lot faster than NxM single requests. The simplest example is a tourist deciding which pizzeria is closest: instead of using beeline distance, she can calculate a 1x4 matrix. Another is a delivery service that often needs big NxN matrices to solve vehicle routing problems; e.g. the GraphHopper Route Optimization API uses the Matrix API under the hood.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.matrix_get(key, async=True)
>>> result = thread.get()
:param async bool
:param str key: Get your key at graphhopper.com (required)
        :param list[str] point: Specify multiple points for which the weight-, route-, time- or distance-matrix should be calculated. In this case the starts are identical to the destinations. If there are N points, then NxN entries will be calculated. The order of the point parameter is important. Specify at least three points. Cannot be used together with from_point or to_point. Is a string with the format latitude,longitude.
:param list[str] from_point: The starting points for the routes. E.g. if you want to calculate the three routes A->1, A->2, A->3 then you have one from_point parameter and three to_point parameters. Is a string with the format latitude,longitude.
:param list[str] to_point: The destination points for the routes. Is a string with the format latitude,longitude.
:param list[str] point_hint: Optional parameter. Specifies a hint for each `point` parameter to prefer a certain street for the closest location lookup. E.g. if there is an address or house with two or more neighboring streets you can control for which street the closest location is looked up.
:param list[str] from_point_hint: For the from_point parameter. See point_hint
:param list[str] to_point_hint: For the to_point parameter. See point_hint
:param list[str] out_array: Specifies which arrays should be included in the response. Specify one or more of the following options 'weights', 'times', 'distances'. To specify more than one array use e.g. out_array=times&out_array=distances. The units of the entries of distances are meters, of times are seconds and of weights is arbitrary and it can differ for different vehicles or versions of this API.
        :param str vehicle: The vehicle for which the route should be calculated. Other vehicles are foot, small_truck, etc.
:return: MatrixResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.matrix_get_with_http_info(key, **kwargs) # noqa: E501
else:
(data) = self.matrix_get_with_http_info(key, **kwargs) # noqa: E501
return data
def matrix_get_with_http_info(self, key, **kwargs): # noqa: E501
"""Matrix API # noqa: E501
        The Matrix API is part of the GraphHopper Directions API. With this API you can calculate many-to-many distances, times or routes much more efficiently than calling the Routing API multiple times. In the Routing API we support multiple points, so-called 'via points', which results in one route being calculated. The Matrix API calculates NxM routes, or more precisely NxM weights, distances or times, and is a lot faster than NxM single requests. The simplest example is a tourist deciding which pizzeria is closest: instead of using beeline distance, she can calculate a 1x4 matrix. Another is a delivery service that often needs big NxN matrices to solve vehicle routing problems; e.g. the GraphHopper Route Optimization API uses the Matrix API under the hood.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.matrix_get_with_http_info(key, async=True)
>>> result = thread.get()
:param async bool
:param str key: Get your key at graphhopper.com (required)
        :param list[str] point: Specify multiple points for which the weight-, route-, time- or distance-matrix should be calculated. In this case the starts are identical to the destinations. If there are N points, then NxN entries will be calculated. The order of the point parameter is important. Specify at least three points. Cannot be used together with from_point or to_point. Is a string with the format latitude,longitude.
:param list[str] from_point: The starting points for the routes. E.g. if you want to calculate the three routes A->1, A->2, A->3 then you have one from_point parameter and three to_point parameters. Is a string with the format latitude,longitude.
:param list[str] to_point: The destination points for the routes. Is a string with the format latitude,longitude.
:param list[str] point_hint: Optional parameter. Specifies a hint for each `point` parameter to prefer a certain street for the closest location lookup. E.g. if there is an address or house with two or more neighboring streets you can control for which street the closest location is looked up.
:param list[str] from_point_hint: For the from_point parameter. See point_hint
:param list[str] to_point_hint: For the to_point parameter. See point_hint
:param list[str] out_array: Specifies which arrays should be included in the response. Specify one or more of the following options 'weights', 'times', 'distances'. To specify more than one array use e.g. out_array=times&out_array=distances. The units of the entries of distances are meters, of times are seconds and of weights is arbitrary and it can differ for different vehicles or versions of this API.
        :param str vehicle: The vehicle for which the route should be calculated. Other vehicles are foot, small_truck, etc.
:return: MatrixResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['key', 'point', 'from_point', 'to_point', 'point_hint', 'from_point_hint', 'to_point_hint', 'out_array', 'vehicle'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method matrix_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'key' is set
if ('key' not in params or
params['key'] is None):
raise ValueError("Missing the required parameter `key` when calling `matrix_get`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'point' in params:
query_params.append(('point', params['point'])) # noqa: E501
collection_formats['point'] = 'multi' # noqa: E501
if 'from_point' in params:
query_params.append(('from_point', params['from_point'])) # noqa: E501
collection_formats['from_point'] = 'multi' # noqa: E501
if 'to_point' in params:
query_params.append(('to_point', params['to_point'])) # noqa: E501
collection_formats['to_point'] = 'multi' # noqa: E501
if 'point_hint' in params:
query_params.append(('point_hint', params['point_hint'])) # noqa: E501
collection_formats['point_hint'] = 'multi' # noqa: E501
if 'from_point_hint' in params:
query_params.append(('from_point_hint', params['from_point_hint'])) # noqa: E501
collection_formats['from_point_hint'] = 'multi' # noqa: E501
if 'to_point_hint' in params:
query_params.append(('to_point_hint', params['to_point_hint'])) # noqa: E501
collection_formats['to_point_hint'] = 'multi' # noqa: E501
if 'out_array' in params:
query_params.append(('out_array', params['out_array'])) # noqa: E501
collection_formats['out_array'] = 'multi' # noqa: E501
if 'vehicle' in params:
query_params.append(('vehicle', params['vehicle'])) # noqa: E501
if 'key' in params:
query_params.append(('key', params['key'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/matrix', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MatrixResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def matrix_post(self, key, **kwargs): # noqa: E501
"""Matrix API Post # noqa: E501
        The GET request has a URL length limitation, which becomes a problem with many locations per request. In those cases use an HTTP POST request with JSON data as input. The only parameter that stays in the URL is the key. Both request scenarios are identical except that all singular parameter names become plural for a POST request.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.matrix_post(key, async=True)
>>> result = thread.get()
:param async bool
:param str key: Get your key at graphhopper.com (required)
:param MatrixRequest body:
:return: MatrixResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async'):
return self.matrix_post_with_http_info(key, **kwargs) # noqa: E501
else:
(data) = self.matrix_post_with_http_info(key, **kwargs) # noqa: E501
return data
def matrix_post_with_http_info(self, key, **kwargs): # noqa: E501
"""Matrix API Post # noqa: E501
        The GET request has a URL length limitation, which becomes a problem with many locations per request. In those cases use an HTTP POST request with JSON data as input. The only parameter that stays in the URL is the key. Both request scenarios are identical except that all singular parameter names become plural for a POST request.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async=True
>>> thread = api.matrix_post_with_http_info(key, async=True)
>>> result = thread.get()
:param async bool
:param str key: Get your key at graphhopper.com (required)
:param MatrixRequest body:
:return: MatrixResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['key', 'body'] # noqa: E501
all_params.append('async')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method matrix_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'key' is set
if ('key' not in params or
params['key'] is None):
raise ValueError("Missing the required parameter `key` when calling `matrix_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'key' in params:
query_params.append(('key', params['key'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/matrix', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MatrixResponse', # noqa: E501
auth_settings=auth_settings,
async=params.get('async'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
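Both methods above hand their `query_params` and `collection_formats` to `ApiClient.call_api`, where list-valued parameters marked `'multi'` are serialized by repeating the key once per value (as in the `out_array=times&out_array=distances` example in the docstrings). A minimal stdlib sketch of that expansion, independent of the generated client (the coordinate values and `YOUR_API_KEY` are placeholders):

```python
from urllib.parse import urlencode

# Query parameters as assembled by matrix_get_with_http_info;
# list-valued entries use the 'multi' collection format.
query_params = [
    ("point", ["51.236,13.395", "50.27,10.6", "49.97,11.5"]),
    ("out_array", ["times", "distances"]),
    ("vehicle", "car"),
    ("key", "YOUR_API_KEY"),
]

# doseq=True repeats a key once per list element, matching 'multi'
query_string = urlencode(query_params, doseq=True)
print(query_string)
```

Scalar values (`vehicle`, `key`) appear once; the three `point` values become three `point=` pairs.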
# Source: google_play_scraper/utils/__init__.py (shikher-chhawchharia/google-play-scraper @ 31dfd4df46911e3489a61d50ea098d06c79e8353, MIT)
def nested_lookup(source, indexes):
    if len(indexes) == 1:
        return source[indexes[0]]
    return nested_lookup(source[indexes[0]], indexes[1:])
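The recursion peels one index per call, so the same helper drills through arbitrarily mixed dicts and lists. A quick illustration (the function is restated so the sketch runs on its own):

```python
def nested_lookup(source, indexes):
    # Base case: one index left, do a direct lookup
    if len(indexes) == 1:
        return source[indexes[0]]
    # Recurse into the sub-structure selected by the first index
    return nested_lookup(source[indexes[0]], indexes[1:])

# Works uniformly across dict keys and list positions:
payload = {"app": {"scores": [4.5, 4.7, 4.8]}}
value = nested_lookup(payload, ["app", "scores", 1])
print(value)  # 4.7
```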
# Source: tests/test_build.py (ofek/hatch-vcs @ c5388d67192d9bf88191927a35f51705121784a1, MIT)
# SPDX-FileCopyrightText: 2022-present Ofek Lev <oss@ofek.dev>
#
# SPDX-License-Identifier: MIT
import os
import sys
import zipfile
import pytest
from .utils import build_project, read_file
def test_basic(new_project_basic):
build_project('-t', 'wheel')
build_dir = os.path.join(new_project_basic, 'dist')
assert os.path.isdir(build_dir)
artifacts = os.listdir(build_dir)
assert len(artifacts) == 1
wheel_file = artifacts[0]
assert wheel_file == 'my_app-1.2.3-py2.py3-none-any.whl'
extraction_directory = os.path.join(os.path.dirname(new_project_basic), '_archive')
os.mkdir(extraction_directory)
with zipfile.ZipFile(os.path.join(build_dir, wheel_file), 'r') as zip_archive:
zip_archive.extractall(extraction_directory)
metadata_directory = os.path.join(extraction_directory, 'my_app-1.2.3.dist-info')
assert os.path.isdir(metadata_directory)
package_directory = os.path.join(extraction_directory, 'my_app')
assert os.path.isdir(package_directory)
assert len(os.listdir(package_directory)) == 4
assert os.path.isfile(os.path.join(package_directory, '__init__.py'))
assert os.path.isfile(os.path.join(package_directory, 'foo.py'))
assert os.path.isfile(os.path.join(package_directory, 'bar.py'))
assert os.path.isfile(os.path.join(package_directory, 'baz.py'))
def test_write(new_project_write):
build_project('-t', 'wheel')
build_dir = os.path.join(new_project_write, 'dist')
assert os.path.isdir(build_dir)
artifacts = os.listdir(build_dir)
assert len(artifacts) == 1
wheel_file = artifacts[0]
assert wheel_file == 'my_app-1.2.3-py2.py3-none-any.whl'
extraction_directory = os.path.join(os.path.dirname(new_project_write), '_archive')
os.mkdir(extraction_directory)
with zipfile.ZipFile(os.path.join(build_dir, wheel_file), 'r') as zip_archive:
zip_archive.extractall(extraction_directory)
metadata_directory = os.path.join(extraction_directory, 'my_app-1.2.3.dist-info')
assert os.path.isdir(metadata_directory)
package_directory = os.path.join(extraction_directory, 'my_app')
assert os.path.isdir(package_directory)
assert len(os.listdir(package_directory)) == 5
assert os.path.isfile(os.path.join(package_directory, '__init__.py'))
assert os.path.isfile(os.path.join(package_directory, 'foo.py'))
assert os.path.isfile(os.path.join(package_directory, 'bar.py'))
assert os.path.isfile(os.path.join(package_directory, 'baz.py'))
version_file = os.path.join(package_directory, '_version.py')
assert os.path.isfile(version_file)
lines = read_file(version_file).splitlines()
assert lines[3] == "version = '1.2.3'"
@pytest.mark.skipif(sys.version_info[0] == 2, reason='Depends on fix in 6.4.0 which is Python 3-only')
def test_fallback(new_project_fallback):
build_project('-t', 'wheel')
build_dir = os.path.join(new_project_fallback, 'dist')
assert os.path.isdir(build_dir)
artifacts = os.listdir(build_dir)
assert len(artifacts) == 1
wheel_file = artifacts[0]
assert wheel_file == 'my_app-7.8.9-py2.py3-none-any.whl'
extraction_directory = os.path.join(os.path.dirname(new_project_fallback), '_archive')
os.mkdir(extraction_directory)
with zipfile.ZipFile(os.path.join(build_dir, wheel_file), 'r') as zip_archive:
zip_archive.extractall(extraction_directory)
metadata_directory = os.path.join(extraction_directory, 'my_app-7.8.9.dist-info')
assert os.path.isdir(metadata_directory)
package_directory = os.path.join(extraction_directory, 'my_app')
assert os.path.isdir(package_directory)
assert len(os.listdir(package_directory)) == 4
assert os.path.isfile(os.path.join(package_directory, '__init__.py'))
assert os.path.isfile(os.path.join(package_directory, 'foo.py'))
assert os.path.isfile(os.path.join(package_directory, 'bar.py'))
assert os.path.isfile(os.path.join(package_directory, 'baz.py'))
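Each test above follows the same build-extract-assert pattern. The extraction step in isolation, as a stand-alone sketch (a tiny hand-made archive replaces the real wheel, and the `_demo` paths are illustrative assumptions):

```python
import os
import zipfile

# Stand-in for the 'dist/my_app-1.2.3-py2.py3-none-any.whl' built by hatch
os.makedirs("_demo", exist_ok=True)
wheel_path = os.path.join("_demo", "my_app-1.2.3-py2.py3-none-any.whl")
with zipfile.ZipFile(wheel_path, "w") as zf:
    zf.writestr("my_app/__init__.py", "")
    zf.writestr("my_app-1.2.3.dist-info/METADATA", "Name: my_app")

# The extraction pattern used by every test in this module
extraction_directory = os.path.join("_demo", "_archive")
os.makedirs(extraction_directory, exist_ok=True)
with zipfile.ZipFile(wheel_path, "r") as zip_archive:
    zip_archive.extractall(extraction_directory)

print(sorted(os.listdir(extraction_directory)))
```

The assertions then only need `os.path.isdir` / `os.path.isfile` checks against the extracted layout.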
# Source: flymazerl/agents/neuralnetworks.py (neurorishika/FlYMazeRL @ 8663c9a49a19acba6df3e713b4cbf2e4514edd3e, BSD-3-Clause)
import numpy as np
import os
import torch
import torch.nn as nn
from flymazerl.agents.base import FlYMazeAgent
import time
"""
+++++++++++++++++++++++++++++++
Recurrent Neural Network Agents
+++++++++++++++++++++++++++++++
"""
class VanillaRNN(nn.Module):
"""
A class to create a typical Recurrent Neural Network (RNN) with the option to use an intermediate Linear encoder.
"""
def __init__(
self,
input_size,
state_size,
output_size,
num_layers,
use_intermediate_encoder=False,
encoder_size=None,
allow_negative_values=True,
symmetric=False,
device="cpu",
):
"""
Initialize the RNN
==================
Parameters:
input_size: The number of inputs to the RNN (int)
state_size: The number of hidden units in the RNN (int)
output_size: The number of outputs from the RNN (int)
num_layers: The number of layers in the RNN (int)
use_intermediate_encoder: Whether or not to use an intermediate encoder (bool)
encoder_size: The number of units in the intermediate encoder (int)
allow_negative_values: Whether or not to allow negative values in the output (bool)
symmetric: Whether or not to use a symmetric RNN (bool)
device: The device to use for the RNN (str)
"""
super(VanillaRNN, self).__init__()
self.state_size = state_size
self.input_size = input_size
self.output_size = output_size
self.num_layers = num_layers
if use_intermediate_encoder:
self.encoder = nn.Linear(input_size, encoder_size)
self.encoder_size = encoder_size
self.rnn = nn.RNN(encoder_size, state_size, num_layers, batch_first=True)
else:
self.encoder = None
self.encoder_size = None
self.rnn = nn.RNN(input_size, state_size, num_layers, batch_first=True)
self.decoder = nn.Linear(state_size, output_size)
self.device = device
self.allow_negative_values = allow_negative_values
self.symmetric = symmetric
def forward(self, inputs, hidden_state):
"""
Forward pass through the RNN
=============================
Parameters:
inputs: The inputs to the RNN (torch.Tensor)
hidden_state: The hidden state of the RNN (torch.Tensor)
Returns:
output: The output of the RNN (torch.Tensor)
"""
x = inputs
x_ = inputs * torch.tensor([[-1, 1]]).to(self.device)
if self.encoder is not None:
output, _ = self.rnn(self.encoder(x), hidden_state)
output_, _ = self.rnn(self.encoder(x_), hidden_state)
else:
output, _ = self.rnn(x, hidden_state)
output_, _ = self.rnn(x_, hidden_state)
output = self.decoder(output)
output_ = self.decoder(output_)
        if self.symmetric:
            output = (output + torch.flip(output_, dims=[2])) / 2
        if not self.allow_negative_values:
            output = torch.sigmoid(output)
return output
def init_hidden(self, batch_size):
"""
Initialize the hidden state of the RNN
======================================
Parameters:
batch_size: The batch size of the RNN (int)
Returns:
hidden_state: The hidden state of the RNN (torch.Tensor)
"""
return torch.zeros(self.num_layers, batch_size, self.state_size).to(self.device)
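The `symmetric` branch of `forward` averages the network's output with the output for the sign-mirrored input, flipped along the action axis. A NumPy sketch checks the equivariance this buys: mirroring the input mirrors the output. Here `f`, `W`, and the seed are illustrative assumptions standing in for the trained RNN plus decoder.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2))  # arbitrary stand-in for RNN + decoder weights

def f(x):
    return np.tanh(W @ x)

def symmetrized(x):
    # Mirror the sign of the first input channel (cf. torch.tensor([[-1, 1]])),
    # flip the two outputs back (cf. torch.flip(..., dims=[2])), then average.
    x_mirrored = x * np.array([-1.0, 1.0])
    return 0.5 * (f(x) + f(x_mirrored)[::-1])

x = rng.normal(size=2)
mirrored_out = symmetrized(x * np.array([-1.0, 1.0]))
assert np.allclose(mirrored_out, symmetrized(x)[::-1])
print("mirror-equivariance holds")
```

The identity holds for any `f`, which is why the same symmetrization wrapper appears verbatim in the RNN, LSTM, and GRU classes.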
class LSTMNN(nn.Module):
"""
A class to create a Long Short-Term Memory (LSTM) RNN with the option to use an intermediate Linear encoder.
"""
def __init__(
self,
input_size,
state_size,
output_size,
num_layers,
use_intermediate_encoder=False,
encoder_size=None,
allow_negative_values=True,
symmetric=False,
device="cpu",
):
"""
Initialize the LSTM
===================
Parameters:
input_size: The number of inputs to the LSTM (int)
state_size: The number of hidden units in the LSTM (int)
output_size: The number of outputs from the LSTM (int)
num_layers: The number of layers in the LSTM (int)
use_intermediate_encoder: Whether or not to use an intermediate encoder (bool)
encoder_size: The number of units in the intermediate encoder (int)
            allow_negative_values: Whether or not to allow negative values in the output (bool)
symmetric: Whether or not to use a symmetric LSTM (bool)
device: The device to use for the LSTM (str)
"""
super(LSTMNN, self).__init__()
self.state_size = state_size
self.input_size = input_size
self.output_size = output_size
self.num_layers = num_layers
if use_intermediate_encoder:
self.encoder = nn.Linear(input_size, encoder_size)
self.encoder_size = encoder_size
self.rnn = nn.LSTM(encoder_size, state_size, num_layers, batch_first=True)
else:
self.encoder = None
self.encoder_size = None
self.rnn = nn.LSTM(input_size, state_size, num_layers, batch_first=True)
self.decoder = nn.Linear(state_size, output_size)
self.device = device
self.allow_negative_values = allow_negative_values
self.symmetric = symmetric
def forward(self, inputs, hidden_state):
"""
Forward pass through the LSTM
=============================
Parameters:
inputs: The inputs to the LSTM (torch.Tensor)
hidden_state: The hidden state of the LSTM (torch.Tensor)
Returns:
output: The output of the LSTM (torch.Tensor)
"""
x = inputs
x_ = inputs * torch.tensor([[-1, 1]]).to(self.device)
if self.encoder is not None:
output, _ = self.rnn(self.encoder(x), hidden_state)
output_, _ = self.rnn(self.encoder(x_), hidden_state)
else:
output, _ = self.rnn(x, hidden_state)
output_, _ = self.rnn(x_, hidden_state)
output = self.decoder(output)
output_ = self.decoder(output_)
        if self.symmetric:
            output = (output + torch.flip(output_, dims=[2])) / 2
        if not self.allow_negative_values:
            output = torch.sigmoid(output)
return output
def init_hidden(self, batch_size):
"""
Initialize the hidden state of the LSTM
=======================================
Parameters:
batch_size: The batch size of the LSTM (int)
Returns:
hidden_state: The hidden state of the LSTM (torch.Tensor)
"""
return (
torch.zeros(self.num_layers, batch_size, self.state_size).to(self.device),
torch.zeros(self.num_layers, batch_size, self.state_size).to(self.device),
)
class GRUNN(nn.Module):
"""
A class to create a Gated Recurrent Unit (GRU) RNN with the option to use an intermediate Linear encoder.
"""
def __init__(
self,
input_size,
state_size,
output_size,
num_layers,
use_intermediate_encoder=False,
encoder_size=None,
allow_negative_values=True,
symmetric=False,
device="cpu",
):
"""
Initialize the GRU
==================
Parameters:
input_size: The number of inputs to the GRU (int)
state_size: The number of hidden units in the GRU (int)
output_size: The number of outputs from the GRU (int)
num_layers: The number of layers in the GRU (int)
use_intermediate_encoder: Whether or not to use an intermediate encoder (bool)
encoder_size: The number of units in the intermediate encoder (int)
allow_negative_values: Whether or not to allow negative values in the output (bool)
        symmetric: Whether or not to enforce mirror-symmetric outputs by averaging over action-flipped inputs (bool)
device: The device to use for the GRU (str)
"""
super(GRUNN, self).__init__()
self.state_size = state_size
self.input_size = input_size
self.output_size = output_size
self.num_layers = num_layers
if use_intermediate_encoder:
self.encoder = nn.Linear(input_size, encoder_size)
self.encoder_size = encoder_size
self.rnn = nn.GRU(encoder_size, state_size, num_layers, batch_first=True)
else:
self.encoder = None
self.encoder_size = None
self.rnn = nn.GRU(input_size, state_size, num_layers, batch_first=True)
self.decoder = nn.Linear(state_size, output_size)
self.device = device
self.allow_negative_values = allow_negative_values
self.symmetric = symmetric
def forward(self, inputs, hidden_state):
"""
Forward pass through the GRU
============================
Parameters:
inputs: The inputs to the GRU (torch.Tensor)
hidden_state: The hidden state of the GRU (torch.Tensor)
        Returns:
        output: The output of the GRU (torch.Tensor)
"""
x = inputs
x_ = inputs * torch.tensor([[-1, 1]]).to(self.device)
if self.encoder is not None:
output, _ = self.rnn(self.encoder(x), hidden_state)
output_, _ = self.rnn(self.encoder(x_), hidden_state)
else:
output, _ = self.rnn(x, hidden_state)
output_, _ = self.rnn(x_, hidden_state)
output = self.decoder(output)
output_ = self.decoder(output_)
        if self.symmetric:
            output = (output + torch.flip(output_, dims=[2])) / 2
        if not self.allow_negative_values:
            output = torch.sigmoid(output)
        return output
def init_hidden(self, batch_size):
"""
Initialize the hidden state of the GRU
======================================
Parameters:
batch_size: The batch size of the GRU (int)
Returns:
hidden_state: The hidden state of the GRU (torch.Tensor)
"""
return torch.zeros(self.num_layers, batch_size, self.state_size).to(self.device)
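# A standalone sketch (hypothetical stand-in decoder `dec`, not the classes
# above) of the symmetrization trick used when symmetric=True: averaging the
# decoder output on the original input with the action-flipped decoder output
# on the mirrored input makes the result exactly equivariant under swapping
# the two actions.
#
# ```python
# import torch
# import torch.nn as nn
#
# torch.manual_seed(0)
# dec = nn.Linear(2, 2)                 # stand-in for any decoder
# x = torch.randn(3, 5, 2)              # (batch, trials, [action, reward])
# x_ = x * torch.tensor([[-1.0, 1.0]])  # mirror the action channel
#
# # Symmetrized output, as in the symmetric branch of forward():
# sym = lambda a, b: (dec(a) + torch.flip(dec(b), dims=[2])) / 2
# y = sym(x, x_)
# # Mirroring the input mirrors the output exactly:
# assert torch.allclose(y, torch.flip(sym(x_, x), dims=[2]))
# ```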
class GRNNLearner(FlYMazeAgent):
def init_variables(
self,
reservoir_size,
num_layers,
symmetric_q_function=True,
allow_negative_values=True,
omission_is_punishment=False,
encoder_size=None,
kind="RNN",
policy_type="softmax",
device="cpu",
pre_trained=False,
model_path=None,
multi_agent=False,
n_agents=1,
):
"""
Initialize the variables of a Generalized RNN Learner
=====================================================
Parameters:
reservoir_size: The number of units in the RNN hidden layer (int)
num_layers: The number of layers in the RNN (int)
symmetric_q_function: Whether or not to use a symmetric decision network (bool)
allow_negative_values: Whether or not to allow negative values in the RNN (bool)
        omission_is_punishment: Whether reward omission is encoded as punishment (-1) rather than as no reward (0) (bool)
encoder_size: The number of units in the intermediate encoder (int)
kind: The type of RNN to use (str) (options: "RNN", "LSTM", "GRU")
        policy_type: The type of policy to use (str) (options: "softmax", "greedy", "acceptreject")
device: The device to use for the RNN (str) (options: "cpu", "cuda")
pre_trained: Whether or not to load a pre-trained model (bool)
model_path: The path to the pre-trained model (str)
multi_agent: Whether or not to simulate multiple agents (bool)
n_agents: The number of agents to simulate (int)
"""
self.reservoir_size = reservoir_size
self.num_layers = num_layers
self.encoder_size = encoder_size
self.device = device
self.symmetric_q_function = symmetric_q_function
self.allow_negative_values = allow_negative_values
self.omission_is_punishment = omission_is_punishment
self.kind = kind
if pre_trained:
assert (
model_path is not None
), "If you want to use a pre-trained model, you need to specify the path to the model."
if kind == "RNN":
self.agent = VanillaRNN(
2,
self.reservoir_size,
self.action_space_size,
self.num_layers,
use_intermediate_encoder=self.encoder_size is not None,
encoder_size=self.encoder_size,
allow_negative_values=self.allow_negative_values,
symmetric=self.symmetric_q_function,
).to(self.device)
elif kind == "LSTM":
self.agent = LSTMNN(
2,
self.reservoir_size,
self.action_space_size,
self.num_layers,
use_intermediate_encoder=self.encoder_size is not None,
encoder_size=self.encoder_size,
allow_negative_values=self.allow_negative_values,
symmetric=self.symmetric_q_function,
).to(self.device)
elif kind == "GRU":
self.agent = GRUNN(
2,
self.reservoir_size,
self.action_space_size,
self.num_layers,
use_intermediate_encoder=self.encoder_size is not None,
encoder_size=self.encoder_size,
allow_negative_values=self.allow_negative_values,
symmetric=self.symmetric_q_function,
).to(self.device)
else:
raise ValueError("Unknown RNN kind: {}".format(kind))
if pre_trained:
self.agent.load_state_dict(torch.load(model_path))
self.policy_type = policy_type
self.multi_agent = multi_agent
self.n_agents = n_agents
if multi_agent:
self.history = torch.zeros((n_agents, 1, 2)).to(self.device)
self.bias = np.zeros(n_agents)
else:
self.history = torch.zeros((1, 1, 2)).to(self.device)
def trial_step(self, state):
"""
Take a step in the environment given the current state
======================================================
Parameters:
state: The current state of the environment (np.array)
Returns:
action: The action to take in the environment (int)
"""
        hidden = self.agent.init_hidden(self.n_agents)  # already on self.device; for LSTMs this is an (h, c) tuple, so .to() must not be called on it
action_logits = self.agent(self.history.float(), hidden)
if self.multi_agent:
if self.policy_type == "softmax":
action_probabilities = action_logits.softmax(dim=2)[:, -1, :]
action = torch.multinomial(action_probabilities, 1)
elif self.policy_type == "greedy":
action_probabilities = action_logits[:, -1, :]
action = action_probabilities.argmax(dim=1)
            elif self.policy_type == "acceptreject":
                # Map the two action values to accept/reject choice probabilities;
                # the ratio is computed in log space for numerical stability.
                action_probabilities = torch.exp(
torch.stack(
[
torch.log(action_logits[:, :, 0])
+ torch.log(3 - action_logits[:, :, 1])
- torch.log(
3 * action_logits[:, :, 1]
+ 3 * action_logits[:, :, 0]
- 2 * action_logits[:, :, 0] * action_logits[:, :, 1]
),
torch.log(action_logits[:, :, 1])
+ torch.log(3 - action_logits[:, :, 0])
- torch.log(
3 * action_logits[:, :, 1]
+ 3 * action_logits[:, :, 0]
- 2 * action_logits[:, :, 0] * action_logits[:, :, 1]
),
],
dim=2,
)
)[:, -1, :]
action = torch.multinomial(action_probabilities, 1)
else:
raise ValueError("Unknown policy type: {}".format(self.policy_type))
new_state, reward, done, _ = self.env.step(action)
reward = torch.tensor(reward)
if self.omission_is_punishment:
self.history = torch.cat(
[
self.history,
torch.concat([action.view(-1, 1) * 2 - 1, reward.view(-1, 1) * 2 - 1], axis=1).unsqueeze(1),
],
axis=1,
)
self.action_history = (self.history.cpu().detach().numpy()[:, 1:, 0] + 1) / 2
self.reward_history = (self.history.cpu().detach().numpy()[:, 1:, 1] + 1) / 2
else:
self.history = torch.cat(
[
self.history,
torch.concat([action.view(-1, 1) * 2 - 1, reward.view(-1, 1)], axis=1).unsqueeze(1),
],
axis=1,
)
self.action_history = (self.history.cpu().detach().numpy()[:, 1:, 0] + 1) / 2
self.reward_history = self.history.cpu().detach().numpy()[:, 1:, 1]
self.bias += (action == self.biased_action).cpu().detach().numpy() # update bias estimate
else:
if self.policy_type == "softmax":
action_probabilities = action_logits.softmax(dim=2).squeeze(0)[-1]
action = torch.multinomial(action_probabilities, 1).item()
elif self.policy_type == "greedy":
action_probabilities = action_logits.squeeze(0)[-1]
action = action_probabilities.argmax().item()
elif self.policy_type == "acceptreject":
action_probabilities = torch.exp(
torch.stack(
[
torch.log(action_logits[:, :, 0])
+ torch.log(3 - action_logits[:, :, 1])
- torch.log(
3 * action_logits[:, :, 1]
+ 3 * action_logits[:, :, 0]
- 2 * action_logits[:, :, 0] * action_logits[:, :, 1]
),
torch.log(action_logits[:, :, 1])
+ torch.log(3 - action_logits[:, :, 0])
- torch.log(
3 * action_logits[:, :, 1]
+ 3 * action_logits[:, :, 0]
- 2 * action_logits[:, :, 0] * action_logits[:, :, 1]
),
],
dim=2,
)
).squeeze(0)[-1]
action = torch.multinomial(action_probabilities, 1).item()
else:
raise ValueError("Unknown policy type: {}".format(self.policy_type))
new_state, reward, done, _ = self.env.step(action)
if self.omission_is_punishment:
self.history = torch.cat(
(self.history, torch.tensor([[[action * 2 - 1, reward * 2 - 1]]]).to(self.device)), dim=1
)
self.action_history = (self.history.squeeze(0).cpu().detach().numpy()[1:, 0] + 1) / 2
self.reward_history = (self.history.squeeze(0).cpu().detach().numpy()[1:, 1] + 1) / 2
else:
self.history = torch.cat(
(self.history, torch.tensor([[[action * 2 - 1, reward]]]).to(self.device)), dim=1
)
self.action_history = (self.history.squeeze(0).cpu().detach().numpy()[1:, 0] + 1) / 2
self.reward_history = self.history.squeeze(0).cpu().detach().numpy()[1:, 1]
if action == self.biased_action:
self.bias += 1 # update bias estimate
return new_state, done
def get_q_history(self):
"""
Get the history of the values
=================================
Returns:
q_history: The history of the values (np.array)
"""
        hidden = self.agent.init_hidden(self.history.shape[0])
        if self.multi_agent:
            return self.agent.forward(self.history.float(), hidden)[:, 1:-1, :].cpu().detach().numpy()
        else:
            return self.agent.forward(self.history.float(), hidden)[:, 1:-1, :].squeeze(0).cpu().detach().numpy()
def run_episode(self):
"""
Run an episode of the environment
"""
state = self.env.reset() # reset environment
done = False
while not done:
state, done = self.trial_step(state) # trial step
def reset_variables(self):
"""
Reset the variables of the agent
"""
        if self.multi_agent:
            self.history = torch.zeros((self.n_agents, 1, 2)).to(self.device)
        else:
            self.history = torch.zeros((1, 1, 2)).to(self.device)
def get_action_probabilities_from_data(self, actions_set, rewards_set):
"""
Given a dataset, infer the action probabilities
===============================================
actions_set: The set of actions taken in the dataset (np.array)
rewards_set: The set of rewards obtained in the dataset (np.array)
Returns:
action_probabilities: The inferred action probabilities (np.array)
"""
dataset = torch.tensor(np.array([actions_set, rewards_set]).transpose((1, 2, 0)), dtype=torch.int32).to(
self.device
)
X = torch.clone(dataset[:, :-1, :])
if self.omission_is_punishment:
X = X * 2 - 1
else:
X[:, :, 0] = X[:, :, 0] * 2 - 1
hidden = self.agent.init_hidden(X.shape[0]).to(self.device)
logits = self.agent.forward(X.float(), hidden)
action_probabilities = logits.softmax(dim=2)[:, :, 1].cpu().detach().numpy()
return action_probabilities
def load_pre_trained_model(self, model_path):
"""
Load a pre-trained model
"""
assert os.path.exists(model_path), "The model path does not exist."
self.agent.load_state_dict(torch.load(model_path))
def fit(
self,
actions_set,
rewards_set,
train_test_split=0.8,
n_replications=1,
early_stopping=True,
early_stopping_patience=50,
max_epochs=10000,
learning_rate=0.0005,
print_every=500,
weight_decay=1e-5,
filter_best=True,
uid=None,
tolerance=1e-4,
):
"""
Fit the agent to the data using early stopping
==============================================
Parameters:
actions_set: The set of actions taken in the environment (np.array)
rewards_set: The set of rewards received in the environment (np.array)
train_test_split: The percentage of the data to use for training (float)
n_replications: The number of times to replicate the training (int)
early_stopping: Whether or not to use early stopping (bool)
early_stopping_patience: The number of epochs to wait before stopping (int)
max_epochs: The maximum number of epochs to train for (int)
learning_rate: The learning rate to use for training (float)
print_every: The number of epochs to wait before printing the loss (int)
weight_decay: The weight decay to use for training (float)
filter_best: Whether or not to filter the best model (bool)
uid: The unique identifier of the model (str)
tolerance: The tolerance for the loss (float)
"""
dataset = torch.tensor(np.array([actions_set, rewards_set]).transpose((1, 2, 0)), dtype=torch.int32).to(
self.device
)
X = torch.clone(dataset[:, :-1, :])
y = torch.clone(dataset[:, 1:, 0])
if self.omission_is_punishment:
X = X * 2 - 1
else:
X[:, :, 0] = X[:, :, 0] * 2 - 1
fitting_stats = []
for i in range(n_replications):
start_time = time.time()
for layer in self.agent.children():
if hasattr(layer, "reset_parameters"):
layer.reset_parameters()
optimizer = torch.optim.Adam(self.agent.parameters(), lr=learning_rate, weight_decay=weight_decay)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
optimizer, mode="min", factor=0.5, patience=5, verbose=True
)
loss_fn = nn.CrossEntropyLoss()
train_indices = np.random.choice(len(X), int(len(X) * train_test_split), replace=False)
val_indices = np.array(list(set(range(len(X))) - set(train_indices)))
X_train, X_val = X[train_indices].to(self.device), X[val_indices].to(self.device)
y_train, y_val = y[train_indices].to(self.device), y[val_indices].to(self.device)
training_loss = []
validation_loss = []
patience = early_stopping_patience
best_val_loss = float("inf")
for epoch in range(max_epochs):
self.agent.train()
optimizer.zero_grad()
hidden = self.agent.init_hidden(X_train.shape[0]).to(self.device)
output = self.agent(X_train.float(), hidden)
if self.policy_type == "softmax":
output = output.softmax(dim=2).view(-1, self.action_space_size)
elif self.policy_type == "greedy":
output = output.argmax(dim=2).view(-1)
elif self.policy_type == "acceptreject":
output = torch.exp(
torch.stack(
[
torch.log(output[:, :, 0])
+ torch.log(3 - output[:, :, 1])
- torch.log(
3 * output[:, :, 1] + 3 * output[:, :, 0] - 2 * output[:, :, 0] * output[:, :, 1]
),
torch.log(output[:, :, 1])
+ torch.log(3 - output[:, :, 0])
- torch.log(
3 * output[:, :, 1] + 3 * output[:, :, 0] - 2 * output[:, :, 0] * output[:, :, 1]
),
],
dim=2,
)
).view(-1, self.action_space_size)
else:
raise ValueError("Unknown policy type.")
loss = loss_fn(output, y_train.view(-1).long())
training_loss.append(loss.item())
loss.backward()
optimizer.step()
self.agent.eval()
with torch.no_grad():
hidden = self.agent.init_hidden(X_val.shape[0]).to(self.device)
output = self.agent(X_val.float(), hidden)
if self.policy_type == "softmax":
output = output.softmax(dim=2).view(-1, self.action_space_size)
elif self.policy_type == "greedy":
output = output.argmax(dim=2).view(-1)
elif self.policy_type == "acceptreject":
output = torch.exp(
torch.stack(
[
torch.log(output[:, :, 0])
+ torch.log(3 - output[:, :, 1])
- torch.log(
3 * output[:, :, 1]
+ 3 * output[:, :, 0]
- 2 * output[:, :, 0] * output[:, :, 1]
),
torch.log(output[:, :, 1])
+ torch.log(3 - output[:, :, 0])
- torch.log(
3 * output[:, :, 1]
+ 3 * output[:, :, 0]
- 2 * output[:, :, 0] * output[:, :, 1]
),
],
dim=2,
)
).view(-1, self.action_space_size)
else:
raise ValueError("Unknown policy type.")
val_loss = loss_fn(output, y_val.view(-1).long())
validation_loss.append(val_loss.item())
scheduler.step(val_loss)
if epoch % print_every == 0:
print("Epoch {}: \tTraining Loss: {:.4f}\tValidation Loss: {:.4f}".format(epoch, loss, val_loss))
if early_stopping:
if val_loss < best_val_loss - tolerance:
best_val_loss = val_loss
patience = early_stopping_patience
torch.save(
self.agent.state_dict(),
"model_{}.pt".format(i) if uid is None else "model_{}_{}.pt".format(uid, i),
)
else:
patience -= 1
if patience == 0:
print("Early stopping at epoch {}".format(epoch))
print("Best validation loss: {:.4f}".format(best_val_loss))
break
fitting_stats.append(
{
"training_loss": training_loss,
"validation_loss": validation_loss,
"best_val_loss": best_val_loss,
"epoch": epoch,
"best_val_epoch": epoch - early_stopping_patience,
"training_time": time.time() - start_time,
}
)
if filter_best:
print("Finding best model replicate...")
best_model_loss = float("inf")
for i in range(n_replications):
for layer in self.agent.children():
if hasattr(layer, "reset_parameters"):
layer.reset_parameters()
loss_fn = nn.CrossEntropyLoss()
self.agent.load_state_dict(
torch.load("model_{}.pt".format(i) if uid is None else "model_{}_{}.pt".format(uid, i))
)
self.agent.eval()
with torch.no_grad():
hidden = self.agent.init_hidden(X.shape[0]).to(self.device)
output = self.agent(X.float(), hidden)
if self.policy_type == "softmax":
output = output.softmax(dim=2).view(-1, self.action_space_size)
elif self.policy_type == "greedy":
output = output.argmax(dim=2).view(-1)
elif self.policy_type == "acceptreject":
output = torch.exp(
torch.stack(
[
torch.log(output[:, :, 0])
+ torch.log(3 - output[:, :, 1])
- torch.log(
3 * output[:, :, 1]
+ 3 * output[:, :, 0]
- 2 * output[:, :, 0] * output[:, :, 1]
),
torch.log(output[:, :, 1])
+ torch.log(3 - output[:, :, 0])
- torch.log(
3 * output[:, :, 1]
+ 3 * output[:, :, 0]
- 2 * output[:, :, 0] * output[:, :, 1]
),
],
dim=2,
)
).view(-1, self.action_space_size)
else:
raise ValueError("Unknown policy type.")
val_loss = loss_fn(output, y.view(-1).long())
if val_loss < best_model_loss:
best_model_loss = val_loss
torch.save(
self.agent.state_dict(), "best_model.pt" if uid is None else "best_model_{}.pt".format(uid)
)
os.remove("model_{}.pt".format(i) if uid is None else "model_{}_{}.pt".format(uid, i))
print("Best model found!")
return fitting_stats
"""
+++++++++++++++++++++++++++++++
Q Function Approximator Network
+++++++++++++++++++++++++++++++
"""
class GFFNN(nn.Module):
"""
A class to create a generalized Feedforward Q function approximation network
"""
def __init__(
self,
input_size,
hidden_state_sizes,
action_space_size,
device="cpu",
activation="relu",
allow_negative_values=False,
symmetric=False,
):
"""
Initialize the network
======================
Parameters:
input_size: The size of the input to the network (int)
hidden_state_sizes: The sizes of the hidden layers (list of ints)
action_space_size: The size of the action space (int)
"""
super(GFFNN, self).__init__()
self.num_layers = len(hidden_state_sizes)
self.input_size = input_size + action_space_size
self.action_space_size = action_space_size
self.layers = nn.ModuleList([nn.Linear(self.input_size, hidden_state_sizes[0])])
self.layers.extend([nn.Linear(h1, h2) for h1, h2 in zip(hidden_state_sizes, hidden_state_sizes[1:])])
self.layers.append(nn.Linear(hidden_state_sizes[-1], self.action_space_size))
self.activation = activation
self.allow_negative_values = allow_negative_values
self.symmetric = symmetric
self.device = device
def forward(self, input, q_value):
"""
Forward pass of the network
===========================
Parameters:
input: The input to the network (torch.Tensor)
q_value: The q value of the action (torch.Tensor)
"""
x = torch.cat((input, q_value), dim=1)
x_ = torch.cat(
(input * torch.tensor([[-1, 1]] * input.shape[0]).to(self.device), torch.flip(q_value, dims=[1])), dim=1
)
for layer in self.layers[:-1]:
if self.activation == "relu":
x = torch.relu(layer(x))
x_ = torch.relu(layer(x_))
elif self.activation == "tanh":
x = torch.tanh(layer(x))
x_ = torch.tanh(layer(x_))
elif self.activation == "sigmoid":
x = torch.sigmoid(layer(x))
x_ = torch.sigmoid(layer(x_))
else:
raise ValueError("Invalid activation function")
if self.symmetric:
q_values = (self.layers[-1](x) + torch.flip(self.layers[-1](x_), dims=(1,))) / 2
else:
q_values = self.layers[-1](x)
if self.allow_negative_values:
return q_values
else:
return torch.sigmoid(q_values)
def forward_loop(self, inputs):
"""
Full forward pass of the network over time
==========================================
Parameters:
inputs: The inputs to the network (torch.Tensor)
"""
q_value = self.init_qvalue(inputs.shape[0])
q_values = q_value.unsqueeze(1)
n_trials = inputs.shape[1]
for i in range(n_trials):
q_value = self.forward(inputs[:, i, :], q_value)
q_values = torch.concat([q_values, q_value.unsqueeze(1)], dim=1)
return q_values
def init_qvalue(self, batch_size):
"""
Initialize the q value of the network
======================================
Parameters:
batch_size: The size of the batch (int)
"""
return 0.5 * torch.ones(batch_size, self.action_space_size).to(self.device)
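# A self-contained sketch of the unrolling pattern in forward_loop, with a
# hypothetical stand-in update rule `step` in place of GFFNN.forward: the
# Q-value is carried through the trials, and the collected tensor holds the
# 0.5 initialization plus one entry per trial (trials + 1 steps in total).
#
# ```python
# import torch
#
# def step(inp, q):
#     # Stand-in for GFFNN.forward: any map from (input, q) to a new q.
#     return 0.9 * q + 0.1 * inp[:, :1].repeat(1, q.shape[1])
#
# inputs = torch.ones(4, 7, 2)   # (batch, trials, [action, reward])
# q = 0.5 * torch.ones(4, 2)     # init_qvalue: 0.5 per action
# qs = q.unsqueeze(1)            # keep the initial value as step 0
# for t in range(inputs.shape[1]):
#     q = step(inputs[:, t, :], q)
#     qs = torch.cat([qs, q.unsqueeze(1)], dim=1)
# print(tuple(qs.shape))  # (4, 8, 2): trials + 1 steps, like forward_loop
# ```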
class GQLearner(FlYMazeAgent):
def init_variables(
self,
hidden_state_sizes,
activation="relu",
allow_negative_values=False,
symmetric_q_function=True,
omission_is_punishment=False,
policy_type="softmax",
device="cpu",
pre_trained=False,
model_path=None,
multi_agent=False,
n_agents=1,
):
"""
Initialize the variables of a Generalized Q-function learner
============================================================
Parameters:
hidden_state_sizes: The sizes of the hidden layers (list of ints)
        activation: The activation function of the network (str) (options: "relu", "tanh", "sigmoid")
        allow_negative_values: Whether or not to allow negative values in the Q-values (bool)
        symmetric_q_function: Whether or not to enforce a symmetric Q function (bool)
        omission_is_punishment: Whether reward omission is encoded as punishment (-1) rather than as no reward (0) (bool)
        policy_type: The type of policy to use (str) (options: "softmax", "greedy", "acceptreject")
device: The device to use (str)
pre_trained: Whether to use a pre-trained model (bool)
model_path: The path to the pre-trained model (str)
multi_agent: Whether to simulate multiple agents (bool)
n_agents: The number of agents to simulate (int)
"""
self.hidden_state_sizes = hidden_state_sizes
self.activation = activation
self.omission_is_punishment = omission_is_punishment
self.policy_type = policy_type
self.device = device
if pre_trained:
assert (
model_path is not None
), "If you want to use a pre-trained model, you need to specify the path to the model."
self.symmetric_q_function = symmetric_q_function
self.allow_negative_values = allow_negative_values
self.agent = GFFNN(
input_size=2,
hidden_state_sizes=hidden_state_sizes,
action_space_size=self.action_space_size,
device=self.device,
activation=activation,
allow_negative_values=self.allow_negative_values,
symmetric=self.symmetric_q_function,
).to(self.device)
if pre_trained:
self.agent.load_state_dict(torch.load(model_path))
        self.multi_agent = multi_agent
        self.n_agents = n_agents
if multi_agent:
self.history = torch.zeros((n_agents, 1, 2)).to(self.device)
self.bias = np.zeros(n_agents)
else:
self.history = torch.zeros((1, 1, 2)).to(self.device)
def trial_step(self, state):
"""
Take a step in the environment given the current state
======================================================
Parameters:
state: The current state of the environment (np.array)
Returns:
action: The action to take in the environment (int)
"""
action_logits = self.agent.forward_loop(self.history.float())
if self.multi_agent:
if self.policy_type == "softmax":
action_probabilities = action_logits.softmax(dim=2)[:, -1, :]
action = torch.multinomial(action_probabilities, 1)
elif self.policy_type == "greedy":
action_probabilities = action_logits[:, -1, :]
action = action_probabilities.argmax(dim=1)
elif self.policy_type == "acceptreject":
action_probabilities = torch.exp(
torch.stack(
[
torch.log(action_logits[:, :, 0])
+ torch.log(3 - action_logits[:, :, 1])
- torch.log(
3 * action_logits[:, :, 1]
+ 3 * action_logits[:, :, 0]
- 2 * action_logits[:, :, 0] * action_logits[:, :, 1]
),
torch.log(action_logits[:, :, 1])
+ torch.log(3 - action_logits[:, :, 0])
- torch.log(
3 * action_logits[:, :, 1]
+ 3 * action_logits[:, :, 0]
- 2 * action_logits[:, :, 0] * action_logits[:, :, 1]
),
],
dim=2,
)
)[:, -1, :]
action = torch.multinomial(action_probabilities, 1)
else:
raise ValueError("Unknown policy type: {}".format(self.policy_type))
new_state, reward, done, _ = self.env.step(action)
reward = torch.tensor(reward)
if self.omission_is_punishment:
self.history = torch.cat(
[
self.history,
torch.concat([action.view(-1, 1) * 2 - 1, reward.view(-1, 1) * 2 - 1], axis=1).unsqueeze(1),
],
axis=1,
)
self.action_history = (self.history.cpu().detach().numpy()[:, 1:, 0] + 1) / 2
self.reward_history = (self.history.cpu().detach().numpy()[:, 1:, 1] + 1) / 2
else:
self.history = torch.cat(
[
self.history,
torch.concat([action.view(-1, 1) * 2 - 1, reward.view(-1, 1)], axis=1).unsqueeze(1),
],
axis=1,
)
self.action_history = (self.history.cpu().detach().numpy()[:, 1:, 0] + 1) / 2
self.reward_history = self.history.cpu().detach().numpy()[:, 1:, 1]
self.bias += (action == self.biased_action).cpu().detach().numpy() # update bias estimate
else:
if self.policy_type == "softmax":
action_probabilities = action_logits.softmax(dim=2).squeeze(0)[-1]
action = torch.multinomial(action_probabilities, 1).item()
elif self.policy_type == "greedy":
action_probabilities = action_logits.squeeze(0)[-1]
action = action_probabilities.argmax().item()
elif self.policy_type == "acceptreject":
action_probabilities = torch.exp(
torch.stack(
[
torch.log(action_logits[:, :, 0])
+ torch.log(3 - action_logits[:, :, 1])
- torch.log(
3 * action_logits[:, :, 1]
+ 3 * action_logits[:, :, 0]
- 2 * action_logits[:, :, 0] * action_logits[:, :, 1]
),
torch.log(action_logits[:, :, 1])
+ torch.log(3 - action_logits[:, :, 0])
- torch.log(
3 * action_logits[:, :, 1]
+ 3 * action_logits[:, :, 0]
- 2 * action_logits[:, :, 0] * action_logits[:, :, 1]
),
],
dim=2,
)
).squeeze(0)[-1]
action = torch.multinomial(action_probabilities, 1).item()
else:
raise ValueError("Unknown policy type: {}".format(self.policy_type))
new_state, reward, done, _ = self.env.step(action)
if self.omission_is_punishment:
self.history = torch.cat(
(self.history, torch.tensor([[[action * 2 - 1, reward * 2 - 1]]]).to(self.device)), dim=1
)
self.action_history = (self.history.squeeze(0).cpu().detach().numpy()[1:, 0] + 1) / 2
self.reward_history = (self.history.squeeze(0).cpu().detach().numpy()[1:, 1] + 1) / 2
else:
self.history = torch.cat(
(self.history, torch.tensor([[[action * 2 - 1, reward]]]).to(self.device)), dim=1
)
self.action_history = (self.history.squeeze(0).cpu().detach().numpy()[1:, 0] + 1) / 2
self.reward_history = self.history.squeeze(0).cpu().detach().numpy()[1:, 1]
if action == self.biased_action:
self.bias += 1 # update bias estimate
return new_state, done
def get_q_history(self):
"""
Get the history of the Q-values
=================================
Returns:
q_history: The history of the Q-values (np.array)
"""
if self.multi_agent:
return self.agent.forward_loop(self.history.float())[:, 1:-1, :].cpu().detach().numpy()
else:
return self.agent.forward_loop(self.history.float())[:, 1:-1, :].squeeze(0).cpu().detach().numpy()
def run_episode(self):
"""
Run an episode of the environment
"""
state = self.env.reset() # reset environment
done = False
while not done:
state, done = self.trial_step(state) # trial step
def reset_variables(self):
"""
Reset the variables of the agent
"""
if self.multi_agent:
self.history = torch.zeros((self.n_agents, 1, 2)).to(self.device)
else:
self.history = torch.zeros((1, 1, 2)).to(self.device)
def load_pre_trained_model(self, model_path):
"""
Load a pre-trained model
"""
assert os.path.exists(model_path), "The model path does not exist."
self.agent.load_state_dict(torch.load(model_path))
def get_action_probabilities_from_data(self, actions_set, rewards_set):
"""
Given a dataset, infer the action probabilities
===============================================
actions_set: The set of actions taken in the dataset (np.array)
rewards_set: The set of rewards obtained in the dataset (np.array)
Returns:
action_probabilities: The inferred action probabilities (np.array)
"""
dataset = torch.tensor(np.array([actions_set, rewards_set]).transpose((1, 2, 0)), dtype=torch.int32).to(
self.device
)
X = torch.clone(dataset[:, :-1, :])
if self.omission_is_punishment:
X = X * 2 - 1
else:
X[:, :, 0] = X[:, :, 0] * 2 - 1
logits = self.agent.forward_loop(X.float())
action_probabilities = logits.softmax(dim=2)[:, :, 1].cpu().detach().numpy()
return action_probabilities
def fit(
self,
actions_set,
rewards_set,
train_test_split=0.8,
n_replications=1,
early_stopping=True,
early_stopping_patience=50,
max_epochs=10000,
learning_rate=0.0005,
print_every=500,
weight_decay=1e-5,
filter_best=True,
uid=None,
tolerance=1e-4,
):
"""
Fit the agent to the data using early stopping
==============================================
Parameters:
actions_set: The set of actions taken in the environment (np.array)
rewards_set: The set of rewards received in the environment (np.array)
train_test_split: The percentage of the data to use for training (float)
n_replications: The number of times to replicate the training (int)
early_stopping: Whether or not to use early stopping (bool)
early_stopping_patience: The number of epochs to wait before stopping (int)
max_epochs: The maximum number of epochs to train for (int)
learning_rate: The learning rate to use for training (float)
print_every: The number of epochs to wait before printing the loss (int)
weight_decay: The weight decay to use for training (float)
filter_best: Whether or not to filter the best model (bool)
uid: The unique identifier of the model (str)
tolerance: The tolerance for the loss (float)
"""
dataset = torch.tensor(np.array([actions_set, rewards_set]).transpose((1, 2, 0)), dtype=torch.int32).to(
self.device
)
X = torch.clone(dataset[:, :-1, :])
y = torch.clone(dataset[:, :, 0])
if self.omission_is_punishment:
X = X * 2 - 1
else:
X[:, :, 0] = X[:, :, 0] * 2 - 1
fitting_stats = []
for i in range(n_replications):
start_time = time.time()
for layer in self.agent.children():
if hasattr(layer, "reset_parameters"):
layer.reset_parameters()
optimizer = torch.optim.Adam(self.agent.parameters(), lr=learning_rate, weight_decay=weight_decay)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
optimizer, mode="min", factor=0.5, patience=5, verbose=True
)
loss_fn = nn.CrossEntropyLoss()
train_indices = np.random.choice(len(X), int(len(X) * train_test_split), replace=False)
val_indices = np.array(list(set(range(len(X))) - set(train_indices)))
X_train, X_val = X[train_indices].to(self.device), X[val_indices].to(self.device)
            y_train, y_val = y[train_indices].to(self.device), y[val_indices].to(self.device)
            training_loss = []
            validation_loss = []
            patience = early_stopping_patience
            best_val_loss = float("inf")
            for epoch in range(max_epochs):
                self.agent.train()
                optimizer.zero_grad()
                output = self.agent.forward_loop(X_train.float())
                if self.policy_type == "softmax":
                    output = output.softmax(dim=2).view(-1, self.action_space_size)
                elif self.policy_type == "greedy":
                    output = output.argmax(dim=2).view(-1)
                elif self.policy_type == "acceptreject":
                    output = torch.exp(
                        torch.stack(
                            [
                                torch.log(output[:, :, 0])
                                + torch.log(3 - output[:, :, 1])
                                - torch.log(
                                    3 * output[:, :, 1] + 3 * output[:, :, 0] - 2 * output[:, :, 0] * output[:, :, 1]
                                ),
                                torch.log(output[:, :, 1])
                                + torch.log(3 - output[:, :, 0])
                                - torch.log(
                                    3 * output[:, :, 1] + 3 * output[:, :, 0] - 2 * output[:, :, 0] * output[:, :, 1]
                                ),
                            ],
                            dim=2,
                        )
                    ).view(-1, self.action_space_size)
                else:
                    raise ValueError("Unknown policy type.")
                loss = loss_fn(output, y_train.view(-1).long())
                training_loss.append(loss.item())
                loss.backward()
                optimizer.step()

                self.agent.eval()
                with torch.no_grad():
                    output = self.agent.forward_loop(X_val.float())
                    if self.policy_type == "softmax":
                        output = output.softmax(dim=2).view(-1, self.action_space_size)
                    elif self.policy_type == "greedy":
                        output = output.argmax(dim=2).view(-1)
                    elif self.policy_type == "acceptreject":
                        output = torch.exp(
                            torch.stack(
                                [
                                    torch.log(output[:, :, 0])
                                    + torch.log(3 - output[:, :, 1])
                                    - torch.log(
                                        3 * output[:, :, 1]
                                        + 3 * output[:, :, 0]
                                        - 2 * output[:, :, 0] * output[:, :, 1]
                                    ),
                                    torch.log(output[:, :, 1])
                                    + torch.log(3 - output[:, :, 0])
                                    - torch.log(
                                        3 * output[:, :, 1]
                                        + 3 * output[:, :, 0]
                                        - 2 * output[:, :, 0] * output[:, :, 1]
                                    ),
                                ],
                                dim=2,
                            )
                        ).view(-1, self.action_space_size)
                    else:
                        raise ValueError("Unknown policy type.")
                    val_loss = loss_fn(output, y_val.view(-1).long())
                validation_loss.append(val_loss.item())
                scheduler.step(val_loss)
                if epoch % print_every == 0:
                    print("Epoch {}: \tTraining Loss: {:.4f}\tValidation Loss: {:.4f}".format(epoch, loss, val_loss))
                if early_stopping:
                    if val_loss < best_val_loss - tolerance:
                        best_val_loss = val_loss
                        patience = early_stopping_patience
                        torch.save(
                            self.agent.state_dict(),
                            "model_{}.pt".format(i) if uid is None else "model_{}_{}.pt".format(uid, i),
                        )
                    else:
                        patience -= 1
                        if patience == 0:
                            print("Early stopping at epoch {}".format(epoch))
                            print("Best validation loss: {:.4f}".format(best_val_loss))
                            break
            fitting_stats.append(
                {
                    "training_loss": training_loss,
                    "validation_loss": validation_loss,
                    "best_val_loss": best_val_loss,
                    "epoch": epoch,
                    "best_val_epoch": epoch - early_stopping_patience,
                    "training_time": time.time() - start_time,
                }
            )

        if filter_best:
            print("Finding best model replicate...")
            best_model_loss = float("inf")
            for i in range(n_replications):
                for layer in self.agent.children():
                    if hasattr(layer, "reset_parameters"):
                        layer.reset_parameters()
                loss_fn = nn.CrossEntropyLoss()
                self.agent.load_state_dict(
                    torch.load("model_{}.pt".format(i) if uid is None else "model_{}_{}.pt".format(uid, i))
                )
                self.agent.eval()
                with torch.no_grad():
                    output = self.agent.forward_loop(X.float())
                    if self.policy_type == "softmax":
                        output = output.softmax(dim=2).view(-1, self.action_space_size)
                    elif self.policy_type == "greedy":
                        output = output.argmax(dim=2).view(-1)
                    elif self.policy_type == "acceptreject":
                        output = torch.exp(
                            torch.stack(
                                [
                                    torch.log(output[:, :, 0])
                                    + torch.log(3 - output[:, :, 1])
                                    - torch.log(
                                        3 * output[:, :, 1]
                                        + 3 * output[:, :, 0]
                                        - 2 * output[:, :, 0] * output[:, :, 1]
                                    ),
                                    torch.log(output[:, :, 1])
                                    + torch.log(3 - output[:, :, 0])
                                    - torch.log(
                                        3 * output[:, :, 1]
                                        + 3 * output[:, :, 0]
                                        - 2 * output[:, :, 0] * output[:, :, 1]
                                    ),
                                ],
                                dim=2,
                            )
                        ).view(-1, self.action_space_size)
                    else:
                        raise ValueError("Unknown policy type.")
                    val_loss = loss_fn(output, y.view(-1).long())
                if val_loss < best_model_loss:
                    best_model_loss = val_loss
                    torch.save(
                        self.agent.state_dict(), "best_model.pt" if uid is None else "best_model_{}.pt".format(uid)
                    )
                os.remove("model_{}.pt".format(i) if uid is None else "model_{}_{}.pt".format(uid, i))
            print("Best model found!")
        return fitting_stats
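The repeated `acceptreject` branch above is a closed-form renormalization of two head outputs into a two-action distribution. Below is an illustrative standalone sketch (the function name and shapes are assumptions, not part of the original class) rendering the same torch expression in NumPy and checking that the two components always sum to 1.

```python
import numpy as np

# Illustrative check: the "acceptreject" branch maps two raw head outputs
# p0, p1 in (0, 1) to
#   P(a=0) = p0 * (3 - p1) / D,   P(a=1) = p1 * (3 - p0) / D,
# with D = 3*p1 + 3*p0 - 2*p0*p1; the two entries always sum to 1.
def accept_reject_probs(p):
    """p has shape (..., 2); returns a normalized two-action distribution."""
    p0, p1 = p[..., 0], p[..., 1]
    denom = 3 * p1 + 3 * p0 - 2 * p0 * p1
    return np.exp(
        np.stack(
            [
                np.log(p0) + np.log(3 - p1) - np.log(denom),
                np.log(p1) + np.log(3 - p0) - np.log(denom),
            ],
            axis=-1,
        )
    )

rng = np.random.default_rng(0)
p = rng.uniform(0.01, 0.99, size=(4, 2))  # keep inputs strictly inside (0, 1)
probs = accept_reject_probs(p)
assert np.allclose(probs.sum(axis=-1), 1.0)
```

The sum-to-one property follows algebraically: p0*(3 - p1) + p1*(3 - p0) = 3*p0 + 3*p1 - 2*p0*p1, which is exactly the shared denominator.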
| 40.726634 | 117 | 0.489699 | 6,109 | 57,954 | 4.476019 | 0.05009 | 0.017554 | 0.017993 | 0.013166 | 0.921153 | 0.894419 | 0.874781 | 0.85602 | 0.840769 | 0.819485 | 0 | 0.016751 | 0.389171 | 57,954 | 1,422 | 118 | 40.755274 | 0.755678 | 0.163302 | 0 | 0.849746 | 0 | 0.00203 | 0.033682 | 0 | 0 | 0 | 0 | 0 | 0.004061 | 1 | 0.029442 | false | 0 | 0.006091 | 0 | 0.061929 | 0.014213 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2dc1224c80790244d04a42ce748dc6a54bd5da90 | 22,082 | py | Python | contrib/runners/mistral_v2/tests/unit/test_mistral_v2_pause_and_resume.py | saucetray/st2 | 8f507d6c8d9483c8371e386fe2b7998596856fd7 | [
"Apache-2.0"
] | 2 | 2021-08-04T01:04:06.000Z | 2021-08-04T01:04:08.000Z | contrib/runners/mistral_v2/tests/unit/test_mistral_v2_pause_and_resume.py | saucetray/st2 | 8f507d6c8d9483c8371e386fe2b7998596856fd7 | [
"Apache-2.0"
] | 1 | 2022-03-31T03:53:22.000Z | 2022-03-31T03:53:22.000Z | contrib/runners/mistral_v2/tests/unit/test_mistral_v2_pause_and_resume.py | saucetray/st2 | 8f507d6c8d9483c8371e386fe2b7998596856fd7 | [
"Apache-2.0"
] | 1 | 2019-10-11T14:42:28.000Z | 2019-10-11T14:42:28.000Z | # Copyright 2019 Extreme Networks, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
import copy
import uuid
import mock
import yaml
from mistralclient.api.v2 import executions
from mistralclient.api.v2 import workflows
from oslo_config import cfg
# XXX: actionsensor import depends on config being setup.
import st2tests.config as tests_config
tests_config.parse_args()
from mistral_v2.mistral_v2 import MistralRunner
from st2common.bootstrap import actionsregistrar
from st2common.bootstrap import runnersregistrar
from st2common.constants import action as action_constants
from st2common.models.db.execution import ActionExecutionDB
from st2common.models.db.liveaction import LiveActionDB
from st2common.persistence.liveaction import LiveAction
from st2common.runners import base as runners
from st2common.services import action as action_service
from st2common.transport.liveaction import LiveActionPublisher
from st2common.transport.publishers import CUDPublisher
from st2common.util import loader
from st2tests import ExecutionDbTestCase
from st2tests import fixturesloader
from st2tests.mocks.liveaction import MockLiveActionPublisher
TEST_PACK = 'mistral_tests'
TEST_PACK_PATH = fixturesloader.get_fixtures_packs_base_path() + '/' + TEST_PACK
PACKS = [
    TEST_PACK_PATH,
    fixturesloader.get_fixtures_packs_base_path() + '/core'
]
# Action executions requirements
ACTION_PARAMS = {'friend': 'Rocky'}
NON_EMPTY_RESULT = 'non-empty'
# Non-workbook with a single workflow
WF1_META_FILE_NAME = 'workflow_v2.yaml'
WF1_META_FILE_PATH = TEST_PACK_PATH + '/actions/' + WF1_META_FILE_NAME
WF1_META_CONTENT = loader.load_meta_file(WF1_META_FILE_PATH)
WF1_NAME = WF1_META_CONTENT['pack'] + '.' + WF1_META_CONTENT['name']
WF1_ENTRY_POINT = TEST_PACK_PATH + '/actions/' + WF1_META_CONTENT['entry_point']
WF1_ENTRY_POINT_X = WF1_ENTRY_POINT.replace(WF1_META_FILE_NAME, 'xformed_' + WF1_META_FILE_NAME)
WF1_SPEC = yaml.safe_load(MistralRunner.get_workflow_definition(WF1_ENTRY_POINT_X))
WF1_YAML = yaml.safe_dump(WF1_SPEC, default_flow_style=False)
WF1 = workflows.Workflow(None, {'name': WF1_NAME, 'definition': WF1_YAML})
WF1_OLD = workflows.Workflow(None, {'name': WF1_NAME, 'definition': ''})
WF1_EXEC = {'id': str(uuid.uuid4()), 'state': 'RUNNING', 'workflow_name': WF1_NAME}
WF1_EXEC_PAUSED = copy.deepcopy(WF1_EXEC)
WF1_EXEC_PAUSED['state'] = 'PAUSED'
# Workflow with a subworkflow action
WF2_META_FILE_NAME = 'workflow_v2_call_workflow_action.yaml'
WF2_META_FILE_PATH = TEST_PACK_PATH + '/actions/' + WF2_META_FILE_NAME
WF2_META_CONTENT = loader.load_meta_file(WF2_META_FILE_PATH)
WF2_NAME = WF2_META_CONTENT['pack'] + '.' + WF2_META_CONTENT['name']
WF2_ENTRY_POINT = TEST_PACK_PATH + '/actions/' + WF2_META_CONTENT['entry_point']
WF2_ENTRY_POINT_X = WF2_ENTRY_POINT.replace(WF2_META_FILE_NAME, 'xformed_' + WF2_META_FILE_NAME)
WF2_SPEC = yaml.safe_load(MistralRunner.get_workflow_definition(WF2_ENTRY_POINT_X))
WF2_YAML = yaml.safe_dump(WF2_SPEC, default_flow_style=False)
WF2 = workflows.Workflow(None, {'name': WF2_NAME, 'definition': WF2_YAML})
WF2_EXEC = {'id': str(uuid.uuid4()), 'state': 'RUNNING', 'workflow_name': WF2_NAME}
WF2_EXEC_PAUSED = copy.deepcopy(WF2_EXEC)
WF2_EXEC_PAUSED['state'] = 'PAUSED'
@mock.patch.object(
    CUDPublisher,
    'publish_update',
    mock.MagicMock(return_value=None))
@mock.patch.object(
    CUDPublisher,
    'publish_create',
    mock.MagicMock(side_effect=MockLiveActionPublisher.publish_create))
@mock.patch.object(
    LiveActionPublisher,
    'publish_state',
    mock.MagicMock(side_effect=MockLiveActionPublisher.publish_state))
class MistralRunnerPauseResumeTest(ExecutionDbTestCase):
    @classmethod
    def setUpClass(cls):
        super(MistralRunnerPauseResumeTest, cls).setUpClass()

        # Override the retry configuration here, otherwise st2tests.config.parse_args
        # in ExecutionDbTestCase.setUpClass will reset these overrides.
        cfg.CONF.set_override('retry_exp_msec', 100, group='mistral')
        cfg.CONF.set_override('retry_exp_max_msec', 200, group='mistral')
        cfg.CONF.set_override('retry_stop_max_msec', 200, group='mistral')
        cfg.CONF.set_override('api_url', 'http://0.0.0.0:9101', group='auth')

        # Register runners.
        runnersregistrar.register_runners()

        # Register test pack(s).
        actions_registrar = actionsregistrar.ActionsRegistrar(
            use_pack_cache=False,
            fail_on_failure=True
        )
        for pack in PACKS:
            actions_registrar.register_from_pack(pack)

    @classmethod
    def get_runner_class(cls, runner_name):
        return runners.get_runner(runner_name, runner_name).__class__
    @mock.patch.object(
        workflows.WorkflowManager, 'list',
        mock.MagicMock(return_value=[]))
    @mock.patch.object(
        workflows.WorkflowManager, 'get',
        mock.MagicMock(return_value=WF1))
    @mock.patch.object(
        workflows.WorkflowManager, 'create',
        mock.MagicMock(return_value=[WF1]))
    @mock.patch.object(
        executions.ExecutionManager, 'create',
        mock.MagicMock(return_value=executions.Execution(None, WF1_EXEC)))
    @mock.patch.object(
        executions.ExecutionManager, 'update',
        mock.MagicMock(return_value=executions.Execution(None, WF1_EXEC_PAUSED)))
    @mock.patch.object(
        action_service, 'is_children_active',
        mock.MagicMock(return_value=True))
    def test_pause(self):
        # Launch the workflow execution.
        liveaction = LiveActionDB(action=WF1_NAME, parameters=ACTION_PARAMS)
        liveaction, execution = action_service.request(liveaction)
        liveaction = self._wait_on_status(liveaction, action_constants.LIVEACTION_STATUS_RUNNING)

        mistral_context = liveaction.context.get('mistral', None)
        self.assertIsNotNone(mistral_context)
        self.assertEqual(mistral_context['execution_id'], WF1_EXEC.get('id'))
        self.assertEqual(mistral_context['workflow_name'], WF1_EXEC.get('workflow_name'))

        # Pause the workflow execution.
        requester = cfg.CONF.system_user.user
        liveaction, execution = action_service.request_pause(liveaction, requester)
        executions.ExecutionManager.update.assert_called_with(WF1_EXEC.get('id'), 'PAUSED')
        liveaction = self._wait_on_status(liveaction, action_constants.LIVEACTION_STATUS_PAUSING)
    @mock.patch.object(
        workflows.WorkflowManager, 'list',
        mock.MagicMock(return_value=[]))
    @mock.patch.object(
        workflows.WorkflowManager, 'get',
        mock.MagicMock(return_value=WF1))
    @mock.patch.object(
        workflows.WorkflowManager, 'create',
        mock.MagicMock(return_value=[WF1]))
    @mock.patch.object(
        executions.ExecutionManager, 'create',
        mock.MagicMock(return_value=executions.Execution(None, WF1_EXEC)))
    @mock.patch.object(
        executions.ExecutionManager, 'update',
        mock.MagicMock(side_effect=[
            executions.Execution(None, WF1_EXEC_PAUSED),
            executions.Execution(None, WF1_EXEC)]))
    @mock.patch.object(
        action_service, 'is_children_active',
        mock.MagicMock(return_value=True))
    def test_resume(self):
        # Launch the workflow execution.
        liveaction = LiveActionDB(action=WF1_NAME, parameters=ACTION_PARAMS)
        liveaction, execution = action_service.request(liveaction)
        liveaction = self._wait_on_status(liveaction, action_constants.LIVEACTION_STATUS_RUNNING)

        mistral_context = liveaction.context.get('mistral', None)
        self.assertIsNotNone(mistral_context)
        self.assertEqual(mistral_context['execution_id'], WF1_EXEC.get('id'))
        self.assertEqual(mistral_context['workflow_name'], WF1_EXEC.get('workflow_name'))

        # Pause the workflow execution.
        requester = cfg.CONF.system_user.user
        liveaction, execution = action_service.request_pause(liveaction, requester)
        executions.ExecutionManager.update.assert_called_with(WF1_EXEC.get('id'), 'PAUSED')
        liveaction = self._wait_on_status(liveaction, action_constants.LIVEACTION_STATUS_PAUSING)

        # Manually update the liveaction from pausing to paused. The paused state
        # is usually updated by the mistral querier.
        action_service.update_status(liveaction, action_constants.LIVEACTION_STATUS_PAUSED)
        liveaction = self._wait_on_status(liveaction, action_constants.LIVEACTION_STATUS_PAUSED)

        # Resume the workflow execution.
        liveaction, execution = action_service.request_resume(liveaction, requester)
        executions.ExecutionManager.update.assert_called_with(WF1_EXEC.get('id'), 'RUNNING')
        liveaction = self._wait_on_status(liveaction, action_constants.LIVEACTION_STATUS_RUNNING)
    @mock.patch.object(
        workflows.WorkflowManager, 'list',
        mock.MagicMock(return_value=[]))
    @mock.patch.object(
        workflows.WorkflowManager, 'get',
        mock.MagicMock(side_effect=[WF2, WF1]))
    @mock.patch.object(
        workflows.WorkflowManager, 'create',
        mock.MagicMock(side_effect=[[WF2], [WF1]]))
    @mock.patch.object(
        executions.ExecutionManager, 'create',
        mock.MagicMock(side_effect=[
            executions.Execution(None, WF2_EXEC),
            executions.Execution(None, WF1_EXEC)]))
    @mock.patch.object(
        executions.ExecutionManager, 'update',
        mock.MagicMock(side_effect=[
            executions.Execution(None, WF2_EXEC_PAUSED),
            executions.Execution(None, WF1_EXEC_PAUSED),
            executions.Execution(None, WF2_EXEC),
            executions.Execution(None, WF1_EXEC)]))
    @mock.patch.object(
        action_service, 'is_children_active',
        mock.MagicMock(return_value=True))
    def test_resume_subworkflow_action(self):
        requester = cfg.CONF.system_user.user

        liveaction1 = LiveActionDB(action=WF2_NAME, parameters=ACTION_PARAMS)
        liveaction1, execution1 = action_service.request(liveaction1)
        liveaction1 = self._wait_on_status(liveaction1, action_constants.LIVEACTION_STATUS_RUNNING)

        liveaction2 = LiveActionDB(action=WF1_NAME, parameters=ACTION_PARAMS)
        liveaction2, execution2 = action_service.request(liveaction2)
        liveaction2 = LiveAction.get_by_id(str(liveaction2.id))
        liveaction2 = self._wait_on_status(liveaction2, action_constants.LIVEACTION_STATUS_RUNNING)

        # Mock the children of the parent execution so that this test case
        # has a subworkflow execution.
        with mock.patch.object(
                ActionExecutionDB, 'children',
                new_callable=mock.PropertyMock) as action_ex_children_mock:
            action_ex_children_mock.return_value = [execution2.id]

            mistral_context = liveaction1.context.get('mistral', None)
            self.assertIsNotNone(mistral_context)
            self.assertEqual(mistral_context['execution_id'], WF2_EXEC.get('id'))
            self.assertEqual(mistral_context['workflow_name'], WF2_EXEC.get('workflow_name'))

            # Pause the parent liveaction and check that the request is cascaded down.
            liveaction1, execution1 = action_service.request_pause(liveaction1, requester)
            self.assertTrue(executions.ExecutionManager.update.called)
            self.assertEqual(executions.ExecutionManager.update.call_count, 2)
            calls = [
                mock.call(WF2_EXEC.get('id'), 'PAUSED'),
                mock.call(WF1_EXEC.get('id'), 'PAUSED')
            ]
            executions.ExecutionManager.update.assert_has_calls(calls, any_order=False)

            liveaction1 = LiveAction.get_by_id(str(liveaction1.id))
            self.assertEqual(liveaction1.status, action_constants.LIVEACTION_STATUS_PAUSING)
            liveaction2 = LiveAction.get_by_id(str(liveaction2.id))
            self.assertEqual(liveaction2.status, action_constants.LIVEACTION_STATUS_PAUSING)

            # Manually set the liveaction status to PAUSED.
            action_service.update_status(liveaction2, action_constants.LIVEACTION_STATUS_PAUSED)
            action_service.update_status(liveaction1, action_constants.LIVEACTION_STATUS_PAUSED)

            liveaction1 = LiveAction.get_by_id(str(liveaction1.id))
            self.assertEqual(liveaction1.status, action_constants.LIVEACTION_STATUS_PAUSED)
            liveaction2 = LiveAction.get_by_id(str(liveaction2.id))
            self.assertEqual(liveaction2.status, action_constants.LIVEACTION_STATUS_PAUSED)

            # Resume the parent liveaction and check that the request is cascaded down.
            liveaction1, execution1 = action_service.request_resume(liveaction1, requester)

            # Includes the previous calls.
            self.assertTrue(executions.ExecutionManager.update.called)
            self.assertEqual(executions.ExecutionManager.update.call_count, 4)
            calls = [
                mock.call(WF2_EXEC.get('id'), 'PAUSED'),
                mock.call(WF1_EXEC.get('id'), 'PAUSED'),
                mock.call(WF2_EXEC.get('id'), 'RUNNING'),
                mock.call(WF1_EXEC.get('id'), 'RUNNING')
            ]
            executions.ExecutionManager.update.assert_has_calls(calls, any_order=False)

            liveaction1 = LiveAction.get_by_id(str(liveaction1.id))
            self.assertEqual(liveaction1.status, action_constants.LIVEACTION_STATUS_RUNNING)
            liveaction2 = LiveAction.get_by_id(str(liveaction2.id))
            self.assertEqual(liveaction2.status, action_constants.LIVEACTION_STATUS_RUNNING)
    @mock.patch.object(
        workflows.WorkflowManager, 'list',
        mock.MagicMock(return_value=[]))
    @mock.patch.object(
        workflows.WorkflowManager, 'get',
        mock.MagicMock(side_effect=[WF2, WF1]))
    @mock.patch.object(
        workflows.WorkflowManager, 'create',
        mock.MagicMock(side_effect=[[WF2], [WF1]]))
    @mock.patch.object(
        executions.ExecutionManager, 'create',
        mock.MagicMock(side_effect=[
            executions.Execution(None, WF2_EXEC),
            executions.Execution(None, WF1_EXEC)]))
    @mock.patch.object(
        executions.ExecutionManager, 'update',
        mock.MagicMock(side_effect=[
            executions.Execution(None, WF2_EXEC_PAUSED),
            executions.Execution(None, WF1_EXEC_PAUSED),
            executions.Execution(None, WF2_EXEC),
            executions.Execution(None, WF1_EXEC)]))
    def test_pause_missing_subworkflow_action(self):
        requester = cfg.CONF.system_user.user

        liveaction1 = LiveActionDB(action=WF2_NAME, parameters=ACTION_PARAMS)
        liveaction1, execution1 = action_service.request(liveaction1)
        liveaction1 = LiveAction.get_by_id(str(liveaction1.id))
        liveaction1 = self._wait_on_status(liveaction1, action_constants.LIVEACTION_STATUS_RUNNING)

        # Mock the children of the parent execution so that this test case
        # has a subworkflow execution.
        with mock.patch.object(
                ActionExecutionDB, 'children',
                new_callable=mock.PropertyMock) as action_ex_children_mock:
            action_ex_children_mock.return_value = [uuid.uuid4().hex]

            mistral_context = liveaction1.context.get('mistral', None)
            self.assertIsNotNone(mistral_context)
            self.assertEqual(mistral_context['execution_id'], WF2_EXEC.get('id'))
            self.assertEqual(mistral_context['workflow_name'], WF2_EXEC.get('workflow_name'))

            # Pause the parent liveaction and check that the request is cascaded down.
            liveaction1, execution1 = action_service.request_pause(liveaction1, requester)
            self.assertTrue(executions.ExecutionManager.update.called)
            self.assertEqual(executions.ExecutionManager.update.call_count, 1)
            calls = [
                mock.call(WF2_EXEC.get('id'), 'PAUSED'),
            ]
            executions.ExecutionManager.update.assert_has_calls(calls, any_order=False)

            # The workflow execution will fail because the liveaction for the subworkflow
            # should not be missing and we do not know what state it is in.
            liveaction1 = LiveAction.get_by_id(str(liveaction1.id))
            self.assertEqual(liveaction1.status, action_constants.LIVEACTION_STATUS_FAILED)
            self.assertIn('not a valid ObjectId', liveaction1.result.get('error', ''))
    @mock.patch.object(
        workflows.WorkflowManager, 'list',
        mock.MagicMock(return_value=[]))
    @mock.patch.object(
        workflows.WorkflowManager, 'get',
        mock.MagicMock(side_effect=[WF2, WF1]))
    @mock.patch.object(
        workflows.WorkflowManager, 'create',
        mock.MagicMock(side_effect=[[WF2], [WF1]]))
    @mock.patch.object(
        executions.ExecutionManager, 'create',
        mock.MagicMock(side_effect=[
            executions.Execution(None, WF2_EXEC),
            executions.Execution(None, WF1_EXEC)]))
    @mock.patch.object(
        executions.ExecutionManager, 'update',
        mock.MagicMock(side_effect=[
            executions.Execution(None, WF2_EXEC_PAUSED),
            executions.Execution(None, WF1_EXEC_PAUSED),
            executions.Execution(None, WF2_EXEC),
            executions.Execution(None, WF1_EXEC)]))
    @mock.patch.object(
        action_service, 'is_children_active',
        mock.MagicMock(return_value=True))
    def test_resume_missing_subworkflow_action(self):
        requester = cfg.CONF.system_user.user

        liveaction1 = LiveActionDB(action=WF2_NAME, parameters=ACTION_PARAMS)
        liveaction1, execution1 = action_service.request(liveaction1)
        liveaction1 = self._wait_on_status(liveaction1, action_constants.LIVEACTION_STATUS_RUNNING)

        liveaction2 = LiveActionDB(action=WF1_NAME, parameters=ACTION_PARAMS)
        liveaction2, execution2 = action_service.request(liveaction2)
        liveaction2 = self._wait_on_status(liveaction2, action_constants.LIVEACTION_STATUS_RUNNING)

        # Mock the children of the parent execution so that this test case
        # has a subworkflow execution.
        with mock.patch.object(
                ActionExecutionDB, 'children',
                new_callable=mock.PropertyMock) as action_ex_children_mock:
            action_ex_children_mock.return_value = [execution2.id]

            mistral_context = liveaction1.context.get('mistral', None)
            self.assertIsNotNone(mistral_context)
            self.assertEqual(mistral_context['execution_id'], WF2_EXEC.get('id'))
            self.assertEqual(mistral_context['workflow_name'], WF2_EXEC.get('workflow_name'))

            # Pause the parent liveaction and check that the request is cascaded down.
            liveaction1, execution1 = action_service.request_pause(liveaction1, requester)
            self.assertTrue(executions.ExecutionManager.update.called)
            self.assertEqual(executions.ExecutionManager.update.call_count, 2)
            calls = [
                mock.call(WF2_EXEC.get('id'), 'PAUSED'),
                mock.call(WF1_EXEC.get('id'), 'PAUSED')
            ]
            executions.ExecutionManager.update.assert_has_calls(calls, any_order=False)

            liveaction1 = LiveAction.get_by_id(str(liveaction1.id))
            self.assertEqual(liveaction1.status, action_constants.LIVEACTION_STATUS_PAUSING)
            liveaction2 = LiveAction.get_by_id(str(liveaction2.id))
            self.assertEqual(liveaction2.status, action_constants.LIVEACTION_STATUS_PAUSING)

            # Manually set the liveaction status to PAUSED.
            action_service.update_status(liveaction2, action_constants.LIVEACTION_STATUS_PAUSED)
            action_service.update_status(liveaction1, action_constants.LIVEACTION_STATUS_PAUSED)

            liveaction1 = LiveAction.get_by_id(str(liveaction1.id))
            self.assertEqual(liveaction1.status, action_constants.LIVEACTION_STATUS_PAUSED)
            liveaction2 = LiveAction.get_by_id(str(liveaction2.id))
            self.assertEqual(liveaction2.status, action_constants.LIVEACTION_STATUS_PAUSED)

        # Mock the children of the parent execution with an id that does not
        # exist, so that the liveaction for the subworkflow appears missing.
        with mock.patch.object(
                ActionExecutionDB, 'children',
                new_callable=mock.PropertyMock) as action_ex_children_mock:
            action_ex_children_mock.return_value = [uuid.uuid4().hex]

            # Resume the parent liveaction and check that the request is cascaded down.
            liveaction1, execution1 = action_service.request_resume(liveaction1, requester)

            # Includes the previous calls.
            self.assertTrue(executions.ExecutionManager.update.called)
            self.assertEqual(executions.ExecutionManager.update.call_count, 3)
            calls = [
                mock.call(WF2_EXEC.get('id'), 'PAUSED'),
                mock.call(WF1_EXEC.get('id'), 'PAUSED'),
                mock.call(WF2_EXEC.get('id'), 'RUNNING'),
            ]
            executions.ExecutionManager.update.assert_has_calls(calls, any_order=False)

            # The workflow execution will fail because the liveaction for the subworkflow
            # should not be missing and we do not know what state it is in.
            liveaction1 = LiveAction.get_by_id(str(liveaction1.id))
            self.assertEqual(liveaction1.status, action_constants.LIVEACTION_STATUS_FAILED)
            self.assertIn('not a valid ObjectId', liveaction1.result.get('error', ''))
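These tests lean on `mock.MagicMock(side_effect=[...])`, which returns successive items on successive calls; that is how a single patched `ExecutionManager.update` models the PAUSED-then-RUNNING transitions. A minimal self-contained sketch of the pattern (the `ExecutionManager` stub here is illustrative, not the mistralclient class):

```python
from unittest import mock

# Stand-in for an object whose method would normally hit the network.
class ExecutionManager:
    def update(self, ex_id, state):
        raise NotImplementedError("network call in real life")

# Patching with a side_effect list: call 1 returns 'PAUSED', call 2 'RUNNING'.
with mock.patch.object(
        ExecutionManager, 'update',
        mock.MagicMock(side_effect=['PAUSED', 'RUNNING'])) as update_mock:
    mgr = ExecutionManager()
    assert mgr.update('abc', 'PAUSED') == 'PAUSED'
    assert mgr.update('abc', 'RUNNING') == 'RUNNING'
    # assert_has_calls checks the recorded call sequence, in order.
    update_mock.assert_has_calls(
        [mock.call('abc', 'PAUSED'), mock.call('abc', 'RUNNING')],
        any_order=False)
```

A third call would raise `StopIteration`, which is why the `side_effect` lists above are sized to the exact number of expected `update` calls.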
| 46.586498 | 99 | 0.709175 | 2,502 | 22,082 | 6.016387 | 0.111511 | 0.021524 | 0.035873 | 0.057663 | 0.814921 | 0.786621 | 0.76596 | 0.755065 | 0.736996 | 0.715871 | 0 | 0.015598 | 0.19577 | 22,082 | 473 | 100 | 46.684989 | 0.832029 | 0.105199 | 0 | 0.706052 | 0 | 0 | 0.05495 | 0.001877 | 0 | 0 | 0 | 0 | 0.135447 | 1 | 0.020173 | false | 0 | 0.069164 | 0.002882 | 0.095101 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2dda69883eb309c72886beec4cf4c00ac66ab1c2 | 113 | py | Python | hybrik/version.py | Jeff-sjtu/HybrIK | 92acf081ef9614671c907a697ba7eeea5a0b08e6 | [
"MIT"
] | 287 | 2020-11-30T12:45:20.000Z | 2022-03-31T16:03:45.000Z | hybrik/version.py | pengyun1314123/HybrIK | ae1bc3cea0cc5aa98fb512eeb295c3478b0c598f | [
"MIT"
] | 62 | 2021-01-08T02:06:47.000Z | 2022-03-15T11:55:58.000Z | hybrik/version.py | pengyun1314123/HybrIK | ae1bc3cea0cc5aa98fb512eeb295c3478b0c598f | [
"MIT"
] | 30 | 2021-03-04T07:18:03.000Z | 2022-03-09T06:06:28.000Z | # GENERATED VERSION FILE
# TIME: Mon Apr 5 16:07:46 2021
__version__ = '0.1.0+c9f82ae'
short_version = '0.1.0'
| 18.833333 | 32 | 0.690265 | 21 | 113 | 3.47619 | 0.714286 | 0.219178 | 0.246575 | 0.273973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.212766 | 0.168142 | 113 | 5 | 33 | 22.6 | 0.56383 | 0.469027 | 0 | 0 | 1 | 0 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9320ce30824f0d38207b0aa28907388d2191ec23 | 13,628 | py | Python | test/terra/reference/ref_unitary_gate.py | eliarbel/qiskit-aer | 827f8922948dd18a588e8617bccaec465934280f | [
"Apache-2.0"
] | 1 | 2019-07-26T05:04:14.000Z | 2019-07-26T05:04:14.000Z | test/terra/reference/ref_unitary_gate.py | eliarbel/qiskit-aer | 827f8922948dd18a588e8617bccaec465934280f | [
"Apache-2.0"
] | 29 | 2018-12-19T10:11:00.000Z | 2018-12-19T10:16:18.000Z | test/terra/reference/ref_unitary_gate.py | atilag/qiskit-aer | d964795b0a24b1d3287ba2ba2dda45d1dfed4a5d | [
"Apache-2.0"
] | null | null | null | # This code is part of Qiskit.
#
# (C) Copyright IBM 2018, 2019.
#
# This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
#
# Any modifications or derivative works of this code must retain this
# copyright notice, and modified files need to carry a notice indicating
# that they have been altered from the originals.
"""
Test circuits and reference outputs for measure instruction.
"""
import numpy as np
from qiskit import QuantumRegister, ClassicalRegister, QuantumCircuit
from qiskit.compiler import assemble
from qiskit.providers.aer.backends import QasmSimulator
from qiskit.providers.aer.utils.qobj_utils import unitary_instr
from qiskit.providers.aer.utils.qobj_utils import append_instr
from qiskit.providers.aer.utils.qobj_utils import measure_instr
# ==========================================================================
# Multi-qubit measure
# ==========================================================================
def _dummy_qobj():
    """Return a dummy qobj to insert experiments into"""
    qr = QuantumRegister(1)
    circuit = QuantumCircuit(qr)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    # remove experiment
    qobj.experiments = []
    return qobj
def unitary_gate_circuits_real_deterministic(final_measure=True):
    """Unitary gate test circuits with deterministic count output."""
    final_qobj = _dummy_qobj()
    qr = QuantumRegister(2)
    if final_measure:
        cr = ClassicalRegister(2)
        regs = (qr, cr)
    else:
        regs = (qr, )

    x_mat = np.array([[0, 1], [1, 0]])
    cx_mat = np.array([[1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0]])

    # CX01, |00> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(cx_mat, [0, 1]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX10, |00> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(cx_mat, [1, 0]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX01.(X^I), |10> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(x_mat, [1]))
    append_instr(qobj, 0, unitary_instr(cx_mat, [0, 1]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX10.(I^X), |01> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(x_mat, [0]))
    append_instr(qobj, 0, unitary_instr(cx_mat, [1, 0]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX01.(I^X), |11> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(x_mat, [0]))
    append_instr(qobj, 0, unitary_instr(cx_mat, [0, 1]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX10.(X^I), |11> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(x_mat, [1]))
    append_instr(qobj, 0, unitary_instr(cx_mat, [1, 0]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    return final_qobj
def unitary_gate_counts_real_deterministic(shots, hex_counts=True):
    """Unitary gate circuits reference counts."""
    targets = []
    if hex_counts:
        # CX01, |00> state
        targets.append({'0x0': shots})  # {"00": shots}
        # CX10, |00> state
        targets.append({'0x0': shots})  # {"00": shots}
        # CX01.(X^I), |10> state
        targets.append({'0x2': shots})  # {"10": shots}
        # CX10.(I^X), |01> state
        targets.append({'0x1': shots})  # {"01": shots}
        # CX01.(I^X), |11> state
        targets.append({'0x3': shots})  # {"11": shots}
        # CX10.(X^I), |11> state
        targets.append({'0x3': shots})  # {"11": shots}
    else:
        # CX01, |00> state
        targets.append({'00': shots})
        # CX10, |00> state
        targets.append({'00': shots})
        # CX01.(X^I), |10> state
        targets.append({'10': shots})
        # CX10.(I^X), |01> state
        targets.append({'01': shots})
        # CX01.(I^X), |11> state
        targets.append({'11': shots})
        # CX10.(X^I), |11> state
        targets.append({'11': shots})
    return targets
def unitary_gate_statevector_real_deterministic():
    """Unitary gate test circuits with deterministic counts."""
    targets = []
    # CX01, |00> state
    targets.append(np.array([1, 0, 0, 0]))
    # CX10, |00> state
    targets.append(np.array([1, 0, 0, 0]))
    # CX01.(X^I), |10> state
    targets.append(np.array([0, 0, 1, 0]))
    # CX10.(I^X), |01> state
    targets.append(np.array([0, 1, 0, 0]))
    # CX01.(I^X), |11> state
    targets.append(np.array([0, 0, 0, 1]))
    # CX10.(X^I), |11> state
    targets.append(np.array([0, 0, 0, 1]))
    return targets
def unitary_gate_unitary_real_deterministic():
    """Unitary gate circuits reference unitaries."""
    targets = []
    # CX01, |00> state
    targets.append(np.array([[1, 0, 0, 0],
                             [0, 0, 0, 1],
                             [0, 0, 1, 0],
                             [0, 1, 0, 0]]))
    # CX10, |00> state
    targets.append(np.array([[1, 0, 0, 0],
                             [0, 1, 0, 0],
                             [0, 0, 0, 1],
                             [0, 0, 1, 0]]))
    # CX01.(X^I), |10> state
    targets.append(np.array([[0, 0, 1, 0],
                             [0, 1, 0, 0],
                             [1, 0, 0, 0],
                             [0, 0, 0, 1]]))
    # CX10.(I^X), |01> state
    targets.append(np.array([[0, 1, 0, 0],
                             [1, 0, 0, 0],
                             [0, 0, 1, 0],
                             [0, 0, 0, 1]]))
    # CX01.(I^X), |11> state
    targets.append(np.array([[0, 1, 0, 0],
                             [0, 0, 1, 0],
                             [0, 0, 0, 1],
                             [1, 0, 0, 0]]))
    # CX10.(X^I), |11> state
    targets.append(np.array([[0, 0, 1, 0],
                             [0, 0, 0, 1],
                             [0, 1, 0, 0],
                             [1, 0, 0, 0]]))
    return targets
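The reference unitaries above can be cross-checked numerically. The sketch below verifies the `CX01.(X^I)` entry and its statevector, under the assumption of little-endian qubit ordering (applying X on qubit 1 corresponds to `kron(X, I)`):

```python
import numpy as np

# Sanity check for the "CX01.(X^I), |10> state" reference: compose the
# single-qubit X on qubit 1 with the CX matrix used in the circuits above.
x_mat = np.array([[0, 1], [1, 0]])
cx_mat = np.array([[1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0]])

unitary = cx_mat @ np.kron(x_mat, np.eye(2, dtype=int))
expected = np.array([[0, 0, 1, 0],
                     [0, 1, 0, 0],
                     [1, 0, 0, 0],
                     [0, 0, 0, 1]])
assert np.array_equal(unitary, expected)

# Acting on |00> yields |10>, matching the statevector reference above.
statevector = unitary @ np.array([1, 0, 0, 0])
assert np.array_equal(statevector, np.array([0, 0, 1, 0]))
```

The same recipe (matrix product of `kron`-expanded single-qubit gates with `cx_mat`) reproduces the other five reference unitaries.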
def unitary_gate_circuits_complex_deterministic(final_measure=True):
    """Unitary gate test circuits with deterministic count output."""
    final_qobj = _dummy_qobj()
    qr = QuantumRegister(2)
    if final_measure:
        cr = ClassicalRegister(2)
        regs = (qr, cr)
    else:
        regs = (qr, )

    y_mat = np.array([[0, -1j], [1j, 0]], dtype=complex)
    cx_mat = np.array([[1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0]],
                      dtype=complex)

    # CX01, |00> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(cx_mat, [0, 1]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX10, |00> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(cx_mat, [1, 0]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX01.(Y^I), |10> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(y_mat, [1]))
    append_instr(qobj, 0, unitary_instr(cx_mat, [0, 1]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX10.(I^Y), |01> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(y_mat, [0]))
    append_instr(qobj, 0, unitary_instr(cx_mat, [1, 0]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX01.(I^Y), |11> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(y_mat, [0]))
    append_instr(qobj, 0, unitary_instr(cx_mat, [0, 1]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    # CX10.(Y^I), |11> state
    circuit = QuantumCircuit(*regs)
    circuit.barrier(qr)
    qobj = assemble(circuit, QasmSimulator(), shots=1)
    append_instr(qobj, 0, unitary_instr(y_mat, [1]))
    append_instr(qobj, 0, unitary_instr(cx_mat, [1, 0]))
    if final_measure:
        append_instr(qobj, 0, measure_instr([0], [0]))
        append_instr(qobj, 0, measure_instr([1], [1]))
    final_qobj.experiments.append(qobj.experiments[0])

    return final_qobj

def unitary_gate_counts_complex_deterministic(shots, hex_counts=True):
    """Unitary gate circuits reference counts."""
    targets = []
    if hex_counts:
        # CX01, |00> state
        targets.append({'0x0': shots})  # {"00": shots}
        # CX10, |00> state
        targets.append({'0x0': shots})  # {"00": shots}
        # CX01.(Y^I), |10> state
        targets.append({'0x2': shots})  # {"10": shots}
        # CX10.(I^Y), |01> state
        targets.append({'0x1': shots})  # {"01": shots}
        # CX01.(I^Y), |11> state
        targets.append({'0x3': shots})  # {"11": shots}
        # CX10.(Y^I), |11> state
        targets.append({'0x3': shots})  # {"11": shots}
    else:
        # CX01, |00> state
        targets.append({'00': shots})
        # CX10, |00> state
        targets.append({'00': shots})
        # CX01.(Y^I), |10> state
        targets.append({'10': shots})
        # CX10.(I^Y), |01> state
        targets.append({'01': shots})
        # CX01.(I^Y), |11> state
        targets.append({'11': shots})
        # CX10.(Y^I), |11> state
        targets.append({'11': shots})
    return targets

def unitary_gate_statevector_complex_deterministic():
    """Unitary gate circuits reference statevectors."""
    targets = []
    # CX01, |00> state
    targets.append(np.array([1, 0, 0, 0]))
    # CX10, |00> state
    targets.append(np.array([1, 0, 0, 0]))
    # CX01.(Y^I), |10> state
    targets.append(np.array([0, 0, 1j, 0]))
    # CX10.(I^Y), |01> state
    targets.append(np.array([0, 1j, 0, 0]))
    # CX01.(I^Y), |11> state
    targets.append(np.array([0, 0, 0, 1j]))
    # CX10.(Y^I), |11> state
    targets.append(np.array([0, 0, 0, 1j]))
    return targets

def unitary_gate_unitary_complex_deterministic():
    """Unitary gate circuits reference unitaries."""
    targets = []
    # CX01, |00> state
    targets.append(np.array([[1, 0, 0, 0],
                             [0, 0, 0, 1],
                             [0, 0, 1, 0],
                             [0, 1, 0, 0]]))
    # CX10, |00> state
    targets.append(np.array([[1, 0, 0, 0],
                             [0, 1, 0, 0],
                             [0, 0, 0, 1],
                             [0, 0, 1, 0]]))
    # CX01.(Y^I), |10> state
    targets.append(np.array([[0, 0, -1j, 0],
                             [0, 1j, 0, 0],
                             [1j, 0, 0, 0],
                             [0, 0, 0, -1j]]))
    # CX10.(I^Y), |01> state
    targets.append(np.array([[0, -1j, 0, 0],
                             [1j, 0, 0, 0],
                             [0, 0, 1j, 0],
                             [0, 0, 0, -1j]]))
    # CX01.(I^Y), |11> state
    targets.append(np.array([[0, -1j, 0, 0],
                             [0, 0, 1j, 0],
                             [0, 0, 0, -1j],
                             [1j, 0, 0, 0]]))
    # CX10.(Y^I), |11> state
    targets.append(np.array([[0, 0, -1j, 0],
                             [0, 0, 0, -1j],
                             [0, 1j, 0, 0],
                             [1j, 0, 0, 0]]))
    return targets
| 36.536193 | 79 | 0.545861 | 1,796 | 13,628 | 4.035078 | 0.075167 | 0.039189 | 0.02815 | 0.097144 | 0.877881 | 0.871947 | 0.86077 | 0.86077 | 0.850697 | 0.807093 | 0 | 0.078451 | 0.276049 | 13,628 | 372 | 80 | 36.634409 | 0.656092 | 0.199516 | 0 | 0.881633 | 0 | 0 | 0.005204 | 0 | 0 | 0 | 0.003346 | 0 | 0 | 1 | 0.036735 | false | 0 | 0.028571 | 0 | 0.102041 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9345e8cb898bc1dc435110a12d6c6051a651f6ea | 477 | py | Python | settings/config_objs/__init__.py | GeorgOhneH/ethz-document-fetcher | 42921e5d71698a269eb54cf9d3979e4a7d88a9cf | [
"MIT"
] | 15 | 2020-03-17T15:43:46.000Z | 2022-01-08T04:23:49.000Z | settings/config_objs/__init__.py | GeorgOhneH/ethz-document-fetcher | 42921e5d71698a269eb54cf9d3979e4a7d88a9cf | [
"MIT"
] | 5 | 2020-03-12T10:05:27.000Z | 2021-03-03T16:01:47.000Z | settings/config_objs/__init__.py | GeorgOhneH/ethz-document-fetcher | 42921e5d71698a269eb54cf9d3979e4a7d88a9cf | [
"MIT"
] | 2 | 2020-03-17T17:09:20.000Z | 2020-12-28T22:59:17.000Z | from settings.config_objs.bool import ConfigBool
from settings.config_objs.dict import ConfigDict
from settings.config_objs.dummy import ConfigDummy
from settings.config_objs.int import ConfigInt
from settings.config_objs.list import ConfigListString, ConfigList
from settings.config_objs.options import ConfigOptions
from settings.config_objs.password import ConfigPassword
from settings.config_objs.path import ConfigPath
from settings.config_objs.string import ConfigString
| 47.7 | 66 | 0.8826 | 64 | 477 | 6.4375 | 0.375 | 0.262136 | 0.393204 | 0.480583 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.077568 | 477 | 9 | 67 | 53 | 0.936364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.111111 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 8 |
935d1c7af0d43e2cbba2164009cb175b975af324 | 2,094 | py | Python | check_specific_images.py | nicolasrosa-forks/evaluating_bdl | 2973b0d018551de0c9f087e2ae4e6b2c22f2ce3c | [
"MIT"
] | null | null | null | check_specific_images.py | nicolasrosa-forks/evaluating_bdl | 2973b0d018551de0c9f087e2ae4e6b2c22f2ce3c | [
"MIT"
] | null | null | null | check_specific_images.py | nicolasrosa-forks/evaluating_bdl | 2973b0d018551de0c9f087e2ae4e6b2c22f2ce3c | [
"MIT"
] | null | null | null | from matplotlib import pyplot as plt
image_filenames = [
# '/root/data/kitti_depth/train/2011_09_26_drive_0039_sync/proj_depth/groundtruth/image_02/0000000040.png',
# '/root/data/kitti_depth/train/2011_09_26_drive_0039_sync/proj_depth/groundtruth/image_02/0000000058.png',
# '/root/data/kitti_depth/train/2011_09_26_drive_0039_sync/proj_depth/groundtruth/image_02/0000000076.png',
# '/root/data/kitti_depth/train/2011_09_26_drive_0039_sync/proj_depth/groundtruth/image_02/0000000022.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000042.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000038.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000050.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000113.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000138.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000034.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000073.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000053.png',
'/home/lasi/Downloads/datasets/kitti/depth/data_depth_annotated/train/2011_09_28_drive_0043_sync/proj_depth/groundtruth/image_02/0000000110.png'
]
for image_filename in image_filenames:
    try:
        img = plt.imread(image_filename)
        plt.figure(1)
        plt.imshow(img)
        plt.pause(1)
        print(image_filename, ' ok')
    except FileNotFoundError:
        print(image_filename, ' fail')
print("Done") | 63.454545 | 145 | 0.829035 | 320 | 2,094 | 5.053125 | 0.184375 | 0.080396 | 0.088435 | 0.19295 | 0.787879 | 0.787879 | 0.787879 | 0.787879 | 0.787879 | 0.787879 | 0 | 0.158827 | 0.055874 | 2,094 | 33 | 146 | 63.454545 | 0.659079 | 0.202006 | 0 | 0 | 0 | 0.409091 | 0.773845 | 0.766647 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.045455 | 0 | 0.045455 | 0.136364 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
936be43498e5ce1ae4f05785221884d608133ede | 4,927 | py | Python | tests/test_dbt_s3_hook.py | slve/airflow-dbt-python | bfff2bef83839fd563e2bbaac17c626ed3d51a06 | [
"MIT"
] | null | null | null | tests/test_dbt_s3_hook.py | slve/airflow-dbt-python | bfff2bef83839fd563e2bbaac17c626ed3d51a06 | [
"MIT"
] | null | null | null | tests/test_dbt_s3_hook.py | slve/airflow-dbt-python | bfff2bef83839fd563e2bbaac17c626ed3d51a06 | [
"MIT"
] | null | null | null | import pytest
try:
    from airflow_dbt_python.hooks.dbt_s3 import DbtS3Hook
except ImportError:
    pytest.skip(
        "S3Hook not available, consider installing amazon extras",
        allow_module_level=True,
    )

def test_get_dbt_profiles(s3_bucket, tmpdir, profiles_file):
    """Test pulling dbt profile from S3 path"""
    hook = DbtS3Hook()
    bucket = hook.get_bucket(s3_bucket)
    with open(profiles_file) as pf:
        profiles_content = pf.read()
    bucket.put_object(Key="profiles/profiles.yml", Body=profiles_content.encode())
    profiles_path = hook.get_dbt_profiles(
        f"s3://{s3_bucket}/profiles/",
        profiles_dir=str(tmpdir),
    )
    assert profiles_path.exists()
    with open(profiles_path) as f:
        result = f.read()
    assert result == profiles_content

def test_get_dbt_profiles_sub_dir(s3_bucket, tmpdir, profiles_file):
    """Test pulling dbt profile from an S3 path with a sub-directory"""
    hook = DbtS3Hook()
    bucket = hook.get_bucket(s3_bucket)
    with open(profiles_file) as pf:
        profiles_content = pf.read()
    bucket.put_object(
        Key="profiles/v0.0.1/profiles.yml", Body=profiles_content.encode()
    )
    profiles_path = hook.get_dbt_profiles(
        f"s3://{s3_bucket}/profiles/v0.0.1",
        profiles_dir=str(tmpdir),
    )
    assert profiles_path.exists()
    with open(profiles_path) as f:
        result = f.read()
    assert result == profiles_content

def test_get_dbt_profiles_sub_dir_trailing_slash(s3_bucket, tmpdir, profiles_file):
    """
    Test whether an S3 path with a trailing slash successfully pulls a dbt profile
    """
    hook = DbtS3Hook()
    bucket = hook.get_bucket(s3_bucket)
    with open(profiles_file) as pf:
        profiles_content = pf.read()
    bucket.put_object(
        Key="profiles/v0.0.1/profiles.yml", Body=profiles_content.encode()
    )
    profiles_path = hook.get_dbt_profiles(
        f"s3://{s3_bucket}/profiles/v0.0.1/",
        profiles_dir=str(tmpdir),
    )
    assert profiles_path.exists()
    with open(profiles_path) as f:
        result = f.read()
    assert result == profiles_content

def test_get_dbt_project(s3_bucket, tmpdir, dbt_project_file):
    """Test pulling dbt project from S3 path"""
    hook = DbtS3Hook()
    bucket = hook.get_bucket(s3_bucket)
    with open(dbt_project_file) as pf:
        project_content = pf.read()
    bucket.put_object(Key="project/dbt_project.yml", Body=project_content.encode())
    bucket.put_object(Key="project/models/a_model.sql", Body=b"SELECT 1")
    bucket.put_object(Key="project/models/another_model.sql", Body=b"SELECT 2")
    bucket.put_object(Key="project/data/a_seed.csv", Body=b"col1,col2\n1,2")
    project_path = hook.get_dbt_project(
        f"s3://{s3_bucket}/project/",
        project_dir=str(tmpdir),
    )
    assert project_path.exists()
    dir_contents = [f for f in project_path.iterdir()]
    assert sorted(str(f.name) for f in dir_contents) == [
        "data",
        "dbt_project.yml",
        "models",
    ]
    with open(project_path / "dbt_project.yml") as f:
        result = f.read()
    assert result == project_content
    with open(project_path / "models" / "a_model.sql") as f:
        result = f.read()
    assert result == "SELECT 1"
    with open(project_path / "models" / "another_model.sql") as f:
        result = f.read()
    assert result == "SELECT 2"
    with open(project_path / "data" / "a_seed.csv") as f:
        result = f.read()
    assert result == "col1,col2\n1,2"

def test_get_dbt_project_no_trailing_slash(s3_bucket, tmpdir, dbt_project_file):
    """
    Test whether an S3 path without a trailing slash successfully pulls a dbt project
    """
    hook = DbtS3Hook()
    bucket = hook.get_bucket(s3_bucket)
    with open(dbt_project_file) as pf:
        project_content = pf.read()
    bucket.put_object(Key="project/dbt_project.yml", Body=project_content.encode())
    bucket.put_object(Key="project/models/a_model.sql", Body=b"SELECT 1")
    bucket.put_object(Key="project/models/another_model.sql", Body=b"SELECT 2")
    bucket.put_object(Key="project/data/a_seed.csv", Body=b"col1,col2\n1,2")
    project_path = hook.get_dbt_project(
        f"s3://{s3_bucket}/project",
        project_dir=str(tmpdir),
    )
    assert project_path.exists()
    dir_contents = [f for f in project_path.iterdir()]
    assert sorted(str(f.name) for f in dir_contents) == [
        "data",
        "dbt_project.yml",
        "models",
    ]
    with open(project_path / "dbt_project.yml") as f:
        result = f.read()
    assert result == project_content
    with open(project_path / "models" / "a_model.sql") as f:
        result = f.read()
    assert result == "SELECT 1"
    with open(project_path / "models" / "another_model.sql") as f:
        result = f.read()
    assert result == "SELECT 2"
    with open(project_path / "data" / "a_seed.csv") as f:
        result = f.read()
    assert result == "col1,col2\n1,2"
| 29.680723 | 85 | 0.659225 | 693 | 4,927 | 4.471861 | 0.124098 | 0.054856 | 0.053243 | 0.063892 | 0.938367 | 0.906099 | 0.889319 | 0.871249 | 0.871249 | 0.871249 | 0 | 0.017612 | 0.216359 | 4,927 | 165 | 86 | 29.860606 | 0.78503 | 0.048508 | 0 | 0.762712 | 0 | 0 | 0.169572 | 0.091457 | 0 | 0 | 0 | 0 | 0.152542 | 1 | 0.042373 | false | 0 | 0.025424 | 0 | 0.067797 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fa75ab7e8577c5140f04b3b2ebe3b34465b95593 | 19,708 | py | Python | posts/tests/test_views.py | CMPUT404W22AMNRY/CMPUT404-project-socialdistribution | 61d5c8aa2c7f038c137fc86c8b194d92a33d90e3 | [
"W3C-20150513"
] | 1 | 2022-01-14T04:37:54.000Z | 2022-01-14T04:37:54.000Z | posts/tests/test_views.py | CMPUT404W22AMNRY/CMPUT404-project-socialdistribution | 61d5c8aa2c7f038c137fc86c8b194d92a33d90e3 | [
"W3C-20150513"
] | 88 | 2022-02-19T00:16:44.000Z | 2022-03-29T03:05:08.000Z | posts/tests/test_views.py | CMPUT404W22AMNRY/CMPUT404-project-socialdistribution | 61d5c8aa2c7f038c137fc86c8b194d92a33d90e3 | [
"W3C-20150513"
] | null | null | null | from .constants import COMMENT_DATA, COMMONMARK_POST_DATA, POST_DATA
from api.tests.test_api import TEST_PASSWORD, TEST_USERNAME
from api.tests.constants import SAMPLE_REMOTE_AUTHOR, SAMPLE_REMOTE_POST
from servers.models import Server
from requests import Response
from django.urls import reverse
from posts.models import CommentLike, Post, Category, ContentType, Comment, Like, RemoteComment
import json
from unittest.mock import MagicMock, patch
from django.test import TestCase, Client
from django.contrib.auth import get_user_model
EDITED_POST_DATA = POST_DATA.copy()
EDITED_POST_DATA['content_type'] = ContentType.MARKDOWN

class CreatePostViewTests(TestCase):
    def setUp(self) -> None:
        self.client = Client()
        get_user_model().objects.create_user(username='bob', password='password')

    def test_new_post_page(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:new'))
        self.assertEqual(res.status_code, 200)
        self.assertTemplateUsed('posts/create_post.html')

    def test_new_post_require_login(self):
        res = self.client.get(reverse('posts:new'))
        self.assertEqual(res.status_code, 302)

    def test_new_post_require_csrf(self):
        csrf_client = Client(enforce_csrf_checks=True)
        csrf_client.login(username='bob', password='password')
        res = csrf_client.post(reverse('posts:new'), data=POST_DATA)
        self.assertEqual(res.status_code, 403)

    def test_new_post(self):
        self.client.login(username='bob', password='password')
        initial_post_count = len(Post.objects.all())
        self.client.post(reverse('posts:new'), data=POST_DATA)
        self.assertEqual(len(Post.objects.all()), initial_post_count + 1)

    def test_categories_not_duplicated(self):
        self.client.login(username='bob', password='password')
        Category.objects.create(category='web')
        initial_post_count = len(Post.objects.all())
        self.client.post(reverse('posts:new'), data=POST_DATA)
        self.assertEqual(len(Category.objects.all()), 2)
        self.assertEqual(len(Post.objects.all()), initial_post_count + 1)

class EditPostViewTests(TestCase):
    def setUp(self) -> None:
        self.client = Client()
        current_user = 'bob'
        get_user_model().objects.create_user(username=current_user, password='password')
        # Create test post to edit
        post = Post.objects.create(
            title=POST_DATA['title'],
            description=POST_DATA['description'],
            content_type=POST_DATA['content_type'],
            content=POST_DATA['content'],
            author_id=get_user_model().objects.get(
                username=current_user).id,
            unlisted=True)
        post.save()
        self.post_id = post.id

    def test_edit_post_page(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:edit', kwargs={'pk': self.post_id}))
        self.assertEqual(res.status_code, 200)
        self.assertTemplateUsed('posts/edit_post.html')

    def test_edit_post_require_login(self):
        res = self.client.get(reverse('posts:edit', kwargs={'pk': self.post_id}))
        self.assertEqual(res.status_code, 302)

    def test_edit_post_require_csrf(self):
        csrf_client = Client(enforce_csrf_checks=True)
        csrf_client.login(username='bob', password='password')
        res = csrf_client.post(reverse('posts:edit', kwargs={'pk': self.post_id}), data=EDITED_POST_DATA)
        self.assertEqual(res.status_code, 403)

    def test_edit_post(self):
        self.client.login(username='bob', password='password')
        res = self.client.post(reverse('posts:edit', kwargs={'pk': self.post_id}), data=EDITED_POST_DATA)
        self.assertEqual(res.status_code, 302)
        self.assertEqual(Post.objects.get(pk=self.post_id).content_type, EDITED_POST_DATA['content_type'])

    def test_edit_non_existing_post(self):
        self.client.login(username='bob', password='password')
        res = self.client.post(reverse('posts:edit', kwargs={'pk': 900}), data=EDITED_POST_DATA)
        self.assertEqual(res.status_code, 404)

    def test_edit_page_as_another_user(self):
        username = 'alice'
        password = TEST_PASSWORD
        get_user_model().objects.create_user(username=username, password=password)
        self.client.login(username=username, password=password)
        res = self.client.get(reverse('posts:edit', kwargs={'pk': self.post_id}))
        self.assertEqual(res.status_code, 403)

    def test_edit_as_another_user(self):
        username = 'alice'
        password = TEST_PASSWORD
        get_user_model().objects.create_user(username=username, password=password)
        self.client.login(username=username, password=password)
        res = self.client.post(reverse('posts:edit', kwargs={'pk': self.post_id}), data=EDITED_POST_DATA)
        self.assertEqual(res.status_code, 403)

class PostDetailViewTests(TestCase):
    def setUp(self) -> None:
        self.client = Client()
        self.user = get_user_model().objects.create_user(username='bob', password='password')
        self.post = Post.objects.create(
            title=POST_DATA['title'],
            description=POST_DATA['description'],
            content_type=POST_DATA['content_type'],
            content=POST_DATA['content'],
            author_id=self.user.id,
            unlisted=True)
        self.post2 = Post.objects.create(
            title=COMMONMARK_POST_DATA['title'],
            description=COMMONMARK_POST_DATA['description'],
            content_type=COMMONMARK_POST_DATA['content_type'],
            content=COMMONMARK_POST_DATA['content'],
            author_id=self.user.id,
            unlisted=True)
        self.post.save()

    def test_detail_view_page(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:detail', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 200)
        self.assertTemplateUsed(res, 'posts/post_detail.html')
        self.assertContains(res, self.post.title)
        self.assertContains(res, self.post.author.get_full_name())
        self.assertContains(res, self.post.content)
        for category in self.post.categories.all():
            self.assertContains(res, category.category)

    def test_comments_section(self):
        for _ in range(3):
            comment = Comment.objects.create(
                comment=COMMENT_DATA['comment'],
                author=self.user,
                content_type=COMMENT_DATA['content_type'],
                post=self.post,
            )
            comment.save()
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:detail', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 200)
        self.assertContains(res, 'Comments')
        self.assertContains(res, COMMENT_DATA['comment'], count=3)

    def test_like_section(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:detail', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 200)
        self.assertContains(res, 'Like')

    def test_like(self):
        self.client.login(username='bob', password='password')
        res = self.client.post(reverse('posts:like', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 302)
        self.assertIsNotNone(Like.objects.get(author=self.user, post=self.post))

    def test_like_require_csrf(self):
        csrf_client = Client(enforce_csrf_checks=True)
        csrf_client.login(username='bob', password='password')
        res = csrf_client.post(reverse('posts:like', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 403)

    def test_disable_like(self):
        Like.objects.create(author=self.user, post=self.post)
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:detail', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 200)
        self.assertNotContains(res, 'Like')

    def test_commonmark(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:detail', kwargs={'pk': self.post2.id}))
        self.assertEqual(res.status_code, 200)
        self.assertContains(res, '<h1>Heading 8-)</h1>')
        self.assertContains(res, '<p><strong>This is bold text!</strong></p>')

    def test_like_comment(self):
        comment = Comment.objects.create(
            comment=COMMENT_DATA['comment'],
            author=self.user,
            content_type=COMMENT_DATA['content_type'],
            post=self.post,
        )
        comment.save()
        self.assertEqual(len(comment.commentlike_set.all()), 0)
        self.client.login(username=TEST_USERNAME, password=TEST_PASSWORD)
        res = self.client.post(reverse('posts:like-comment', kwargs={'post_pk': self.post.id, 'pk': comment.id}))
        self.assertEqual(res.status_code, 302)
        self.assertEqual(len(comment.commentlike_set.all()), 1)

    def test_unlike_comment(self):
        comment = Comment.objects.create(
            comment=COMMENT_DATA['comment'],
            author=self.user,
            content_type=COMMENT_DATA['content_type'],
            post=self.post,
        )
        comment.save()
        comment_like = CommentLike.objects.create(
            author=self.user,
            comment=comment
        )
        comment_like.save()
        self.assertEqual(len(comment.commentlike_set.all()), 1)
        self.client.login(username=TEST_USERNAME, password=TEST_PASSWORD)
        res = self.client.post(reverse('posts:unlike-comment', kwargs={'post_pk': self.post.id, 'pk': comment.id}))
        self.assertEqual(res.status_code, 302)
        self.assertEqual(len(comment.commentlike_set.all()), 0)

    def test_share_post(self):
        alice = get_user_model().objects.create_user(username='alice', password='password')
        self.client.login(username=alice.username, password=alice.password)
        self.post = Post.objects.create(
            title=POST_DATA['title'],
            description=POST_DATA['description'],
            content_type=POST_DATA['content_type'],
            content=POST_DATA['content'],
            author_id=alice.id,
            unlisted=True)
        self.post.save()
        initial_post_count = len(Post.objects.all())
        self.client.login(username=TEST_USERNAME, password=TEST_PASSWORD)
        self.shared_post = Post.objects.create(
            title=POST_DATA['title'],
            description=POST_DATA['description'],
            content_type=POST_DATA['content_type'],
            content=POST_DATA['content'],
            author_id=alice.id,
            original_author=self.user,
            unlisted=True)
        self.shared_post.save()
        res = self.client.get(reverse('posts:my-posts'))
        self.assertEqual(res.status_code, 200)
        self.assertEqual(len(Post.objects.all()), initial_post_count + 1)

    def test_contains_remote_comments(self):
        author = json.loads(SAMPLE_REMOTE_AUTHOR)
        author_url = author.get('url')
        remote_comment = RemoteComment.objects.create(
            comment=COMMENT_DATA['comment'],
            author_url=author_url,
            content_type=COMMENT_DATA['content_type'],
            post=self.post,
        )
        remote_comment.save()
        self.client.login(username=TEST_USERNAME, password=TEST_PASSWORD)
        mock_response = Response()
        mock_response.json = MagicMock(return_value=author)
        mock_server = Server(
            service_address="https://cmput-404-w22-project-group09.herokuapp.com/service",
            username="hello",
            password="no",
        )
        mock_server.get = MagicMock(return_value=mock_response)
        with patch('servers.models.Server.objects') as MockServerObjects:
            MockServerObjects.all.return_value = [mock_server]
            self.client.login(username=TEST_USERNAME, password=TEST_PASSWORD)
            res = self.client.get(reverse('posts:detail', kwargs={'pk': self.post.id}))
        remote_comment_count = len(RemoteComment.objects.filter(post=self.post))
        self.assertTemplateUsed(res, 'posts/partials/_remote_comment.html', count=remote_comment_count)

class RemotePostDetailView(TestCase):
    def setUp(self) -> None:
        self.user = get_user_model().objects.create_user(username=TEST_USERNAME, password=TEST_PASSWORD)

    def test_remote_detail_view_page(self):
        mock_json_response = json.loads(SAMPLE_REMOTE_POST)
        mock_response = Response()
        mock_response.json = MagicMock(return_value=mock_json_response)
        mock_server = Server(
            service_address="http://localhost:5555/api/v2",
            username="hello",
            password="no",
        )
        mock_server.get = MagicMock(return_value=mock_response)
        with patch('servers.models.Server.objects') as MockServerObjects:
            MockServerObjects.all.return_value = [mock_server]
            self.client.login(username=TEST_USERNAME, password=TEST_PASSWORD)
            res = self.client.get(
                reverse(
                    'posts:remote-detail',
                    kwargs={
                        'url': 'http://localhost:5555/api/v2/authors/1/posts/1/'}))
            self.assertEqual(res.status_code, 200)
            self.assertContains(res, mock_json_response['title'])
            self.assertContains(res, mock_json_response['author']['display_name'])

    def test_contains_remote_comments(self):
        mock_json_response = json.loads(SAMPLE_REMOTE_POST)
        mock_response = Response()
        mock_response.json = MagicMock(return_value=mock_json_response)
        mock_server = Server(
            service_address="http://localhost:5555/api/v2",
            username="hello",
            password="no",
        )
        mock_server.get = MagicMock(return_value=mock_response)
        with patch('servers.models.Server.objects') as MockServerObjects:
            MockServerObjects.all.return_value = [mock_server]
            self.client.login(username=TEST_USERNAME, password=TEST_PASSWORD)
            res = self.client.get(
                reverse(
                    'posts:remote-detail',
                    kwargs={
                        'url': 'http://localhost:5555/api/v2/authors/1/posts/1/'}))
            self.assertEqual(res.status_code, 200)
            self.assertTemplateUsed(res, 'posts/partials/_remote_comment.html')
            for comment in mock_json_response['comment_src']:
                self.assertContains(res, comment['comment'])
                self.assertContains(res, comment['published'])
                self.assertContains(res, comment['author']['display_name'])

class PostDeleteViewTests(TestCase):
    def setUp(self) -> None:
        self.client = Client()
        self.user = get_user_model().objects.create_user(username='bob', password='password')
        self.post = Post.objects.create(
            title=POST_DATA['title'],
            description=POST_DATA['description'],
            content_type=POST_DATA['content_type'],
            content=POST_DATA['content'],
            author_id=self.user.id,
            unlisted=True)
        self.post.save()

    def test_post_delete_view_page(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:delete', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 200)
        self.assertTemplateUsed(res, 'posts/delete_post.html')
        self.assertContains(res, self.post.title)

    def test_post_delete_view(self):
        initial_post_count = len(Post.objects.all())
        self.client.login(username='bob', password='password')
        res = self.client.post(reverse('posts:delete', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 302)
        self.assertEqual(len(Post.objects.all()), initial_post_count - 1)

class CreateCommentViewTests(TestCase):
    def setUp(self) -> None:
        self.client = Client()
        self.user = get_user_model().objects.create_user(username='bob', password='password')
        self.post = Post.objects.create(
            title=POST_DATA['title'],
            description=POST_DATA['description'],
            content_type=POST_DATA['content_type'],
            content=POST_DATA['content'],
            author_id=self.user.id,
            unlisted=True)
        self.post.save()

    def test_new_comment_page(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:new-comment', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 200)
        self.assertTemplateUsed(res, 'comments/create_comment.html')

    def test_new_comment(self):
        self.client.login(username='bob', password='password')
        res = self.client.post(reverse('posts:new-comment', kwargs={'pk': self.post.id}), data=COMMENT_DATA)
        self.assertEqual(res.status_code, 302)
        self.assertRedirects(res, reverse('posts:detail', kwargs={'pk': self.post.id}))
        self.assertEqual(len(self.post.comment_set.all()), 1)

    def test_new_comment_require_login(self):
        res = self.client.get(reverse('posts:edit', kwargs={'pk': self.post.id}))
        self.assertEqual(res.status_code, 302)

    def test_new_comment_require_csrf(self):
        csrf_client = Client(enforce_csrf_checks=True)
        csrf_client.login(username='bob', password='password')
        res = csrf_client.post(reverse('posts:new-comment', kwargs={'pk': self.post.id}), data=COMMENT_DATA)
        self.assertEqual(res.status_code, 403)

class MyPostsViewTests(TestCase):
    def setUp(self) -> None:
        self.client = Client()
        self.user = get_user_model().objects.create_user(username='bob', password='password')
        alice = get_user_model().objects.create_user(username='alice', password='password')
        self.posts: list[Post] = []
        self.posts_per_user = 2
        for user in [self.user, alice]:
            for _ in range(self.posts_per_user):
                post = Post.objects.create(
                    title=POST_DATA['title'],
                    description=POST_DATA['description'],
                    content_type=POST_DATA['content_type'],
                    content=POST_DATA['content'],
                    author_id=user.id,
                    unlisted=True)
                post.save()

    def test_post_list_page(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:my-posts'))
        self.assertEqual(res.status_code, 200)
        self.assertTemplateUsed(res, 'posts/post_list.html')

    def test_post_list_empty_page(self):
        Post.objects.filter(author=self.user).delete()
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:my-posts'))
        self.assertEqual(res.status_code, 200)
        self.assertContains(res, 'No posts yet')

    def test_post_list(self):
        self.client.login(username='bob', password='password')
        res = self.client.get(reverse('posts:my-posts'))
        self.assertEqual(res.status_code, 200)
        self.assertContains(res, POST_DATA['title'], self.posts_per_user)

    def test_new_comment_require_login(self):
        res = self.client.get(reverse('posts:my-posts'))
        self.assertEqual(res.status_code, 302)
| 42.843478 | 115 | 0.651157 | 2,364 | 19,708 | 5.240271 | 0.071912 | 0.053277 | 0.050614 | 0.061995 | 0.808605 | 0.782612 | 0.756377 | 0.743704 | 0.716581 | 0.683323 | 0 | 0.009623 | 0.219606 | 19,708 | 459 | 116 | 42.936819 | 0.795839 | 0.001218 | 0 | 0.629243 | 0 | 0 | 0.099075 | 0.012753 | 0 | 0 | 0 | 0 | 0.185379 | 1 | 0.109661 | false | 0.130548 | 0.028721 | 0 | 0.156658 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
fabe8f3b4e81d73dbca13b3c754135259b3231d5 | 784 | py | Python | examples/service/handlers/secondService.py | BSlience/fastweb | 2c1b956e9846c4205d0201d39d09891d088754e4 | ["Apache-2.0"] | 123 | 2017-06-06T04:59:07.000Z | 2019-07-11T10:20:35.000Z
# coding:utf8
"""generate by fasthrift"""
from fastweb.service import ABLogic
class secondServiceHandler(ABLogic):

    def sayHello(self):
        pass

    def getData(self):
        pass
ea015cd3ae7c3bf7b769db882ddfd8d7c3774afe | 32 | py | Python | torch_geometric_signed_directed/nn/general/__init__.py | huangjunjie-cs/pytorch_geometric_signed_directed-1 | 24b121ff4325d201b30811975bcb6f104a39dc35 | ["MIT"]
from .conv_base import Conv_Base
ea1e9ac4ec5fa72718dac11c46a9297ac91d5678 | 343 | py | Python | Chapter 3/changing_guest_list.py | WilliamJaber/Python-Crash-Course | d87621785011039fbe0b42f0d8b6cd2364246577 | ["MIT"] | 5 | 2021-09-22T16:53:47.000Z | 2022-03-24T00:56:49.000Z
guest_list = ['Alex', 'Dan', 'Edd']
guest_list[-1] = 'Dave'
print(f'Hey {guest_list[0]}! you are invited for my celebration dinner next week.')
print(f'Hey {guest_list[1]}! you are invited for my celebration dinner next week.')
print(f'Hey {guest_list[2]}! you are invited for my celebration dinner next week.')
print("Edd can't make it.")
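The three near-identical print calls can also be driven by a single loop over the list; a minimal variant (the book's exercise prints each invitation explicitly, so this is just an alternative sketch):

```python
guest_list = ['Alex', 'Dan', 'Dave']  # Dave already replaced Edd at index -1
for guest in guest_list:
    # one invitation per guest, same message template as above
    print(f'Hey {guest}! you are invited for my celebration dinner next week.')
```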
ea420e7910ebbbc1b4698230c96938e2ffd1ee63 | 185 | py | Python | square.py | shuquan/luca-s-toy | 057d9bfaf792de7bccb12e51ca3efa5146fad039 | ["Apache-2.0"]
import turtle
luca = turtle.Turtle()
luca.forward(50)
luca.right(90)
luca.forward(50)
luca.right(90)
luca.forward(50)
luca.right(90)
luca.forward(50)
luca.right(90)
turtle.done()
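The four forward/right pairs trace a closed square. The same walk can be verified without a graphics window by tracking the pen's position and heading by hand (a sketch, not part of the repo):

```python
import math

def walk(steps, start=(0.0, 0.0), heading=0.0):
    """Replay (forward, right-turn) commands; return the visited points."""
    x, y = start
    points = [(x, y)]
    for dist, turn in steps:
        x += dist * math.cos(math.radians(heading))
        y += dist * math.sin(math.radians(heading))
        points.append((round(x, 6), round(y, 6)))
        heading -= turn  # turtle.right() turns clockwise
    return points

# the same four forward(50) / right(90) pairs as the script above
square = walk([(50, 90)] * 4)
assert square[0] == square[-1]  # the path closes back on the start
```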
ea6d995abcc30475e962f89c562338abf3cbbca7 | 132 | py | Python | apps/__init__.py | anthill-arch/framework | a6c238a62ae9c3fb319d12e77f7e9047aab75e8d | ["MIT"] | 1 | 2018-11-30T21:56:14.000Z | 2018-11-30T21:56:14.000Z
from anthill.framework.apps.builder import app
from anthill.framework.apps.cls import Application
__all__ = ['app', 'Application']
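`__all__` here pins down exactly what `from ... import *` re-exports. A minimal illustration with a throwaway module (the names below are made up):

```python
import types

# build a throwaway module object to show the effect of __all__
mod = types.ModuleType('demo')
mod.app = object()
mod.Application = type('Application', (), {})
mod._internal = 'not exported'
mod.__all__ = ['app', 'Application']

# `from demo import *` would bind exactly the names listed in __all__
exported = {name: getattr(mod, name) for name in mod.__all__}
assert set(exported) == {'app', 'Application'}
```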
ea8b4067038868dec5d59c618d7d0279796bf90e | 125 | py | Python | web_progress/models/__init__.py | redblow/odooDingDing | d5bea9d69889615819d82d94924a5d54b498db03 | ["Apache-2.0"] | 6 | 2019-10-04T01:57:03.000Z | 2021-10-25T00:53:27.000Z
from . import base
from . import base_import
from . import ir_actions_report
from . import ir_cron
from . import web_progress
578b14792d41cff4f4aec2bbc43c4de6917e75f0 | 33,440 | py | Python | apps/space/views.py | tiposaurio/venton | 23660aba99c520e6da55982b92a6d3a3a9a6855e | ["BSD-3-Clause"] | 1 | 2021-07-13T22:11:42.000Z | 2021-07-13T22:11:42.000Z
# _*_ coding: utf-8 _*_
"""
@copyright Copyright (c) 2014 Submit Consulting
@author Angel Sullon (@asullom)
@package space
Description: implementation of the view controllers for the space app
"""
import logging
log = logging.getLogger(__name__)
from django.utils.translation import ugettext as _ # , ungettext
from django.utils.encoding import force_text
from django.utils.text import capfirst, get_text_list
from django.utils.decorators import method_decorator
from django.contrib.auth.decorators import login_required
from django.contrib import messages
from django.views import generic
from django.db import transaction
from django.db.models import Avg, Max, Min, Count
from django.http import HttpResponseRedirect
#from django.http import Http404
#from django.shortcuts import render, render_to_response, redirect, get_object_or_404
from django.core.urlresolvers import reverse_lazy
from django.conf import settings
from apps.utils.forms import empty
from apps.utils.security import SecurityKey, log_params, get_dep_objects, UserToken
from apps.utils.decorators import permission_resource_required
# models
from .models import Solution, Association, Enterprise, Headquar
# forms
from apps.space.forms import SolutionForm, AssociationForm, EnterpriseForm, \
HeadquarForm, HeadquarAssociationForm
# others
import datetime
import json
#from unicodedata import normalize
#from django.utils import translation
#from django.utils import timezone
#from django.utils.timezone import get_current_timezone
#from django.conf import settings
# http://ccbv.co.uk/projects/Django/1.6/django.views.generic.edit
#context_object_name = 'pagex_obj'
# https://djangosnippets.org/snippets/73/ paginator
#from django.utils.html import escape, escapejs
#from django.core.exceptions import ObjectDoesNotExist
from django.core.mail import send_mail
from django.contrib.sites.models import Site
from django.template.loader import render_to_string
class HeadquarAssociationUpdateView(generic.edit.UpdateView):
""" """
model = Headquar
form_class = HeadquarAssociationForm
success_url = reverse_lazy('space:headquar-list')
template_name = 'space/headquarassociation_form.html'
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs.get(self.pk_url_kwarg, None)
pk = SecurityKey.is_valid_key(request, key, 'headquar_uas')
if not pk:
return HttpResponseRedirect(self.success_url)
self.kwargs['pk'] = pk
try:
self.get_object()
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
return super(HeadquarAssociationUpdateView, self).dispatch(request, *args, **kwargs)
def get_context_data(self, **kwargs):
context = super(
HeadquarAssociationUpdateView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'headquar'
context['title'] = _('Change %s') % _('Association')
        association_name_list = '[]'  # fall back to an empty JSON list if the query fails
        try:
            association_name_list = json.dumps(
                [col["name"] for col in Association.objects.values("name").filter(is_active=True).order_by("name")])
        except Exception, e:
            messages.error(self.request, e)
        context['association_name_list'] = association_name_list
        return context
def get_initial(self):
initial = super(HeadquarAssociationUpdateView, self).get_initial()
initial = initial.copy()
initial['association_name'] = self.object.association.name
return initial
# TODO msg
def form_valid(self, form):
try:
association_name = self.request.POST.get("association_name")
try:
form.instance.association = Association.objects.get(
name=association_name)
except:
                raise Exception(
                    u"The association <b>%s</b> does not exist, please try again" %
                    (self.request.POST.get("association_name")))
            # save the record
self.object = form.save(commit=True)
msg = _('The %(name)s "%(obj)s" was changed successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object)
}
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
return super(HeadquarAssociationUpdateView, self).form_valid(form)
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return super(HeadquarAssociationUpdateView, self).form_invalid(form)
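Each `dispatch()` above first resolves an opaque key that is only valid for one action name (`'headquar_uas'`, `'headquar_upd'`, ...). The project's `SecurityKey` helper itself is not shown in this file; one way such action-scoped keys can work, sketched with a hypothetical HMAC scheme (names and secret are assumptions, not the project's implementation):

```python
import hashlib
import hmac

SECRET = b'server-side-secret'  # hypothetical per-deployment secret

def make_key(pk, action):
    """Derive an opaque key that binds a pk to one action name."""
    mac = hmac.new(SECRET, ('%s:%s' % (pk, action)).encode(), hashlib.sha256)
    return '%s.%s' % (pk, mac.hexdigest())

def is_valid_key(key, action):
    """Return the pk if the key was issued for this action, else None."""
    pk, _, _mac = key.partition('.')
    if hmac.compare_digest(make_key(pk, action), key):
        return pk
    return None
```

A key minted for `'headquar_upd'` then fails validation for any other action, which is why the views bail out with a redirect when `is_valid_key` returns a falsy value.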
class HeadquarUpdateActiveView(generic.View):
""" """
model = Headquar
success_url = reverse_lazy('space:headquar-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs['pk']
state = self.kwargs['state']
pk = SecurityKey.is_valid_key(request, key, 'headquar_%s' % state)
if not pk:
return HttpResponseRedirect(self.success_url)
try:
self.object = self.model.objects.get(pk=pk)
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
msg = _('The %(name)s "%(obj)s" was %(action)s successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object),
'action': (_('reactivated') if state == 'rea' else _('inactivated'))
}
mse = _('The %(name)s "%(obj)s" is already %(action)s.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object),
'action': (_('active') if state == 'rea' else _('inactive'))
}
try:
if state == 'ina' and not self.object.is_active:
raise Exception(mse)
else:
if state == 'rea' and self.object.is_active:
raise Exception(mse)
else:
self.object.is_active = (True if state == 'rea' else False)
self.object.save()
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
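The rea/ina branching above reappears verbatim in EnterpriseUpdateActiveView and SolutionUpdateActiveView below; its core is a guarded state flip that could be factored into one helper. A framework-free sketch with a stand-in object (not the project's code):

```python
def toggle_active(obj, state):
    """Flip obj.is_active: 'rea' reactivates, 'ina' inactivates.
    Raises ValueError when the object is already in the requested state."""
    target = (state == 'rea')
    if obj.is_active == target:
        raise ValueError('already %s' % ('active' if target else 'inactive'))
    obj.is_active = target

class FakeModel(object):  # stand-in for a Django model instance
    is_active = False

obj = FakeModel()
toggle_active(obj, 'rea')   # inactive -> active
assert obj.is_active
```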
class HeadquarCreateView(generic.edit.CreateView):
""" """
model = Headquar
form_class = HeadquarForm
success_url = reverse_lazy('space:headquar-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
return super(HeadquarCreateView, self).dispatch(request, *args, **kwargs)
def get_context_data(self, **kwargs):
context = super(HeadquarCreateView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'headquar'
context['title'] = _('Add %s') % _('Headquar')
return context
def form_valid(self, form):
try:
form.instance.association_id = UserToken.get_association_id(
self.request.session)
form.instance.enterprise_id = UserToken.get_enterprise_id(
self.request.session)
self.object = form.save(commit=True)
msg = _('The %(name)s "%(obj)s" was added successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object)
}
if self.object.id:
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
return super(HeadquarCreateView, self).form_valid(form)
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return super(HeadquarCreateView, self).form_invalid(form)
class HeadquarUpdateView(generic.edit.UpdateView):
""" """
model = Headquar
form_class = HeadquarForm
success_url = reverse_lazy('space:headquar-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs.get(self.pk_url_kwarg, None)
pk = SecurityKey.is_valid_key(request, key, 'headquar_upd')
if not pk:
return HttpResponseRedirect(self.success_url)
self.kwargs['pk'] = pk
try:
self.get_object()
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
return super(HeadquarUpdateView, self).dispatch(request, *args, **kwargs)
def get_context_data(self, **kwargs):
context = super(HeadquarUpdateView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'headquar'
context['title'] = _('Change %s') % _('Headquar')
return context
def form_valid(self, form):
try:
self.object = form.save(commit=True)
msg = _('The %(name)s "%(obj)s" was changed successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object)
}
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
return super(HeadquarUpdateView, self).form_valid(form)
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return super(HeadquarUpdateView, self).form_invalid(form)
class HeadquarListView(generic.ListView):
""" """
model = Headquar
paginate_by = settings.PER_PAGE
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
enterprise_id = UserToken.get_enterprise_id(request.session)
msg = _(u'%s is not selected or not found in the database.') % _(
'Enterprise')
try:
Enterprise.objects.get(pk=enterprise_id)
except Exception, e:
messages.error(self.request, e)
messages.warning(self.request, msg)
return HttpResponseRedirect(reverse_lazy('accounts:index'))
return super(HeadquarListView, self).dispatch(request, *args, **kwargs)
def get_paginate_by(self, queryset):
if 'all' in self.request.REQUEST:
return None
return generic.ListView.get_paginate_by(self, queryset)
def get_queryset(self):
self.o = empty(self.request, 'o', '-id')
self.f = empty(self.request, 'f', 'name')
self.q = empty(self.request, 'q', '')
column_contains = u'%s__%s' % (self.f, 'contains')
return self.model.objects.filter(
enterprise_id=UserToken.get_enterprise_id(self.request.session)
).filter(**{column_contains: self.q}).order_by(self.o).distinct()
def get_context_data(self, **kwargs):
context = super(HeadquarListView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'headquar'
context['title'] = _('Select %s to change') % _('Headquar')
context['o'] = self.o
context['f'] = self.f
context['q'] = self.q.replace('/', '-')
return context
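`get_queryset` builds the ORM lookup name at runtime (`u'%s__%s' % (self.f, 'contains')`) and splats it into `filter()`. The kwargs construction on its own, outside Django:

```python
# the dynamic-lookup trick used by get_queryset: build the keyword name at
# runtime, then pass it with ** so filter() sees e.g. name__contains='Lima'
def search_kwargs(field, value, op='contains'):
    return {u'%s__%s' % (field, op): value}

kwargs = search_kwargs('name', 'Lima')
assert kwargs == {u'name__contains': 'Lima'}
# Model.objects.filter(**kwargs) is then equivalent to
# Model.objects.filter(name__contains='Lima')
```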
# region Enterprise OK
class EnterpriseUpdateActiveView(generic.View):
""" """
model = Enterprise
success_url = reverse_lazy('space:enterprise-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs['pk']
state = self.kwargs['state']
pk = SecurityKey.is_valid_key(request, key, 'enterprise_%s' % state)
if not pk:
return HttpResponseRedirect(self.success_url)
try:
self.object = self.model.objects.get(pk=pk)
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
msg = _('The %(name)s "%(obj)s" was %(action)s successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object),
'action': (_('reactivated') if state == 'rea' else _('inactivated'))
}
mse = _('The %(name)s "%(obj)s" is already %(action)s.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object),
'action': (_('active') if state == 'rea' else _('inactive'))
}
try:
if state == 'ina' and not self.object.is_active:
raise Exception(mse)
else:
if state == 'rea' and self.object.is_active:
raise Exception(mse)
else:
self.object.is_active = (True if state == 'rea' else False)
self.object.save()
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
class EnterpriseDeleteView(generic.edit.BaseDeleteView):
""" Elimina empresa con todas sus sedes """
model = Enterprise
success_url = reverse_lazy('space:enterprise-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs['pk']
pk = SecurityKey.is_valid_key(request, key, 'enterprise_del')
if not pk:
return HttpResponseRedirect(self.success_url)
self.kwargs['pk'] = pk
try:
self.get_object()
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
return super(EnterpriseDeleteView, self).dispatch(request, *args, **kwargs)
# TODO msg
@transaction.atomic
def delete(self, request, *args, **kwargs):
sid = transaction.savepoint()
try:
association = Association.objects.get(
id=UserToken.get_association_id(request.session))
if Enterprise.objects.filter(headquar__association_id=UserToken.get_association_id(request.session)).count() == 1:
                raise Exception(
                    (u"Association <b>%(name)s</b> cannot be left without any associated headquarters.") % {"name": association.name})
d = self.get_object()
            # check for dependent objects before deleting
deps, msg = get_dep_objects(d)
if deps:
messages.warning(self.request, _('Cannot delete %(name)s') % {
"name": capfirst(force_text(self.model._meta.verbose_name))
+ ' "' + force_text(d) + '"'
})
raise Exception(msg)
d.delete()
msg = _('The %(name)s "%(obj)s" was deleted successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(d)
}
if not d.id:
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
except Exception, e:
try:
transaction.savepoint_rollback(sid)
except:
pass
messages.error(request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
def get(self, request, *args, **kwargs):
return self.delete(request, *args, **kwargs)
class EnterpriseCreateView(generic.edit.CreateView):
""" """
model = Enterprise
form_class = EnterpriseForm
success_url = reverse_lazy('space:enterprise-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
return super(EnterpriseCreateView, self).dispatch(request, *args, **kwargs)
def get_context_data(self, **kwargs):
context = super(EnterpriseCreateView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'enterprise'
context['title'] = _('Add %s') % _('Enterprise')
return context
def get_form_kwargs(self):
kwargs = super(EnterpriseCreateView, self).get_form_kwargs()
kwargs['create'] = True
return kwargs
@transaction.atomic
def form_valid(self, form):
sid = transaction.savepoint()
try:
self.object = form.save(commit=True)
headquar = Headquar()
headquar.name = self.request.POST.get("sede")
headquar.association_id = UserToken.get_association_id(
self.request.session)
headquar.enterprise = self.object
headquar.save()
msg = _('The %(name)s "%(obj)s" was added successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object)
}
if self.object.id:
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
return super(EnterpriseCreateView, self).form_valid(form)
except Exception, e:
try:
transaction.savepoint_rollback(sid)
except:
pass
            messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return super(EnterpriseCreateView, self).form_invalid(form)
class EnterpriseListView(generic.ListView):
""" """
model = Enterprise
paginate_by = settings.PER_PAGE
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
association_id = UserToken.get_association_id(request.session)
msg = _(u'%s is not selected or not found in the database.') % _(
'Association')
try:
Association.objects.get(pk=association_id)
except Exception, e:
messages.error(self.request, e)
messages.warning(self.request, msg)
return HttpResponseRedirect(reverse_lazy('accounts:index'))
return super(EnterpriseListView, self).dispatch(request, *args, **kwargs)
def get_paginate_by(self, queryset):
if 'all' in self.request.REQUEST:
return None
return generic.ListView.get_paginate_by(self, queryset)
def get_queryset(self):
self.o = empty(self.request, 'o', '-id')
self.f = empty(self.request, 'f', 'name')
self.q = empty(self.request, 'q', '')
column_contains = u'%s__%s' % (self.f, 'contains')
return self.model.objects.filter(
headquar__association_id=UserToken.get_association_id(
self.request.session)
).annotate(num_sedes=Count("headquar")).filter(
**{column_contains: self.q}).order_by(self.o).distinct()
def get_context_data(self, **kwargs):
context = super(EnterpriseListView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'enterprise'
context['title'] = _('Select %s to change') % _('Enterprise')
context['o'] = self.o
context['f'] = self.f
context['q'] = self.q.replace('/', '-')
return context
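The `annotate(num_sedes=Count("headquar"))` above asks the database to attach a per-enterprise headquarters count to each row. The same aggregation in plain Python, on toy rows with a hypothetical foreign-key field:

```python
from collections import Counter

# toy rows standing in for Headquar records (enterprise foreign key only)
headquars = [
    {'enterprise_id': 1},
    {'enterprise_id': 1},
    {'enterprise_id': 2},
]
num_sedes = Counter(h['enterprise_id'] for h in headquars)
assert num_sedes[1] == 2 and num_sedes[2] == 1
```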
class EnterpriseUpdateView(generic.edit.UpdateView):
""" """
model = Enterprise
form_class = EnterpriseForm
success_url = reverse_lazy('space:enterprise-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs.get('pk', None)
if key:
pk = SecurityKey.is_valid_key(request, key, 'enterprise_upd')
if not pk:
return HttpResponseRedirect(self.success_url)
self.kwargs['pk'] = pk
try:
self.get_object()
except Exception, e:
messages.error(self.request, e)
return HttpResponseRedirect(self.success_url)
else:
self.kwargs['pk'] = UserToken.get_enterprise_id(request.session)
self.success_url = reverse_lazy('space:enterprise-edit_current')
msg = _(u'%s is not selected or not found in the database.') % _(
'Enterprise')
try:
self.get_object()
except Exception, e:
messages.error(self.request, e)
messages.warning(self.request, msg)
return HttpResponseRedirect(reverse_lazy('accounts:index'))
return super(EnterpriseUpdateView, self).dispatch(request, *args, **kwargs)
def get_context_data(self, **kwargs):
context = super(EnterpriseUpdateView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'enterprise'
context['title'] = _('Change %s') % (_('Enterprise') + ': ' +
force_text(self.get_object()))
return context
def form_valid(self, form):
self.object = form.save(commit=True)
msg = _('The %(name)s "%(obj)s" was changed successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object)
}
messages.success(self.request, msg)
log.warning(force_text(msg), extra=log_params(self.request))
return super(EnterpriseUpdateView, self).form_valid(form)
class AssociationUpdateView(generic.edit.UpdateView):
""" """
model = Association
form_class = AssociationForm
success_url = reverse_lazy('space:association-edit_current')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
self.kwargs['pk'] = UserToken.get_association_id(request.session)
msg = _(u'%s is not selected or not found in the database.') % _(
'Association')
try:
self.get_object()
except Exception, e:
messages.error(self.request, e)
messages.warning(self.request, msg)
return HttpResponseRedirect(reverse_lazy('accounts:index'))
return super(AssociationUpdateView, self).dispatch(request, *args, **kwargs)
def get_context_data(self, **kwargs):
context = super(AssociationUpdateView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'association'
context['title'] = _('Change %s') % (_('Association') + ': ' +
force_text(self.get_object()))
return context
def form_valid(self, form):
self.object = form.save(commit=True)
msg = _('The %(name)s "%(obj)s" was changed successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object)
}
messages.success(self.request, msg)
log.warning(force_text(msg), extra=log_params(self.request))
return super(AssociationUpdateView, self).form_valid(form)
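The success messages throughout these views use named `%(…)s` placeholders rather than positional ones, which lets a translated string reorder the fragments freely. The substitution on its own:

```python
# named placeholders: the dict supplies values by key, so a translation
# can place %(name)s and %(obj)s in any order
template = 'The %(name)s "%(obj)s" was changed successfully.'
msg = template % {'name': 'enterprise', 'obj': 'ACME'}
assert msg == 'The enterprise "ACME" was changed successfully.'
```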
# region Solution OK
class SolutionUpdateActiveView(generic.View):
""" """
model = Solution
success_url = reverse_lazy('space:solution-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs['pk']
state = self.kwargs['state']
pk = SecurityKey.is_valid_key(request, key, 'solution_%s' % state)
if not pk:
return HttpResponseRedirect(self.success_url)
try:
self.object = self.model.objects.get(pk=pk)
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
msg = _('The %(name)s "%(obj)s" was %(action)s successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object),
'action': (_('reactivated') if state == 'rea' else _('inactivated'))
}
mse = _('The %(name)s "%(obj)s" is already %(action)s.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object),
'action': (_('active') if state == 'rea' else _('inactive'))
}
try:
if state == 'ina' and not self.object.is_active:
raise Exception(mse)
else:
if state == 'rea' and self.object.is_active:
raise Exception(mse)
else:
self.object.is_active = (True if state == 'rea' else False)
self.object.save()
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
class SolutionDeleteView(generic.edit.BaseDeleteView):
""" """
model = Solution
success_url = reverse_lazy('space:solution-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs['pk']
pk = SecurityKey.is_valid_key(request, key, 'solution_del')
if not pk:
return HttpResponseRedirect(self.success_url)
self.kwargs['pk'] = pk
try:
self.get_object()
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
return super(SolutionDeleteView, self).dispatch(request, *args, **kwargs)
def delete(self, request, *args, **kwargs):
try:
d = self.get_object()
            # check for dependent objects OK
deps, msg = get_dep_objects(d)
if deps:
messages.warning(self.request, _('Cannot delete %(name)s') % {
"name": capfirst(force_text(self.model._meta.verbose_name))
+ ' "' + force_text(d) + '"'
})
raise Exception(msg)
'''
            if d.module_set.count() > 0:
                raise Exception(
                    (u"Solution <b>%(name)s</b> has modules assigned.") % {"name": d.name})
            if d.association_set.count() > 0:
                raise Exception(
                    (u"Solution <b>%(name)s</b> is assigned to associations.") % {"name": d.name})
            if d.enterprise_set.count() > 0:
                raise Exception(
                    (u"Solution <b>%(name)s</b> is assigned to enterprises.") % {"name": d.name})
'''
d.delete()
msg = _('The %(name)s "%(obj)s" was deleted successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(d)
}
if not d.id:
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
except Exception, e:
messages.error(request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
def get(self, request, *args, **kwargs):
return self.delete(request, *args, **kwargs)
class SolutionUpdateView(generic.edit.UpdateView):
""" """
model = Solution
form_class = SolutionForm
success_url = reverse_lazy('space:solution-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
key = self.kwargs.get(self.pk_url_kwarg, None)
pk = SecurityKey.is_valid_key(request, key, 'solution_upd')
if not pk:
return HttpResponseRedirect(self.success_url)
self.kwargs['pk'] = pk
try:
self.get_object()
'''
ctx_dict = {'activation_key': 'eee',
'expiration_days': 2,
'site': 'localhost:8000'}
subject = render_to_string(
'registration/activation_email_subject.txt',
ctx_dict)
# Email subject *must not* contain newlines
subject = ''.join(subject.splitlines())
message = render_to_string(
'registration/activation_email.txt', ctx_dict)
send_mail(
subject, message, settings.DEFAULT_FROM_EMAIL,
['asullom@gmail.com'], fail_silently=False)
# send_mail(
# 'Subject here', 'Here is the message.', 'asullom@gmail.com',
# ['asullom@gmail.com'], fail_silently=False)
'''
except Exception, e:
messages.error(self.request, e)
log.warning(force_text(e), extra=log_params(self.request))
return HttpResponseRedirect(self.success_url)
return super(SolutionUpdateView, self).dispatch(request, *args, **kwargs)
def get_context_data(self, **kwargs):
context = super(SolutionUpdateView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'solution'
context['title'] = _('Change %s') % _('Solution')
return context
def form_valid(self, form):
self.object = form.save(commit=True)
msg = _('The %(name)s "%(obj)s" was changed successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object)
}
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
return super(SolutionUpdateView, self).form_valid(form)
class SolutionCreateView(generic.edit.CreateView):
""" """
model = Solution
form_class = SolutionForm
success_url = reverse_lazy('space:solution-list')
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
return super(SolutionCreateView, self).dispatch(request, *args, **kwargs)
def get_context_data(self, **kwargs):
context = super(SolutionCreateView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'solution'
context['now'] = datetime.datetime.now()
context['title'] = _('Add %s') % _('Solution')
return context
def form_valid(self, form):
self.object = form.save(commit=True)
msg = _('The %(name)s "%(obj)s" was added successfully.') % {
'name': capfirst(force_text(self.model._meta.verbose_name)),
'obj': force_text(self.object)
}
if self.object.id:
messages.success(self.request, msg)
log.warning(msg, extra=log_params(self.request))
return super(SolutionCreateView, self).form_valid(form)
class SolutionListView(generic.ListView):
""" """
model = Solution
paginate_by = settings.PER_PAGE
#@method_decorator(login_required)
@method_decorator(permission_resource_required)
def dispatch(self, request, *args, **kwargs):
return super(SolutionListView, self).dispatch(request, *args, **kwargs)
def get_paginate_by(self, queryset):
if 'all' in self.request.GET:  # request.REQUEST was removed in Django 1.9
return None
return generic.ListView.get_paginate_by(self, queryset)
def get_queryset(self):
self.o = empty(self.request, 'o', '-id')
self.f = empty(self.request, 'f', 'name')
self.q = empty(self.request, 'q', '')
column_contains = u'%s__%s' % (self.f, 'contains')
return self.model.objects.filter(**{column_contains: self.q}).order_by(self.o)
def get_context_data(self, **kwargs):
#messages.success(self.request, _(u'saé'))
context = super(SolutionListView, self).get_context_data(**kwargs)
context['opts'] = self.model._meta
context['cmi'] = 'solution'
context['title'] = _('Select %s to change') % _('Solution')
context['o'] = self.o
context['f'] = self.f
context['q'] = self.q.replace('/', '-')
return context
# automateVPCApplication/aws.py (ampcompany/multi-region-infra, MIT)
import boto3
from botocore.exceptions import ClientError
class Aws:
def __init__(self, access_key, secret_access_key):
self.ap_northeast_2_client = None
self.us_east_1_client = None
self.eu_west_1_client = None
self.dhcp_options_list = ['', '', '']
self.vpc_list = ['', '', '']
self.subnet_list = [
[['', ''], ['', ''], ['', '']], # ap-northeast-2
[['', ''], ['', ''], ['', '']], # us-east-1
[['', ''], ['', ''], ['', '']], # eu-west-1
]
self.internet_gateway_list = ['', '', '']
self.route_table_list = [
[[''], ['', ''], ['']], # ap-northeast-2
[[''], ['', ''], ['']], # us-east-1
[[''], ['', ''], ['']], # eu-west-1
]
self.network_acl_list = ['', '', '']
self.network_acl_association_id_list = [[], [], []]
self.elastic_ip_allocation_id_list = [
['', ''], # ap-northeast-2
['', ''], # us-east-1
['', ''], # eu-west-1
]
self.nat_gateway_id_list = [
['', ''], # ap-northeast-2
['', ''], # us-east-1
['', ''], # eu-west-1
]
self.security_group_bastion_id_list = ['', '', '']
self.security_group_elb_id_list = ['', '', '']
self.security_group_ec2_id_list = ['', '', '']
self.security_group_rds_id_list = ['', '', '']
self.access_key = access_key
self.secret_access_key = secret_access_key
def createClient(self):
try:
self.ap_northeast_2_client = boto3.client(
'ec2',
aws_access_key_id=self.access_key,
aws_secret_access_key=self.secret_access_key,
region_name='ap-northeast-2'
)
self.us_east_1_client = boto3.client(
'ec2',
aws_access_key_id=self.access_key,
aws_secret_access_key=self.secret_access_key,
region_name='us-east-1'
)
self.eu_west_1_client = boto3.client(
'ec2',
aws_access_key_id=self.access_key,
aws_secret_access_key=self.secret_access_key,
region_name='eu-west-1'
)
# Validate the credentials with a lightweight describe call in each region.
self.ap_northeast_2_client.describe_vpcs()
self.us_east_1_client.describe_vpcs()
self.eu_west_1_client.describe_vpcs()
except ClientError as err:
raise err
def createDhcpOptions(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_dhcp_options(
DhcpConfigurations=[
{
'Key': 'domain-name-servers',
'Values': [
'AmazonProvidedDNS'
]
},
{
'Key': 'domain-name',
'Values': [
'ap-northeast-2.compute.internal'
]
}
],
TagSpecifications=[
{
'ResourceType': 'dhcp-options',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DHCP'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_dhcp_options(
DhcpConfigurations=[
{
'Key': 'domain-name-servers',
'Values': [
'AmazonProvidedDNS'
]
},
{
'Key': 'domain-name',
'Values': [
'ec2.internal'
]
}
],
TagSpecifications=[
{
'ResourceType': 'dhcp-options',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DHCP'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_dhcp_options(
DhcpConfigurations=[
{
'Key': 'domain-name-servers',
'Values': [
'AmazonProvidedDNS'
]
},
{
'Key': 'domain-name',
'Values': [
'eu-west-1.compute.internal'
]
}
],
TagSpecifications=[
{
'ResourceType': 'dhcp-options',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DHCP'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.dhcp_options_list[0] = ap_northeast_2_response['DhcpOptions']['DhcpOptionsId']
self.dhcp_options_list[1] = us_east_1_response['DhcpOptions']['DhcpOptionsId']
self.dhcp_options_list[2] = eu_west_1_response['DhcpOptions']['DhcpOptionsId']
except ClientError as err:
raise err
def createVpc(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_vpc(
CidrBlock='10.10.0.0/16',
TagSpecifications=[
{
'ResourceType': 'vpc',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-VPC'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_vpc(
CidrBlock='10.20.0.0/16',
TagSpecifications=[
{
'ResourceType': 'vpc',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-VPC'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_vpc(
CidrBlock='10.30.0.0/16',
TagSpecifications=[
{
'ResourceType': 'vpc',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-VPC'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.vpc_list[0] = ap_northeast_2_response['Vpc']['VpcId']
self.vpc_list[1] = us_east_1_response['Vpc']['VpcId']
self.vpc_list[2] = eu_west_1_response['Vpc']['VpcId']
except ClientError as err:
raise err
def associateDhcpOptions(self):
try:
self.ap_northeast_2_client.associate_dhcp_options(
DhcpOptionsId=self.dhcp_options_list[0],
VpcId=self.vpc_list[0],
)
self.us_east_1_client.associate_dhcp_options(
DhcpOptionsId=self.dhcp_options_list[1],
VpcId=self.vpc_list[1],
)
self.eu_west_1_client.associate_dhcp_options(
DhcpOptionsId=self.dhcp_options_list[2],
VpcId=self.vpc_list[2],
)
except ClientError as err:
raise err
def createSubnetPublicA(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_subnet(
AvailabilityZone='ap-northeast-2a',
CidrBlock='10.10.1.0/24',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PublicSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_subnet(
AvailabilityZone='us-east-1a',
CidrBlock='10.20.1.0/24',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PublicSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_subnet(
AvailabilityZone='eu-west-1a',
CidrBlock='10.30.1.0/24',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PublicSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.subnet_list[0][0][0] = ap_northeast_2_response['Subnet']['SubnetId']
self.subnet_list[1][0][0] = us_east_1_response['Subnet']['SubnetId']
self.subnet_list[2][0][0] = eu_west_1_response['Subnet']['SubnetId']
except ClientError as err:
raise err
def createSubnetPublicB(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_subnet(
AvailabilityZone='ap-northeast-2b',
CidrBlock='10.10.2.0/24',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PublicSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_subnet(
AvailabilityZone='us-east-1b',
CidrBlock='10.20.2.0/24',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PublicSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_subnet(
AvailabilityZone='eu-west-1b',
CidrBlock='10.30.2.0/24',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PublicSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.subnet_list[0][0][1] = ap_northeast_2_response['Subnet']['SubnetId']
self.subnet_list[1][0][1] = us_east_1_response['Subnet']['SubnetId']
self.subnet_list[2][0][1] = eu_west_1_response['Subnet']['SubnetId']
except ClientError as err:
raise err
def createSubnetPrivateA(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_subnet(
AvailabilityZone='ap-northeast-2a',
CidrBlock='10.10.11.0/24',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PrivateSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_subnet(
AvailabilityZone='us-east-1a',
CidrBlock='10.20.11.0/24',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PrivateSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_subnet(
AvailabilityZone='eu-west-1a',
CidrBlock='10.30.11.0/24',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PrivateSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.subnet_list[0][1][0] = ap_northeast_2_response['Subnet']['SubnetId']
self.subnet_list[1][1][0] = us_east_1_response['Subnet']['SubnetId']
self.subnet_list[2][1][0] = eu_west_1_response['Subnet']['SubnetId']
except ClientError as err:
raise err
def createSubnetPrivateB(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_subnet(
AvailabilityZone='ap-northeast-2b',
CidrBlock='10.10.12.0/24',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PrivateSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_subnet(
AvailabilityZone='us-east-1b',
CidrBlock='10.20.12.0/24',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PrivateSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_subnet(
AvailabilityZone='eu-west-1b',
CidrBlock='10.30.12.0/24',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-PrivateSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.subnet_list[0][1][1] = ap_northeast_2_response['Subnet']['SubnetId']
self.subnet_list[1][1][1] = us_east_1_response['Subnet']['SubnetId']
self.subnet_list[2][1][1] = eu_west_1_response['Subnet']['SubnetId']
except ClientError as err:
raise err
def createSubnetDatabaseA(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_subnet(
AvailabilityZone='ap-northeast-2a',
CidrBlock='10.10.21.0/24',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DatabaseSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_subnet(
AvailabilityZone='us-east-1a',
CidrBlock='10.20.21.0/24',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DatabaseSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_subnet(
AvailabilityZone='eu-west-1a',
CidrBlock='10.30.21.0/24',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DatabaseSubnetA'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.subnet_list[0][2][0] = ap_northeast_2_response['Subnet']['SubnetId']
self.subnet_list[1][2][0] = us_east_1_response['Subnet']['SubnetId']
self.subnet_list[2][2][0] = eu_west_1_response['Subnet']['SubnetId']
except ClientError as err:
raise err
def createSubnetDatabaseB(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_subnet(
AvailabilityZone='ap-northeast-2b',
CidrBlock='10.10.22.0/24',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DatabaseSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_subnet(
AvailabilityZone='us-east-1b',
CidrBlock='10.20.22.0/24',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DatabaseSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_subnet(
AvailabilityZone='eu-west-1b',
CidrBlock='10.30.22.0/24',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'subnet',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-DatabaseSubnetB'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.subnet_list[0][2][1] = ap_northeast_2_response['Subnet']['SubnetId']
self.subnet_list[1][2][1] = us_east_1_response['Subnet']['SubnetId']
self.subnet_list[2][2][1] = eu_west_1_response['Subnet']['SubnetId']
except ClientError as err:
raise err
def createInternetGateway(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_internet_gateway(
TagSpecifications=[
{
'ResourceType': 'internet-gateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-IGW'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_internet_gateway(
TagSpecifications=[
{
'ResourceType': 'internet-gateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-IGW'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_internet_gateway(
TagSpecifications=[
{
'ResourceType': 'internet-gateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-IGW'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.internet_gateway_list[0] = ap_northeast_2_response['InternetGateway']['InternetGatewayId']
self.internet_gateway_list[1] = us_east_1_response['InternetGateway']['InternetGatewayId']
self.internet_gateway_list[2] = eu_west_1_response['InternetGateway']['InternetGatewayId']
except ClientError as err:
raise err
def attachInternetGateway(self):
try:
self.ap_northeast_2_client.attach_internet_gateway(
InternetGatewayId=self.internet_gateway_list[0],
VpcId=self.vpc_list[0],
)
self.us_east_1_client.attach_internet_gateway(
InternetGatewayId=self.internet_gateway_list[1],
VpcId=self.vpc_list[1],
)
self.eu_west_1_client.attach_internet_gateway(
InternetGatewayId=self.internet_gateway_list[2],
VpcId=self.vpc_list[2],
)
except ClientError as err:
raise err
def createRouteTablePublic(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_route_table(
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Public-Route'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_route_table(
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Public-Route'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_route_table(
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Public-Route'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.route_table_list[0][0][0] = ap_northeast_2_response['RouteTable']['RouteTableId']
self.route_table_list[1][0][0] = us_east_1_response['RouteTable']['RouteTableId']
self.route_table_list[2][0][0] = eu_west_1_response['RouteTable']['RouteTableId']
except ClientError as err:
raise err
def createRouteTablePrivateA(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_route_table(
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Private-Route-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_route_table(
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Private-Route-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_route_table(
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Private-Route-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.route_table_list[0][1][0] = ap_northeast_2_response['RouteTable']['RouteTableId']
self.route_table_list[1][1][0] = us_east_1_response['RouteTable']['RouteTableId']
self.route_table_list[2][1][0] = eu_west_1_response['RouteTable']['RouteTableId']
except ClientError as err:
raise err
def createRouteTablePrivateB(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_route_table(
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Private-Route-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_route_table(
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Private-Route-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_route_table(
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Private-Route-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.route_table_list[0][1][1] = ap_northeast_2_response['RouteTable']['RouteTableId']
self.route_table_list[1][1][1] = us_east_1_response['RouteTable']['RouteTableId']
self.route_table_list[2][1][1] = eu_west_1_response['RouteTable']['RouteTableId']
except ClientError as err:
raise err
def createRouteTableDatabase(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_route_table(
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Database-Route'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_route_table(
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Database-Route'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_route_table(
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'route-table',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Database-Route'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.route_table_list[0][2][0] = ap_northeast_2_response['RouteTable']['RouteTableId']
self.route_table_list[1][2][0] = us_east_1_response['RouteTable']['RouteTableId']
self.route_table_list[2][2][0] = eu_west_1_response['RouteTable']['RouteTableId']
except ClientError as err:
raise err
def associateRouteTablePublicSubnetAPublicRoute(self):
try:
self.ap_northeast_2_client.associate_route_table(
RouteTableId=self.route_table_list[0][0][0],
SubnetId=self.subnet_list[0][0][0],
)
self.us_east_1_client.associate_route_table(
RouteTableId=self.route_table_list[1][0][0],
SubnetId=self.subnet_list[1][0][0],
)
self.eu_west_1_client.associate_route_table(
RouteTableId=self.route_table_list[2][0][0],
SubnetId=self.subnet_list[2][0][0],
)
except ClientError as err:
raise err
def associateRouteTablePublicSubnetBPublicRoute(self):
try:
self.ap_northeast_2_client.associate_route_table(
RouteTableId=self.route_table_list[0][0][0],
SubnetId=self.subnet_list[0][0][1],
)
self.us_east_1_client.associate_route_table(
RouteTableId=self.route_table_list[1][0][0],
SubnetId=self.subnet_list[1][0][1],
)
self.eu_west_1_client.associate_route_table(
RouteTableId=self.route_table_list[2][0][0],
SubnetId=self.subnet_list[2][0][1],
)
except ClientError as err:
raise err
def associateRouteTablePrivateSubnetAPrivateRouteA(self):
try:
self.ap_northeast_2_client.associate_route_table(
RouteTableId=self.route_table_list[0][1][0],
SubnetId=self.subnet_list[0][1][0],
)
self.us_east_1_client.associate_route_table(
RouteTableId=self.route_table_list[1][1][0],
SubnetId=self.subnet_list[1][1][0],
)
self.eu_west_1_client.associate_route_table(
RouteTableId=self.route_table_list[2][1][0],
SubnetId=self.subnet_list[2][1][0],
)
except ClientError as err:
raise err
def associateRouteTablePrivateSubnetBPrivateRouteB(self):
try:
self.ap_northeast_2_client.associate_route_table(
RouteTableId=self.route_table_list[0][1][1],
SubnetId=self.subnet_list[0][1][1],
)
self.us_east_1_client.associate_route_table(
RouteTableId=self.route_table_list[1][1][1],
SubnetId=self.subnet_list[1][1][1],
)
self.eu_west_1_client.associate_route_table(
RouteTableId=self.route_table_list[2][1][1],
SubnetId=self.subnet_list[2][1][1],
)
except ClientError as err:
raise err
def associateRouteTableDatabaseSubnetADatabaseRoute(self):
try:
self.ap_northeast_2_client.associate_route_table(
RouteTableId=self.route_table_list[0][2][0],
SubnetId=self.subnet_list[0][2][0],
)
self.us_east_1_client.associate_route_table(
RouteTableId=self.route_table_list[1][2][0],
SubnetId=self.subnet_list[1][2][0],
)
self.eu_west_1_client.associate_route_table(
RouteTableId=self.route_table_list[2][2][0],
SubnetId=self.subnet_list[2][2][0],
)
except ClientError as err:
raise err
def associateRouteTableDatabaseSubnetBDatabaseRoute(self):
try:
self.ap_northeast_2_client.associate_route_table(
RouteTableId=self.route_table_list[0][2][1],
SubnetId=self.subnet_list[0][2][1],
)
self.us_east_1_client.associate_route_table(
RouteTableId=self.route_table_list[1][2][1],
SubnetId=self.subnet_list[1][2][1],
)
self.eu_west_1_client.associate_route_table(
RouteTableId=self.route_table_list[2][2][1],
SubnetId=self.subnet_list[2][2][1],
)
except ClientError as err:
raise err
def createRouteIGW(self):
try:
self.ap_northeast_2_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
GatewayId=self.internet_gateway_list[0],
RouteTableId=self.route_table_list[0][0][0]
)
self.us_east_1_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
GatewayId=self.internet_gateway_list[1],
RouteTableId=self.route_table_list[1][0][0]
)
self.eu_west_1_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
GatewayId=self.internet_gateway_list[2],
RouteTableId=self.route_table_list[2][0][0]
)
except ClientError as err:
raise err
def createNetworkAcl(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_network_acl(
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'network-acl',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NACL'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_network_acl(
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'network-acl',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NACL'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_network_acl(
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'network-acl',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NACL'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.network_acl_list[0] = ap_northeast_2_response['NetworkAcl']['NetworkAclId']
self.network_acl_list[1] = us_east_1_response['NetworkAcl']['NetworkAclId']
self.network_acl_list[2] = eu_west_1_response['NetworkAcl']['NetworkAclId']
self.network_acl_association_id_list[0] = [
associationId['NetworkAclAssociationId'] for associationId in
ap_northeast_2_response['NetworkAcl']['Associations']
]
self.network_acl_association_id_list[1] = [
associationId['NetworkAclAssociationId'] for associationId in
us_east_1_response['NetworkAcl']['Associations']
]
self.network_acl_association_id_list[2] = [
associationId['NetworkAclAssociationId'] for associationId in
eu_west_1_response['NetworkAcl']['Associations']
]
except ClientError as err:
raise err
def createNetworkAclEntryInboundAllow22(self):
try:
self.ap_northeast_2_client.create_network_acl_entry(
CidrBlock='10.0.0.0/8',
Egress=False,
NetworkAclId=self.network_acl_list[0],
PortRange={
'From': 22,
'To': 22
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=100
)
self.us_east_1_client.create_network_acl_entry(
CidrBlock='10.0.0.0/8',
Egress=False,
NetworkAclId=self.network_acl_list[1],
PortRange={
'From': 22,
'To': 22
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=100
)
self.eu_west_1_client.create_network_acl_entry(
CidrBlock='10.0.0.0/8',
Egress=False,
NetworkAclId=self.network_acl_list[2],
PortRange={
'From': 22,
'To': 22
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=100
)
except ClientError as err:
raise err
def createNetworkAclEntryInboundAllow80(self):
try:
self.ap_northeast_2_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[0],
PortRange={
'From': 80,
'To': 80
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=200
)
self.us_east_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[1],
PortRange={
'From': 80,
'To': 80
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=200
)
self.eu_west_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[2],
PortRange={
'From': 80,
'To': 80
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=200
)
except ClientError as err:
raise err
def createNetworkAclEntryInboundAllow443(self):
try:
self.ap_northeast_2_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[0],
PortRange={
'From': 443,
'To': 443
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=300
)
self.us_east_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[1],
PortRange={
'From': 443,
'To': 443
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=300
)
self.eu_west_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[2],
PortRange={
'From': 443,
'To': 443
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=300
)
except ClientError as err:
raise err
def createNetworkAclEntryInboundAllow5234(self):
try:
self.ap_northeast_2_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[0],
PortRange={
'From': 5234,
'To': 5234
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=400
)
self.us_east_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[1],
PortRange={
'From': 5234,
'To': 5234
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=400
)
self.eu_west_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[2],
PortRange={
'From': 5234,
'To': 5234
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=400
)
except ClientError as err:
raise err
def createNetworkAclEntryInboundAllow20222(self):
try:
self.ap_northeast_2_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[0],
PortRange={
'From': 20222,
'To': 20222
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=500
)
self.us_east_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[1],
PortRange={
'From': 20222,
'To': 20222
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=500
)
self.eu_west_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[2],
PortRange={
'From': 20222,
'To': 20222
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=500
)
except ClientError as err:
raise err
def createNetworkAclEntryInboundAllow1024To65535(self):
try:
self.ap_northeast_2_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[0],
PortRange={
'From': 1024,
'To': 65535
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=600
)
self.us_east_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[1],
PortRange={
'From': 1024,
'To': 65535
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=600
)
self.eu_west_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=False,
NetworkAclId=self.network_acl_list[2],
PortRange={
'From': 1024,
'To': 65535
},
Protocol='6', # TCP
RuleAction='allow',
RuleNumber=600
)
except ClientError as err:
raise err
def createNetworkAclEntryOutboundAllowAllTraffic(self):
try:
self.ap_northeast_2_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=True,
NetworkAclId=self.network_acl_list[0],
Protocol='-1', # All Traffic
RuleAction='allow',
RuleNumber=100
)
self.us_east_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=True,
NetworkAclId=self.network_acl_list[1],
Protocol='-1', # All Traffic
RuleAction='allow',
RuleNumber=100
)
self.eu_west_1_client.create_network_acl_entry(
CidrBlock='0.0.0.0/0',
Egress=True,
NetworkAclId=self.network_acl_list[2],
Protocol='-1', # All Traffic
RuleAction='allow',
RuleNumber=100
)
except ClientError as err:
raise err
def replaceNetworkAclAssociationSubnets(self):
try:
for i in self.network_acl_association_id_list[0]:
self.ap_northeast_2_client.replace_network_acl_association(
AssociationId=i,
NetworkAclId=self.network_acl_list[0]
)
for i in self.network_acl_association_id_list[1]:
self.us_east_1_client.replace_network_acl_association(
AssociationId=i,
NetworkAclId=self.network_acl_list[1]
)
for i in self.network_acl_association_id_list[2]:
self.eu_west_1_client.replace_network_acl_association(
AssociationId=i,
NetworkAclId=self.network_acl_list[2]
)
except ClientError as err:
raise err
def allocateAddressEipA(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.allocate_address(
TagSpecifications=[
{
'ResourceType': 'elastic-ip',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EIP-NAT-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.allocate_address(
TagSpecifications=[
{
'ResourceType': 'elastic-ip',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EIP-NAT-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.allocate_address(
TagSpecifications=[
{
'ResourceType': 'elastic-ip',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EIP-NAT-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.elastic_ip_allocation_id_list[0][0] = ap_northeast_2_response['AllocationId']
self.elastic_ip_allocation_id_list[1][0] = us_east_1_response['AllocationId']
self.elastic_ip_allocation_id_list[2][0] = eu_west_1_response['AllocationId']
except ClientError as err:
raise err
def allocateAddressEipB(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.allocate_address(
TagSpecifications=[
{
'ResourceType': 'elastic-ip',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EIP-NAT-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.allocate_address(
TagSpecifications=[
{
'ResourceType': 'elastic-ip',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EIP-NAT-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.allocate_address(
TagSpecifications=[
{
'ResourceType': 'elastic-ip',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EIP-NAT-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.elastic_ip_allocation_id_list[0][1] = ap_northeast_2_response['AllocationId']
self.elastic_ip_allocation_id_list[1][1] = us_east_1_response['AllocationId']
self.elastic_ip_allocation_id_list[2][1] = eu_west_1_response['AllocationId']
except ClientError as err:
raise err
def createNatGatewayA(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_nat_gateway(
AllocationId=self.elastic_ip_allocation_id_list[0][0],
SubnetId=self.subnet_list[0][0][0],
TagSpecifications=[
{
'ResourceType': 'natgateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NAT-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_nat_gateway(
AllocationId=self.elastic_ip_allocation_id_list[1][0],
SubnetId=self.subnet_list[0][1][0],
TagSpecifications=[
{
'ResourceType': 'natgateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NAT-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_nat_gateway(
AllocationId=self.elastic_ip_allocation_id_list[2][0],
SubnetId=self.subnet_list[0][2][0],
TagSpecifications=[
{
'ResourceType': 'natgateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NAT-A'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.nat_gateway_id_list[0][0] = ap_northeast_2_response['NatGateway']['NatGatewayId']
self.nat_gateway_id_list[1][0] = us_east_1_response['NatGateway']['NatGatewayId']
self.nat_gateway_id_list[2][0] = eu_west_1_response['NatGateway']['NatGatewayId']
except ClientError as err:
raise err
def createNatGatewayB(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_nat_gateway(
AllocationId=self.elastic_ip_allocation_id_list[0][1],
SubnetId=self.subnet_list[0][0][1],
TagSpecifications=[
{
'ResourceType': 'natgateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NAT-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_nat_gateway(
AllocationId=self.elastic_ip_allocation_id_list[1][1],
SubnetId=self.subnet_list[0][1][1],
TagSpecifications=[
{
'ResourceType': 'natgateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NAT-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_nat_gateway(
AllocationId=self.elastic_ip_allocation_id_list[2][1],
SubnetId=self.subnet_list[0][2][1],
TagSpecifications=[
{
'ResourceType': 'natgateway',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-NAT-B'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.nat_gateway_id_list[0][1] = ap_northeast_2_response['NatGateway']['NatGatewayId']
self.nat_gateway_id_list[1][1] = us_east_1_response['NatGateway']['NatGatewayId']
self.nat_gateway_id_list[2][1] = eu_west_1_response['NatGateway']['NatGatewayId']
except ClientError as err:
raise err
def describeNatGateways(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.describe_nat_gateways(
NatGatewayIds=self.nat_gateway_id_list[0]
)
us_east_1_response = self.us_east_1_client.describe_nat_gateways(
NatGatewayIds=self.nat_gateway_id_list[1]
)
eu_west_1_response = self.eu_west_1_client.describe_nat_gateways(
NatGatewayIds=self.nat_gateway_id_list[2]
)
nat_gateways_states = {
region: [
{
'State': nat_gateway['State'],
# 'FailureMessage' is only present when a NAT gateway has failed
'FailureMessage': nat_gateway.get('FailureMessage', ''),
# The trailing letter of the Name tag (e.g. 'A' or 'B') identifies
# the AZ; slicing keeps this safe when the Name tag is missing
'AvailabilityZone': next(
(tag['Value'] for tag in nat_gateway['Tags'] if tag['Key'] == 'Name'),
''
)[-1:]
} for nat_gateway in response['NatGateways']
] for region, response in (
('ap-northeast-2', ap_northeast_2_response),
('us-east-1', us_east_1_response),
('eu-west-1', eu_west_1_response)
)
}
return nat_gateways_states
except ClientError as err:
raise err
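NAT gateways take several minutes to leave the pending state, and creating routes against a gateway that is not yet available fails. boto3 ships a nat_gateway_available waiter for exactly this; as a generic alternative, a minimal polling sketch (names and timings are illustrative):

```python
import time

def wait_until(check, attempts=40, delay=15, sleep=time.sleep):
    """Poll check() until it returns truthy; give up after `attempts` tries."""
    for _ in range(attempts):
        if check():
            return True
        sleep(delay)
    return False
```

In this class it would wrap a describe_nat_gateways call that returns True once every gateway reports State == 'available'.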
def createRouteNatA(self):
try:
self.ap_northeast_2_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
NatGatewayId=self.nat_gateway_id_list[0][0],
RouteTableId=self.route_table_list[0][1][0]
)
self.us_east_1_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
NatGatewayId=self.nat_gateway_id_list[1][0],
RouteTableId=self.route_table_list[1][1][0]
)
self.eu_west_1_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
NatGatewayId=self.nat_gateway_id_list[2][0],
RouteTableId=self.route_table_list[2][1][0]
)
except ClientError as err:
raise err
def createRouteNatB(self):
try:
self.ap_northeast_2_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
NatGatewayId=self.nat_gateway_id_list[0][1],
RouteTableId=self.route_table_list[0][1][1]
)
self.us_east_1_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
NatGatewayId=self.nat_gateway_id_list[1][1],
RouteTableId=self.route_table_list[1][1][1]
)
self.eu_west_1_client.create_route(
DestinationCidrBlock='0.0.0.0/0',
NatGatewayId=self.nat_gateway_id_list[2][1],
RouteTableId=self.route_table_list[2][1][1]
)
except ClientError as err:
raise err
def createSecurityGroupBastion(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_security_group(
Description='Security Group for Bastion EC2 Server.',
GroupName='MR-Bastion-SG',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_security_group(
Description='Security Group for Bastion EC2 Server.',
GroupName='MR-Bastion-SG',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_security_group(
Description='Security Group for Bastion EC2 Server.',
GroupName='MR-Bastion-SG',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.security_group_bastion_id_list[0] = ap_northeast_2_response['GroupId']
self.security_group_bastion_id_list[1] = us_east_1_response['GroupId']
self.security_group_bastion_id_list[2] = eu_west_1_response['GroupId']
except ClientError as err:
raise err
def createSecurityGroupELB(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_security_group(
Description='Security Group for Elastic Load Balancer.',
GroupName='MR-ELB-SG',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_security_group(
Description='Security Group for Elastic Load Balancer.',
GroupName='MR-ELB-SG',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_security_group(
Description='Security Group for Elastic Load Balancer.',
GroupName='MR-ELB-SG',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.security_group_elb_id_list[0] = ap_northeast_2_response['GroupId']
self.security_group_elb_id_list[1] = us_east_1_response['GroupId']
self.security_group_elb_id_list[2] = eu_west_1_response['GroupId']
except ClientError as err:
raise err
def createSecurityGroupEC2(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_security_group(
Description='Security Group for WAS EC2 Server.',
GroupName='MR-EC2-SG',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_security_group(
Description='Security Group for WAS EC2 Server.',
GroupName='MR-EC2-SG',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_security_group(
Description='Security Group for WAS EC2 Server.',
GroupName='MR-EC2-SG',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.security_group_ec2_id_list[0] = ap_northeast_2_response['GroupId']
self.security_group_ec2_id_list[1] = us_east_1_response['GroupId']
self.security_group_ec2_id_list[2] = eu_west_1_response['GroupId']
except ClientError as err:
raise err
def createSecurityGroupRDS(self):
try:
ap_northeast_2_response = self.ap_northeast_2_client.create_security_group(
Description='Security Group for Database RDS Server.',
GroupName='MR-RDS-SG',
VpcId=self.vpc_list[0],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-RDS-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
us_east_1_response = self.us_east_1_client.create_security_group(
Description='Security Group for Database RDS Server.',
GroupName='MR-RDS-SG',
VpcId=self.vpc_list[1],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-RDS-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
eu_west_1_response = self.eu_west_1_client.create_security_group(
Description='Security Group for Database RDS Server.',
GroupName='MR-RDS-SG',
VpcId=self.vpc_list[2],
TagSpecifications=[
{
'ResourceType': 'security-group',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-RDS-SG'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.security_group_rds_id_list[0] = ap_northeast_2_response['GroupId']
self.security_group_rds_id_list[1] = us_east_1_response['GroupId']
self.security_group_rds_id_list[2] = eu_west_1_response['GroupId']
except ClientError as err:
raise err
def authorizeSecurityGroupIngressBastionSgFrom20222(self):
try:
self.ap_northeast_2_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=20222,
GroupId=self.security_group_bastion_id_list[0],
IpProtocol='tcp',
ToPort=20222,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG-Inbound-Rule-Allow-20222'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.us_east_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=20222,
GroupId=self.security_group_bastion_id_list[1],
IpProtocol='tcp',
ToPort=20222,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG-Inbound-Rule-Allow-20222'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.eu_west_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=20222,
GroupId=self.security_group_bastion_id_list[2],
IpProtocol='tcp',
ToPort=20222,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG-Inbound-Rule-Allow-20222'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
except ClientError as err:
raise err
def authorizeSecurityGroupEgressBastionSgToAllTraffic(self):
try:
self.ap_northeast_2_client.authorize_security_group_egress(
CidrIp='0.0.0.0/0',
GroupId=self.security_group_bastion_id_list[0],
IpProtocol='-1',
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG-Outbound-Rule-Allow-All-Traffic'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.us_east_1_client.authorize_security_group_egress(
CidrIp='0.0.0.0/0',
GroupId=self.security_group_bastion_id_list[1],
IpProtocol='-1',
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG-Outbound-Rule-Allow-All-Traffic'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.eu_west_1_client.authorize_security_group_egress(
CidrIp='0.0.0.0/0',
GroupId=self.security_group_bastion_id_list[2],
IpProtocol='-1',
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-Bastion-SG-Outbound-Rule-Allow-All-Traffic'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
except ClientError as err:
raise err
def authorizeSecurityGroupIngressElbSgFrom80(self):
try:
self.ap_northeast_2_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=80,
GroupId=self.security_group_elb_id_list[0],
IpProtocol='tcp',
ToPort=80,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Inbound-Rule-Allow-80'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.us_east_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=80,
GroupId=self.security_group_elb_id_list[1],
IpProtocol='tcp',
ToPort=80,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Inbound-Rule-Allow-80'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.eu_west_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=80,
GroupId=self.security_group_elb_id_list[2],
IpProtocol='tcp',
ToPort=80,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Inbound-Rule-Allow-80'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
except ClientError as err:
raise err
def authorizeSecurityGroupIngressElbSgFrom443(self):
try:
self.ap_northeast_2_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=443,
GroupId=self.security_group_elb_id_list[0],
IpProtocol='tcp',
ToPort=443,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Inbound-Rule-Allow-443'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.us_east_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=443,
GroupId=self.security_group_elb_id_list[1],
IpProtocol='tcp',
ToPort=443,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Inbound-Rule-Allow-443'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.eu_west_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=443,
GroupId=self.security_group_elb_id_list[2],
IpProtocol='tcp',
ToPort=443,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Inbound-Rule-Allow-443'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
except ClientError as err:
raise err
def authorizeSecurityGroupEgressElbSgToAllTraffic(self):
try:
self.ap_northeast_2_client.authorize_security_group_egress(
CidrIp='0.0.0.0/0',
GroupId=self.security_group_elb_id_list[0],
IpProtocol='-1',
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Outbound-Rule-Allow-All-Traffic'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.us_east_1_client.authorize_security_group_egress(
CidrIp='0.0.0.0/0',
GroupId=self.security_group_elb_id_list[1],
IpProtocol='-1',
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Outbound-Rule-Allow-All-Traffic'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.eu_west_1_client.authorize_security_group_egress(
CidrIp='0.0.0.0/0',
GroupId=self.security_group_elb_id_list[2],
IpProtocol='-1',
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-ELB-SG-Outbound-Rule-Allow-All-Traffic'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
except ClientError as err:
raise err
def authorizeSecurityGroupIngressEc2SgFrom22(self):
# TODO: Add BastionSG Source Security Group
try:
self.ap_northeast_2_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=22,
GroupId=self.security_group_ec2_id_list[0],
IpProtocol='tcp',
ToPort=22,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG-Inbound-Rule-Allow-22'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.us_east_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=22,
GroupId=self.security_group_ec2_id_list[1],
IpProtocol='tcp',
ToPort=22,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG-Inbound-Rule-Allow-22'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.eu_west_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=22,
GroupId=self.security_group_ec2_id_list[2],
IpProtocol='tcp',
ToPort=22,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG-Inbound-Rule-Allow-22'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
except ClientError as err:
raise err
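The TODO above asks for the bastion security group as the rule's source instead of 0.0.0.0/0. With boto3 that is expressed through IpPermissions / UserIdGroupPairs rather than a top-level CidrIp; a hedged sketch of just the request payload (the helper name and group IDs are placeholders):

```python
def ssh_from_bastion_params(ec2_sg_id, bastion_sg_id):
    """Build authorize_security_group_ingress kwargs for SSH from the bastion SG."""
    return {
        'GroupId': ec2_sg_id,
        'IpPermissions': [
            {
                'IpProtocol': 'tcp',
                'FromPort': 22,
                'ToPort': 22,
                # Source is another security group, not a CIDR block
                'UserIdGroupPairs': [{'GroupId': bastion_sg_id}]
            }
        ]
    }
```

Each regional client would then call client.authorize_security_group_ingress(**ssh_from_bastion_params(...)) with that region's pair of group IDs.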
def authorizeSecurityGroupIngressEc2SgFrom80(self):
try:
self.ap_northeast_2_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=80,
GroupId=self.security_group_ec2_id_list[0],
IpProtocol='tcp',
ToPort=80,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG-Inbound-Rule-Allow-80'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.us_east_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=80,
GroupId=self.security_group_ec2_id_list[1],
IpProtocol='tcp',
ToPort=80,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG-Inbound-Rule-Allow-80'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
self.eu_west_1_client.authorize_security_group_ingress(
CidrIp='0.0.0.0/0',
FromPort=80,
GroupId=self.security_group_ec2_id_list[2],
IpProtocol='tcp',
ToPort=80,
TagSpecifications=[
{
'ResourceType': 'security-group-rule',
'Tags': [
{
'Key': 'Name',
'Value': 'MR-EC2-SG-Inbound-Rule-Allow-80'
},
{
'Key': 'Project',
'Value': 'Multi-Region'
}
]
}
]
)
except ClientError as err:
raise err
# TODO: Add other security groups
# def authorizeSecurityGroup
def deleteDhcpOptions(self):
self.ap_northeast_2_client.delete_dhcp_options(
DhcpOptionsId=self.dhcp_options_list[0]
)
self.us_east_1_client.delete_dhcp_options(
DhcpOptionsId=self.dhcp_options_list[1]
)
self.eu_west_1_client.delete_dhcp_options(
DhcpOptionsId=self.dhcp_options_list[2]
)
def deleteVpc(self):
self.ap_northeast_2_client.delete_vpc(
VpcId=self.vpc_list[0]
)
self.us_east_1_client.delete_vpc(
VpcId=self.vpc_list[1]
)
self.eu_west_1_client.delete_vpc(
VpcId=self.vpc_list[2]
)
def deleteAll(self):
self.deleteVpc()
self.deleteDhcpOptions()
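Throughout the class, resource IDs live in parallel lists indexed first by region (0: ap-northeast-2, 1: us-east-1, 2: eu-west-1) and, for per-AZ resources such as EIPs and NAT gateways, then by availability zone (0: A, 1: B). A small sketch of that bookkeeping convention (names here are illustrative), including the list-construction pitfall it must avoid:

```python
REGIONS = ('ap-northeast-2', 'us-east-1', 'eu-west-1')
AZS_PER_REGION = 2

def empty_id_table():
    """One independent row per region, one slot per AZ.

    Note: [[None] * AZS_PER_REGION] * len(REGIONS) would alias the same
    inner list three times, so writing table[0][0] would clobber every row.
    """
    return [[None] * AZS_PER_REGION for _ in REGIONS]
```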
# test_eng.py
# Copyright (c) 2013-2016 Pablo Acosta-Serafini
# See LICENSE for details
# pylint: disable=C0103,C0111,C0302,E0611,R0913,R0915,W0108,W0212
# Standard library imports
import functools
import sys
import pytest
from numpy import array, ndarray
# Putil imports
import putil.eng
from putil.test import AE, AI, CS
###
# Global variables
###
DFLT = 'def'
PY2 = bool(sys.hexversion < 0x03000000)
###
# Helper functions
###
isdflt = lambda obj: bool(obj == DFLT)
h = lambda num: '100.'+('0'*num)
o = lambda num: '1.'+('0'*num)
pv = lambda py2arg, py3arg: py2arg if PY2 else py3arg
sarg = lambda msg: 'Argument `{0}` is not valid'.format(msg)
t = lambda num: '10.'+('0'*num)
def to_sci_string(number):
"""
Returns a string with the number formatted in scientific notation. This
function does not have all the configurability of the public function
to_scientific_string, it is a convenience function to test _to_eng_tuple
"""
mant, exp = putil.eng._to_eng_tuple(number)
return '{mant}E{exp_sign}{exp}'.format(
mant=mant, exp_sign='-' if exp < 0 else '+', exp=abs(exp)
)
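The behavior the table below exercises can be reproduced by a small stand-alone sketch (an illustrative reimplementation, not putil's actual _to_eng_tuple): the exponent is the largest multiple of 3 not exceeding the number's order of magnitude, leaving a mantissa in [1, 1000).

```python
import math

def eng_sci_string(number):
    """Format a number with a power-of-ten exponent that is a multiple of 3."""
    number = float(number)
    if number == 0:
        return '0E+0'
    exp = 3 * int(math.floor(math.log10(abs(number)) / 3.0))
    mant = number / (10.0 ** exp)
    # %.12g trims trailing zeros, matching the stripped mantissas in the table
    return '{0:.12g}E{1}{2}'.format(mant, '-' if exp < 0 else '+', abs(exp))
```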
###
# Test functions
###
@pytest.mark.parametrize(
'text, sep, num, lstrip, rstrip, ref', [
('a, b, c, d', ',', 1, DFLT, DFLT, ('a', ' b', ' c', ' d')),
('a , b , c , d ', ',', 1, DFLT, DFLT, ('a ', ' b ', ' c ', ' d ')),
('a , b , c , d ', ',', 1, True, DFLT, ('a ', 'b ', 'c ', 'd ')),
('a , b , c , d ', ',', 1, DFLT, True, ('a', ' b', ' c', ' d')),
('a , b , c , d ', ',', 1, True, True, ('a', 'b', 'c', 'd')),
('a, b, c, d', ',', 2, DFLT, DFLT, ('a, b', ' c, d')),
('a, b, c, d', ',', 3, DFLT, DFLT, ('a, b, c', ' d')),
('a, b, c, d', ',', 4, DFLT, DFLT, ('a, b, c, d',)),
('a, b, c, d', ',', 5, DFLT, DFLT, ('a, b, c, d',)),
]
)
def test_split_every(text, sep, num, lstrip, rstrip, ref):
""" Test _split_every function behavior """
# DFLT in lstrip or rstrip means default argument values should be used
obj = putil.eng._split_every
obj = obj if isdflt(lstrip) else functools.partial(obj, lstrip=lstrip)
obj = obj if isdflt(rstrip) else functools.partial(obj, rstrip=rstrip)
assert obj(text, sep, num) == ref
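The table above pins down the contract of putil.eng._split_every: split on the separator, regroup every num tokens, and optionally strip each group. A hedged equivalent (an illustrative reimplementation, not the library's code):

```python
def split_every(text, sep, count, lstrip=False, rstrip=False):
    """Split `text` on `sep`, then rejoin the tokens in groups of `count`."""
    tokens = text.split(sep)
    groups = [
        sep.join(tokens[idx:idx + count])
        for idx in range(0, len(tokens), count)
    ]
    if lstrip:
        groups = [group.lstrip() for group in groups]
    if rstrip:
        groups = [group.rstrip() for group in groups]
    return tuple(groups)
```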
@pytest.mark.parametrize(
'num, ref', [
(0.000000000000000000000001001234567890, '1.00123456789E-24'),
(0.000000000000000000000001, '1E-24'),
(0.00000000000000000000001001234567890, '10.0123456789E-24'),
(0.00000000000000000000001, '10E-24'),
(0.0000000000000000000001001234567890, '100.123456789E-24'),
(0.0000000000000000000001, '100E-24'),
(0.000000000000000000001001234567890, '1.00123456789E-21'),
(0.000000000000000000001, '1E-21'),
(0.00000000000000000001001234567890, '10.0123456789E-21'),
(0.00000000000000000001, '10E-21'),
(0.0000000000000000001001234567890, '100.123456789E-21'),
(0.0000000000000000001, '100E-21'),
(0.000000000000000001001234567890, '1.00123456789E-18'),
(0.000000000000000001, '1E-18'),
(0.00000000000000001001234567890, '10.0123456789E-18'),
(0.00000000000000001, '10E-18'),
(0.0000000000000001001234567890, '100.123456789E-18'),
(0.0000000000000001, '100E-18'),
(0.000000000000001001234567890, '1.00123456789E-15'),
(0.000000000000001, '1E-15'),
(0.00000000000001001234567890, '10.0123456789E-15'),
(0.00000000000001, '10E-15'),
(0.0000000000001001234567890, '100.123456789E-15'),
(0.0000000000001, '100E-15'),
(0.000000000001001234567890, '1.00123456789E-12'),
(0.000000000001, '1E-12'),
(0.00000000001001234567890, '10.0123456789E-12'),
(0.00000000001, '10E-12'),
(0.0000000001001234567890, '100.123456789E-12'),
(0.0000000001, '100E-12'),
(0.000000001001234567890, '1.00123456789E-9'),
(0.000000001, '1E-9'),
(0.00000001001234567890, '10.0123456789E-9'),
(0.00000001, '10E-9'),
(0.0000001001234567890, '100.123456789E-9'),
(0.0000001, '100E-9'),
(0.000001001234567890, '1.00123456789E-6'),
(0.000001, '1E-6'),
(0.00001001234567890, '10.0123456789E-6'),
(0.00001, '10E-6'),
(0.0001001234567890, '100.123456789E-6'),
(0.0001, '100E-6'),
(0.001001234567890, '1.00123456789E-3'),
(0.001, '1E-3'),
(0.01001234567890, '10.0123456789E-3'),
(0.01, '10E-3'),
(0.1001234567890, '100.123456789E-3'),
(0.1, '100E-3'),
(0, '0E+0'),
(1, '1E+0'),
(1.1234567890, '1.123456789E+0'),
(10, '10E+0'),
(10.1234567890, '10.123456789E+0'),
(100, '100E+0'),
(100.1234567890, '100.123456789E+0'),
(1000, '1E+3'),
(1000.1234567890, pv('1.00012345679E+3', '1.000123456789E+3')),
(10000, '10E+3'),
(10000.1234567890, pv('10.0001234568E+3', '10.000123456789E+3')),
(100000, '100E+3'),
(100000.1234567890, pv('100.000123457E+3', '100.000123456789E+3')),
(1000000, '1E+6'),
(1000000.1234567890, pv('1.00000012346E+6', '1.000000123456789E+6')),
(10000000, '10E+6'),
(10000000.1234567890, pv('10.0000001235E+6', '10.00000012345679E+6')),
(100000000, '100E+6'),
(100000000.1234567890, pv('100.000000123E+6', '100.00000012345679E+6')),
(1000000000, '1E+9'),
(1000000000.1234567890, pv('1.00000000012E+9', '1.0000000001234568E+9')),
(10000000000, '10E+9'),
(10000000000.1234567890, pv(t(9)+'1E+9', '10.000000000123457E+9')),
(100000000000, '100E+9'),
(100000000000.1234567890, pv('100E+9', '100.00000000012346E+9')),
(1000000000000, '1E+12'),
(1000000000000.1234567890, pv('1E+12', '1.0000000000001234E+12')),
(10000000000000, '10E+12'),
(10000000000000.1234567890, pv('10E+12', '10.000000000000123E+12')),
(100000000000000, '100E+12'),
(100000000000000.1234567890, pv('100E+12', '100.00000000000012E+12')),
(1000000000000000, '1E+15'),
(1000000000000000.1234567890, pv('1E+15', '1.0000000000000001E+15')),
(10000000000000000, '10E+15'),
(10000000000000000.1234567890, '10E+15'),
(100000000000000000, '100E+15'),
(100000000000000000.1234567890, '100E+15'),
(1000000000000000000, '1E+18'),
(1000000000000000000.1234567890, '1E+18'),
(10000000000000000000, '10E+18'),
(10000000000000000000.1234567890, '10E+18'),
(100000000000000000000, '100E+18'),
(100000000000000000000.1234567890, '100E+18'),
(1000000000000000000000, '1E+21'),
(1000000000000000000000.1234567890, '1E+21'),
(10000000000000000000000, '10E+21'),
(10000000000000000000000.1234567890, '10E+21'),
(100000000000000000000000, '100E+21'),
(100000000000000000000000.1234567890, pv('100E+21', h(13)+'1E+21')),
(1000000000000000000000000, '1E+24'),
(1000000000000000000000000.1234567890, '1E+24'),
(10000000000000000000000000, '10E+24'),
(10000000000000000000000000.1234567890, '10E+24'),
(100000000000000000000000000, '100E+24'),
(100000000000000000000000000.1234567890, '100E+24'),
(-0.000000000000000000000001001234567890, '-1.00123456789E-24'),
(-0.000000000000000000000001, '-1E-24'),
(-0.00000000000000000000001001234567890, '-10.0123456789E-24'),
(-0.00000000000000000000001, '-10E-24'),
(-0.0000000000000000000001001234567890, '-100.123456789E-24'),
(-0.0000000000000000000001, '-100E-24'),
(-0.000000000000000000001001234567890, '-1.00123456789E-21'),
(-0.000000000000000000001, '-1E-21'),
(-0.00000000000000000001001234567890, '-10.0123456789E-21'),
(-0.00000000000000000001, '-10E-21'),
(-0.0000000000000000001001234567890, '-100.123456789E-21'),
(-0.0000000000000000001, '-100E-21'),
(-0.000000000000000001001234567890, '-1.00123456789E-18'),
(-0.000000000000000001, '-1E-18'),
(-0.00000000000000001001234567890, '-10.0123456789E-18'),
(-0.00000000000000001, '-10E-18'),
(-0.0000000000000001001234567890, '-100.123456789E-18'),
(-0.0000000000000001, '-100E-18'),
(-0.000000000000001001234567890, '-1.00123456789E-15'),
(-0.000000000000001, '-1E-15'),
(-0.00000000000001001234567890, '-10.0123456789E-15'),
(-0.00000000000001, '-10E-15'),
(-0.0000000000001001234567890, '-100.123456789E-15'),
(-0.0000000000001, '-100E-15'),
(-0.000000000001001234567890, '-1.00123456789E-12'),
(-0.000000000001, '-1E-12'),
(-0.00000000001001234567890, '-10.0123456789E-12'),
(-0.00000000001, '-10E-12'),
(-0.0000000001001234567890, '-100.123456789E-12'),
(-0.0000000001, '-100E-12'),
(-0.000000001001234567890, '-1.00123456789E-9'),
(-0.000000001, '-1E-9'),
(-0.00000001001234567890, '-10.0123456789E-9'),
(-0.00000001, '-10E-9'),
(-0.0000001001234567890, '-100.123456789E-9'),
(-0.0000001, '-100E-9'),
(-0.000001001234567890, '-1.00123456789E-6'),
(-0.000001, '-1E-6'),
(-0.00001001234567890, '-10.0123456789E-6'),
(-0.00001, '-10E-6'),
(-0.0001001234567890, '-100.123456789E-6'),
(-0.0001, '-100E-6'),
(-0.001001234567890, '-1.00123456789E-3'),
(-0.001, '-1E-3'),
(-0.01001234567890, '-10.0123456789E-3'),
(-0.01, '-10E-3'),
(-0.1001234567890, '-100.123456789E-3'),
(-0.1, '-100E-3'),
(-1, '-1E+0'),
(-1.1234567890, '-1.123456789E+0'),
(-10, '-10E+0'),
(-10.1234567890, '-10.123456789E+0'),
(-100, '-100E+0'),
(-100.1234567890, '-100.123456789E+0'),
(-1000, '-1E+3'),
(-1000.1234567890, pv('-1.00012345679E+3', '-1.000123456789E+3')),
(-10000, '-10E+3'),
(-10000.1234567890, pv('-10.0001234568E+3', '-10.000123456789E+3')),
(-100000, '-100E+3'),
(-100000.1234567890, pv('-100.000123457E+3', '-100.000123456789E+3')),
(-1000000, '-1E+6'),
(-1000000.1234567890, pv('-1.00000012346E+6', '-1.000000123456789E+6')),
(-10000000, '-10E+6'),
(-10000000.1234567890, pv('-10.0000001235E+6', '-10.00000012345679E+6')),
(-100000000, '-100E+6'),
(-100000000.1234567890, pv('-'+h(6)+'123E+6', '-100.00000012345679E+6')),
(-1000000000, '-1E+9'),
(-1000000000.1234567890, pv('-'+o(9)+'12E+9', '-1.0000000001234568E+9')),
(-10000000000, '-10E+9'),
(-10000000000.1234567890, pv('-'+t(9)+'1E+9', '-'+t(9)+'123457E+9')),
(-100000000000, '-100E+9'),
(-100000000000.1234567890, pv('-100E+9', '-100.00000000012346E+9')),
(-1000000000000, '-1E+12'),
(-1000000000000.1234567890, pv('-1E+12', '-1.0000000000001234E+12')),
(-10000000000000, '-10E+12'),
(-10000000000000.1234567890, pv('-10E+12', '-10.000000000000123E+12')),
(-100000000000000, '-100E+12'),
(-100000000000000.1234567890, pv('-100E+12', '-100.00000000000012E+12')),
(-1000000000000000, '-1E+15'),
(-1000000000000000.1234567890, pv('-1E+15', '-1.0000000000000001E+15')),
(-10000000000000000, '-10E+15'),
(-10000000000000000.1234567890, '-10E+15'),
(-100000000000000000, '-100E+15'),
(-100000000000000000.1234567890, '-100E+15'),
(-1000000000000000000, '-1E+18'),
(-1000000000000000000.1234567890, '-1E+18'),
(-10000000000000000000, '-10E+18'),
(-10000000000000000000.1234567890, '-10E+18'),
(-100000000000000000000, '-100E+18'),
(-100000000000000000000.1234567890, '-100E+18'),
(-1000000000000000000000, '-1E+21'),
(-1000000000000000000000.1234567890, '-1E+21'),
(-10000000000000000000000, '-10E+21'),
(-10000000000000000000000.1234567890, '-10E+21'),
(-100000000000000000000000, '-100E+21'),
(-100000000000000000000000.1234567890, pv('-100E+21', '-'+h(13)+'1E+21')),
(-1000000000000000000000000, '-1E+24'),
(-1000000000000000000000000.1234567890, '-1E+24'),
(-10000000000000000000000000, '-10E+24'),
(-10000000000000000000000000.1234567890, '-10E+24'),
(-100000000000000000000000000, '-100E+24'),
(-100000000000000000000000000.1234567890, '-100E+24'),
('100000.1234567890', '100.000123456789E+3'),
('-100000.1234567890', '-100.000123456789E+3'),
]
)
def test_to_sci_string(num, ref):
""" Test _to_eng_string function behavior """
assert to_sci_string(num) == ref
@pytest.mark.parametrize(
'num, ref', [
(0, '0'),
(0.0, '0.0'),
(4, '4'),
(4.0, '4.0'),
(45, '45'),
(450, '450'),
(1234567, '1234567'),
(4.5, '4.5'),
(4.1234, '4.1234'),
(4123.4E4, '41234000'),
(0.1, '0.1'),
(1.43E-2, '0.0143'),
(100000000.0, '100000000.0'),
(1000000, '1000000'),
(1e3, '1000.0'),
]
)
def test_no_exp(num, ref):
""" Test no_exp function behavior """
assert putil.eng.no_exp(num) == ref
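The expectations above hinge on rendering floats positionally, with no exponent. A minimal sketch of that idea follows; it is illustrative only, since putil.eng.no_exp's trailing-zero rules differ (e.g. it renders 4123.4E4 as '41234000', while this sketch keeps the '.0').

```python
from decimal import Decimal


def no_exp_sketch(number):
    # Render a number without scientific notation.  Sketch of the idea
    # behind putil.eng.no_exp, not its implementation.
    if isinstance(number, int):
        return str(number)
    # repr() gives the shortest exact representation; Decimal expands any
    # exponent it may contain into positional form.
    return format(Decimal(repr(number)), "f")


print(no_exp_sketch(1.43e-2))  # 0.0143
```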
@pytest.mark.eng
def test_no_exp_exceptions():
    """ Test no_exp function exceptions """
    AI(putil.eng.no_exp, 'number', number='a')
@pytest.mark.eng
@pytest.mark.parametrize(
'args, name', [
(dict(number=['5'], frac_length=3, rjust=True), 'number'),
(dict(number=5, frac_length=3.5, rjust=True), 'frac_length'),
(dict(number=5, frac_length=-2, rjust=True), 'frac_length'),
(dict(number=5, frac_length=3, rjust='a'), 'rjust')
]
)
def test_peng_exceptions(args, name):
""" Test peng function exceptions """
AI(putil.eng.peng, name, **args)
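AI and AE used throughout these tests come from the suite's shared helpers, imported near the top of the file. AI's assumed contract, asserting that a call is rejected by argument validation, can be sketched as:

```python
def ai_sketch(fpointer, pname, **kwargs):
    # Sketch of the suite's AI helper.  Assumes putil raises RuntimeError
    # with the message "Argument `<pname>` is not valid" for bad arguments.
    expected = "Argument `{0}` is not valid".format(pname)
    try:
        fpointer(**kwargs)
    except RuntimeError as exc:
        assert str(exc) == expected
    else:
        raise AssertionError("RuntimeError not raised")
```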
@pytest.mark.parametrize(
'num, mant, rjust, ref', [
(3.0333333333, 1, False, '3.0'),
(0, 3, True, ' 0.000 '),
(0, 3, False, '0.000'),
(125.5, 0, False, '126'),
(1e-25, 3, True, ' 1.000y'),
(1e-24, 3, True, ' 1.000y'),
(1e-23, 3, True, ' 10.000y'),
(1e-22, 3, True, ' 100.000y'),
(1e-21, 3, True, ' 1.000z'),
(1e-20, 3, True, ' 10.000z'),
(1e-19, 3, True, ' 100.000z'),
(1e-18, 3, True, ' 1.000a'),
(1e-17, 3, True, ' 10.000a'),
(1e-16, 3, True, ' 100.000a'),
(1e-15, 3, True, ' 1.000f'),
(1e-14, 3, True, ' 10.000f'),
(1e-13, 3, True, ' 100.000f'),
(1e-12, 3, True, ' 1.000p'),
(1e-11, 3, True, ' 10.000p'),
(1e-10, 3, True, ' 100.000p'),
(1e-9, 3, True, ' 1.000n'),
(1e-8, 3, True, ' 10.000n'),
(1e-7, 3, True, ' 100.000n'),
(1e-6, 3, True, ' 1.000u'),
(1e-5, 3, True, ' 10.000u'),
(1e-4, 3, True, ' 100.000u'),
(1e-3, 3, True, ' 1.000m'),
(1e-2, 3, True, ' 10.000m'),
(1e-1, 3, True, ' 100.000m'),
(1e-0, 3, True, ' 1.000 '),
(1e+1, 3, True, ' 10.000 '),
(1e+2, 3, True, ' 100.000 '),
(1e+3, 3, True, ' 1.000k'),
(1e+4, 3, True, ' 10.000k'),
(1e+5, 3, True, ' 100.000k'),
(1e+6, 3, True, ' 1.000M'),
(1e+7, 3, True, ' 10.000M'),
(1e+8, 3, True, ' 100.000M'),
(1e+9, 3, True, ' 1.000G'),
(1e+10, 3, True, ' 10.000G'),
(1e+11, 3, True, ' 100.000G'),
(1e+12, 3, True, ' 1.000T'),
(1e+13, 3, True, ' 10.000T'),
(1e+14, 3, True, ' 100.000T'),
(1e+15, 3, True, ' 1.000P'),
(1e+16, 3, True, ' 10.000P'),
(1e+17, 3, True, ' 100.000P'),
(1e+18, 3, True, ' 1.000E'),
(1e+19, 3, True, ' 10.000E'),
(1e+20, 3, True, ' 100.000E'),
(1e+21, 3, True, ' 1.000Z'),
(1e+22, 3, True, ' 10.000Z'),
(1e+23, 3, True, ' 100.000Z'),
(1e+24, 3, True, ' 1.000Y'),
(1e+25, 3, True, ' 10.000Y'),
(1e+26, 3, True, ' 100.000Y'),
(1e+27, 3, True, ' 999.999Y'),
(12.45, 1, True, ' 12.5 '),
(998.999e3, 1, True, ' 999.0k'),
(998.999e3, 1, False, '999.0k'),
(999.999e3, 1, True, ' 1.0M'),
(999.999e3, 1, DFLT, ' 1.0M'),
(999.999e3, 1, False, '1.0M'),
(0.995, 0, False, '995m'),
(0.9999, 0, False, '1'),
(1.9999, 0, False, '2'),
(999.99, 0, False, '1k'),
(9.99, 1, False, '10.0'),
(5.25e3, 1, True, ' 5.3k'),
(1.05e3, 0, True, ' 1k'),
(-1e-25, 3, True, ' -1.000y'),
(-1e-24, 3, True, ' -1.000y'),
(-1e-23, 3, True, ' -10.000y'),
(-1e-22, 3, True, '-100.000y'),
(-1e-21, 3, True, ' -1.000z'),
(-1e-20, 3, True, ' -10.000z'),
(-1e-19, 3, True, '-100.000z'),
(-1e-18, 3, True, ' -1.000a'),
(-1e-17, 3, True, ' -10.000a'),
(-1e-16, 3, True, '-100.000a'),
(-1e-15, 3, True, ' -1.000f'),
(-1e-14, 3, True, ' -10.000f'),
(-1e-13, 3, True, '-100.000f'),
(-1e-12, 3, True, ' -1.000p'),
(-1e-11, 3, True, ' -10.000p'),
(-1e-10, 3, True, '-100.000p'),
(-1e-9, 3, True, ' -1.000n'),
(-1e-8, 3, True, ' -10.000n'),
(-1e-7, 3, True, '-100.000n'),
(-1e-6, 3, True, ' -1.000u'),
(-1e-5, 3, True, ' -10.000u'),
(-1e-4, 3, True, '-100.000u'),
(-1e-3, 3, True, ' -1.000m'),
(-1e-2, 3, True, ' -10.000m'),
(-1e-1, 3, True, '-100.000m'),
(-1e-0, 3, True, ' -1.000 '),
(-1e+1, 3, True, ' -10.000 '),
(-1e+2, 3, True, '-100.000 '),
(-1e+3, 3, True, ' -1.000k'),
(-1e+4, 3, True, ' -10.000k'),
(-1e+5, 3, True, '-100.000k'),
(-1e+6, 3, True, ' -1.000M'),
(-1e+7, 3, True, ' -10.000M'),
(-1e+8, 3, True, '-100.000M'),
(-1e+9, 3, True, ' -1.000G'),
(-1e+10, 3, True, ' -10.000G'),
(-1e+11, 3, True, '-100.000G'),
(-1e+12, 3, True, ' -1.000T'),
(-1e+13, 3, True, ' -10.000T'),
(-1e+14, 3, True, '-100.000T'),
(-1e+15, 3, True, ' -1.000P'),
(-1e+16, 3, True, ' -10.000P'),
(-1e+17, 3, True, '-100.000P'),
(-1e+18, 3, True, ' -1.000E'),
(-1e+19, 3, True, ' -10.000E'),
(-1e+20, 3, True, '-100.000E'),
(-1e+21, 3, True, ' -1.000Z'),
(-1e+22, 3, True, ' -10.000Z'),
(-1e+23, 3, True, '-100.000Z'),
(-1e+24, 3, True, ' -1.000Y'),
(-1e+25, 3, True, ' -10.000Y'),
(-1e+26, 3, True, '-100.000Y'),
(-1e+27, 3, True, '-999.999Y'),
(-12.45, 1, True, ' -12.5 '),
(-998.999e3, 1, True, '-999.0k'),
(-998.999e3, 1, False, '-999.0k'),
(-999.999e3, 1, True, ' -1.0M'),
(-999.999e3, 1, DFLT, ' -1.0M'),
(-999.999e3, 1, False, '-1.0M'),
(-0.995, 0, False, '-995m'),
(-0.9999, 0, False, '-1'),
(-1.9999, 0, False, '-2'),
(-999.99, 0, False, '-1k'),
(-9.99, 1, False, '-10.0'),
(-5.25e3, 1, True, ' -5.3k'),
(-1.05e3, 0, True, ' -1k')
]
)
def test_peng(num, mant, rjust, ref):
""" Test peng function behavior """
obj = putil.eng.peng
obj = obj if isdflt(rjust) else functools.partial(obj, rjust=rjust)
assert obj(num, mant) == ref
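The table above walks the SI-prefix ladder from y (1e-24) to Y (1e+24) in steps of 10**3. A compact sketch of the mapping follows; it reproduces the suffix selection but not putil's fixed-width rjust padding, the trailing space used for the empty suffix, or rounding overflow (999.999e3 with one fractional digit rolling over to 1.0M).

```python
import math

# Suffixes for powers of 10**3 from 1e-24 (y) up to 1e+24 (Y); the space
# in the middle is the "no suffix" slot for 10**0.
_SUFFIXES = "yzafpnum kMGTPEZY"


def peng_sketch(number, frac_length):
    # Engineering-notation sketch: mantissa in [1, 1000) plus an SI suffix.
    if number == 0:
        return "0.{0}".format("0" * frac_length) if frac_length else "0"
    exp = 3 * int(math.floor(math.log10(abs(number)) / 3.0))
    exp = max(-24, min(24, exp))  # clamp to the available suffixes
    mant = number / 10.0 ** exp
    suffix = _SUFFIXES[(exp + 24) // 3].strip()
    return "{0:.{1}f}{2}".format(mant, frac_length, suffix)


print(peng_sketch(1e3, 3))    # 1.000k
print(peng_sketch(-6e-9, 3))  # -6.000n
```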
@pytest.mark.eng
@pytest.mark.parametrize('arg', [None, 5, '', ' 5x', 'a5M', '- - a5M'])
@pytest.mark.parametrize(
'func', [
putil.eng.peng_float,
putil.eng.peng_frac,
putil.eng.peng_int,
putil.eng.peng_mant,
putil.eng.peng_power,
putil.eng.peng_suffix,
]
)
def test_peng_snum_exceptions(func, arg):
"""
Test exceptions of functions that receive a string representing
a number in engineering notation
"""
    AI(func, 'snum', snum=arg)
@pytest.mark.parametrize(
'arg, ref', [
(putil.eng.peng(5234.567, 3, True), 5.235e3),
(' 5.235k ', 5.235e3),
(' -5.235k ', -5.235e3),
]
)
def test_peng_float(arg, ref):
""" Test peng_float function behavior """
assert putil.eng.peng_float(arg) == ref
@pytest.mark.parametrize(
'arg, ref', [
(putil.eng.peng(5234.567, 6, True), 234567),
(putil.eng.peng(5234, 0, True), 0)
]
)
def test_peng_frac(arg, ref):
""" Test peng_frac function behavior """
assert putil.eng.peng_frac(arg) == ref
def test_peng_int():
""" Test peng_int function behavior """
assert putil.eng.peng_int(putil.eng.peng(5234.567, 6, True)) == 5
def test_peng_mant():
""" Test peng_mant function behavior """
assert putil.eng.peng_mant(putil.eng.peng(5234.567, 3, True)) == 5.235
def test_peng_power():
""" Test peng_power function behavior """
tup = putil.eng.peng_power(putil.eng.peng(1234.567, 3, True))
assert tup == ('k', 1000.0)
assert isinstance(tup[1], float)
@pytest.mark.parametrize(
'arg, ref', [
(putil.eng.peng(1, 3, True), ' '),
(putil.eng.peng(-10.5e-6, 3, False), 'u')
]
)
def test_peng_suffix(arg, ref):
""" Test peng_suffix function behavior """
assert putil.eng.peng_suffix(arg) == ref
@pytest.mark.eng
@pytest.mark.parametrize(
'args, extype, name', [
(dict(suffix='X', offset=-1), RuntimeError, 'suffix'),
(dict(suffix='M', offset='a'), RuntimeError, 'offset'),
(dict(suffix='M', offset=20), ValueError, 'offset'),
]
)
def test_peng_suffix_math_exceptions(args, extype, name):
""" Test peng_suffix_math function exceptions """
AE(putil.eng.peng_suffix_math, extype, sarg(name), **args)
@pytest.mark.parametrize('args, ref', [((' ', 3), 'G'), (('u', -2), 'p')])
def test_peng_suffix_math(args, ref):
""" Test peng_suffix_math function behavior """
assert putil.eng.peng_suffix_math(*args) == ref
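peng_suffix_math moves along the same suffix ladder by whole powers of 10**3; the arithmetic can be sketched as:

```python
# Same ladder as engineering notation: y..m, a space for "no suffix", k..Y.
_SUFFIX_LADDER = "yzafpnum kMGTPEZY"


def peng_suffix_math_sketch(suffix, offset):
    # Move `offset` steps (powers of 1000) along the suffix ladder.
    pos = _SUFFIX_LADDER.index(suffix) + offset
    if not 0 <= pos < len(_SUFFIX_LADDER):
        raise ValueError("Argument `offset` is not valid")
    return _SUFFIX_LADDER[pos]


print(peng_suffix_math_sketch(" ", 3))   # G
print(peng_suffix_math_sketch("u", -2))  # p
```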
@pytest.mark.parametrize(
'num, frac_length, exp_length, sign_always, ref', [
('5.35E+3', DFLT, DFLT, DFLT, '5.35E+3'),
(0, DFLT, DFLT, DFLT, '0E+0'),
(0.1, DFLT, DFLT, DFLT, '1E-1'),
(0.01, DFLT, DFLT, DFLT, '1E-2'),
(0.001, DFLT, DFLT, DFLT, '1E-3'),
(0.00101, DFLT, DFLT, DFLT, '1.01E-3'),
(0.123456789012, DFLT, DFLT, DFLT, '1.23456789012E-1'),
(1234567.89012, DFLT, DFLT, DFLT, '1.23456789012E+6'),
(1, DFLT, DFLT, DFLT, '1E+0'),
(20, DFLT, DFLT, DFLT, '2E+1'),
(100, DFLT, DFLT, DFLT, '1E+2'),
(200, DFLT, DFLT, DFLT, '2E+2'),
(333, DFLT, DFLT, DFLT, '3.33E+2'),
(4567, DFLT, DFLT, DFLT, '4.567E+3'),
(4567.890, DFLT, DFLT, DFLT, '4.56789E+3'),
(500, 3, DFLT, DFLT, '5.000E+2'),
(4567.890, 8, DFLT, DFLT, '4.56789000E+3'),
(99.999, 1, DFLT, DFLT, '1.0E+2'),
(4567.890, DFLT, DFLT, True, '+4.56789E+3'),
(500, 3, DFLT, True, '+5.000E+2'),
(4567.890, 8, DFLT, True, '+4.56789000E+3'),
(99.999, 1, DFLT, True, '+1.0E+2'),
(500, 3, 2, True, '+5.000E+02'),
(4567.890, 8, 3, True, '+4.56789000E+003'),
(9999999999.999, 1, 1, True, '+1.0E+10'),
(-0.1, DFLT, DFLT, DFLT, '-1E-1'),
(-0.01, DFLT, DFLT, DFLT, '-1E-2'),
(-0.001, DFLT, DFLT, DFLT, '-1E-3'),
(-0.00101, DFLT, DFLT, DFLT, '-1.01E-3'),
(-0.123456789012, DFLT, DFLT, DFLT, '-1.23456789012E-1'),
(-1234567.89012, DFLT, DFLT, DFLT, '-1.23456789012E+6'),
(-1, DFLT, DFLT, DFLT, '-1E+0'),
(-20, DFLT, DFLT, DFLT, '-2E+1'),
(-100, DFLT, DFLT, DFLT, '-1E+2'),
(-200, DFLT, DFLT, DFLT, '-2E+2'),
(-333, DFLT, DFLT, DFLT, '-3.33E+2'),
(-4567, DFLT, DFLT, DFLT, '-4.567E+3'),
(-4567.890, DFLT, DFLT, DFLT, '-4.56789E+3'),
(-500, 3, DFLT, DFLT, '-5.000E+2'),
(-4567.890, 8, DFLT, DFLT, '-4.56789000E+3'),
(-99.999, 1, DFLT, DFLT, '-1.0E+2'),
(-4567.890, DFLT, DFLT, True, '-4.56789E+3'),
(-500, 3, DFLT, True, '-5.000E+2'),
(-4567.890, 8, DFLT, True, '-4.56789000E+3'),
(-99.999, 1, DFLT, True, '-1.0E+2'),
(-500, 3, 2, True, '-5.000E+02'),
(-4567.890, 8, 3, True, '-4.56789000E+003'),
(-9999999999.999, 1, 1, True, '-1.0E+10'),
]
)
def test_to_scientific_string(num, frac_length, exp_length, sign_always, ref):
""" Test _to_scientific function behavior """
fp = functools.partial
obj = putil.eng.to_scientific_string
obj = obj if isdflt(frac_length) else fp(obj, frac_length=frac_length)
obj = obj if isdflt(exp_length) else fp(obj, exp_length=exp_length)
obj = obj if isdflt(sign_always) else fp(obj, sign_always=sign_always)
assert obj(num) == ref
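The default expectations above (one significant digit before the point, remaining digits after it, minimal exponent) match what the stdlib decimal module produces; the frac_length, exp_length, and sign_always knobs are putil-specific and not sketched here.

```python
from decimal import Decimal


def sci_sketch(number):
    # Scientific-notation sketch: normalize() drops trailing zeros and
    # "E" formatting puts one digit before the decimal point.
    return format(Decimal(str(number)).normalize(), "E")


print(sci_sketch(4567))     # 4.567E+3
print(sci_sketch(0.00101))  # 1.01E-3
```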
CVECTOR = [-1+2j, 3+4j, 5+6j, 7+8j, 9-10j, 11+12j, -13+14j, 15678-16j]
@pytest.mark.parametrize(
'vector, args, ref, header', [
(
None,
DFLT,
'None',
''
),
(
[1, 2, 3, 4, 5, 6, 7, 8],
DFLT,
'[ 1, 2, 3, 4, 5, 6, 7, 8 ]',
''
),
(
[1, 2, 3, 4, 5, 6, 7, 8],
dict(indent=20),
'[ 1, 2, 3, 4, 5, 6, 7, 8 ]',
''
),
(
[1, 2, 3, 4, 5, 6, 7, 8],
dict(limit=True),
'[ 1, 2, 3, ..., 6, 7, 8 ]',
''
),
(
[1, 2, 3, 4, 5, 6, 7, 8],
dict(limit=True, indent=20),
'[ 1, 2, 3, ..., 6, 7, 8 ]',
''
),
        # Float and integer items
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(eng=True),
'[ 1.000m, 20.000u, 300.000M, 4.000p,'
' 5.250k, -6.000n, 700.000 , 800.000m ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(eng=True, indent=20),
'[ 1.000m, 20.000u, 300.000M, 4.000p,'
' 5.250k, -6.000n, 700.000 , 800.000m ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(limit=True, eng=True),
'[ 1.000m, 20.000u, 300.000M,'
' ...,'
' -6.000n, 700.000 , 800.000m ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(limit=True, eng=True, indent=20),
'[ 1.000m, 20.000u, 300.000M,'
' ...,'
' -6.000n, 700.000 , 800.000m ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(eng=True, frac_length=1),
'[ 1.0m, 20.0u, 300.0M, 4.0p,'
' 5.3k, -6.0n, 700.0 , 800.0m ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(eng=True, frac_length=1, indent=20),
'[ 1.0m, 20.0u, 300.0M, 4.0p,'
' 5.3k, -6.0n, 700.0 , 800.0m ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(limit=True, eng=True, frac_length=1),
'[ 1.0m, 20.0u, 300.0M, ..., -6.0n, 700.0 , 800.0m ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(limit=True, indent=20, eng=True, frac_length=1),
'[ 1.0m, 20.0u, 300.0M, ..., -6.0n, 700.0 , 800.0m ]',
''
),
(
[1, 2, 3, 4, 5, 6, 7, 8],
dict(width=8),
#12345678
'[ 1, 2,\n'
' 3, 4,\n'
' 5, 6,\n'
' 7, 8 ]',
''
),
(
[1, 2, 3, 4, 5, 6, 7, 8],
dict(width=10),
'[ 1, 2, 3,\n'
' 4, 5, 6,\n'
' 7, 8 ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 8, 9],
dict(width=20, eng=True, frac_length=0),
'[ 1m, 20u,\n'
' 300M, 4p,\n'
' 5k, -6n,\n'
' 700 , 8 ,\n'
' 9 ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(width=30, eng=True, frac_length=1),
'[ 1.0m, 20.0u, 300.0M,\n'
' 4.0p, 5.3k, -6.0n,\n'
' 700.0 , 800.0m ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 8, 9],
dict(width=20, eng=True, frac_length=0, limit=True),
'[ 1m,\n'
' 20u,\n'
' 300M,\n'
' ...\n'
' 700 ,\n'
' 8 ,\n'
' 9 ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 8, 9],
dict(width=30, eng=True, frac_length=1, limit=True),
'[ 1.0m, 20.0u, 300.0M,\n'
' ...\n'
' 700.0 , 8.0 , 9.0 ]',
''
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 8, 9],
dict(width=30, eng=True, frac_length=1, limit=True, indent=8),
'Vector: [ 1.0m, 20.0u, 300.0M,\n'
' ...\n'
' 700.0 , 8.0 , 9.0 ]',
'Vector: '
),
(
[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 0.8],
dict(width=30, eng=True, frac_length=1, indent=8),
'Vector: [ 1.0m, 20.0u, 300.0M,\n'
' 4.0p, 5.3k, -6.0n,\n'
' 700.0 , 800.0m ]',
'Vector: '
),
(
[
1.23456789, 2.45678901, 3.45678901, 4.56789012,
5.67890123, 6.78901234, 7.89012345
],
dict(limit=True, width=80-22, indent=22),
'Independent variable: [ 1.23456789, 2.45678901, 3.45678901,\n'
' ...\n'
' 5.67890123, 6.78901234, 7.89012345 ]',
'Independent variable: '
),
(
[
1.23456789, 2.45678901, 3.45678901, 4.56789012,
5.67890123, 6.78901234, 7.89012345
],
dict(width=49, indent=17),
'Independent var: [ 1.23456789, 2.45678901, 3.45678901, '
'4.56789012,\n'
' 5.67890123, 6.78901234, 7.89012345 ]',
'Independent var: '
),
# Complex items
(
CVECTOR,
DFLT,
'[ -1+2j, 3+4j, 5+6j, 7+8j, 9-10j, 11+12j, -13+14j, 15678-16j ]',
''
),
(
CVECTOR,
dict(indent=20),
'[ -1+2j, 3+4j, 5+6j, 7+8j, 9-10j, 11+12j, -13+14j, 15678-16j ]',
''
),
(
CVECTOR,
dict(limit=True),
'[ -1+2j, 3+4j, 5+6j, ..., 11+12j, -13+14j, 15678-16j ]',
''
),
(
CVECTOR,
dict(limit=True, indent=20),
'[ -1+2j, 3+4j, 5+6j, ..., 11+12j, -13+14j, 15678-16j ]',
''
),
(
CVECTOR,
dict(eng=True),
'[ -1.000 + 2.000 j, 3.000 + 4.000 j,'
' 5.000 + 6.000 j,'
' 7.000 + 8.000 j, 9.000 - 10.000 j,'
' 11.000 + 12.000 j,'
' -13.000 + 14.000 j, 15.678k- 16.000 j ]',
''
),
(
CVECTOR,
dict(eng=True, indent=20),
'[ -1.000 + 2.000 j, 3.000 + 4.000 j,'
' 5.000 + 6.000 j,'
' 7.000 + 8.000 j, 9.000 - 10.000 j,'
' 11.000 + 12.000 j,'
' -13.000 + 14.000 j, 15.678k- 16.000 j ]',
''
),
(
CVECTOR,
dict(limit=True, eng=True),
'[ -1.000 + 2.000 j, 3.000 + 4.000 j,'
' 5.000 + 6.000 j,'
' ..., 11.000 + 12.000 j, -13.000 + 14.000 j,'
' 15.678k- 16.000 j ]',
''
),
(
CVECTOR,
dict(limit=True, eng=True, indent=20),
'[ -1.000 + 2.000 j, 3.000 + 4.000 j,'
' 5.000 + 6.000 j,'
' ..., 11.000 + 12.000 j, -13.000 + 14.000 j,'
' 15.678k- 16.000 j ]',
''
),
(
CVECTOR,
dict(eng=True, frac_length=1),
'[ -1.0 + 2.0 j, 3.0 + 4.0 j, 5.0 + 6.0 j,'
' 7.0 + 8.0 j, 9.0 - 10.0 j, 11.0 + 12.0 j,'
' -13.0 + 14.0 j, 15.7k- 16.0 j ]',
''
),
(
CVECTOR,
dict(eng=True, frac_length=1, indent=20),
'[ -1.0 + 2.0 j, 3.0 + 4.0 j, 5.0 + 6.0 j,'
' 7.0 + 8.0 j, 9.0 - 10.0 j, 11.0 + 12.0 j,'
' -13.0 + 14.0 j, 15.7k- 16.0 j ]',
''
),
(
CVECTOR,
dict(limit=True, eng=True, frac_length=1),
'[ -1.0 + 2.0 j, 3.0 + 4.0 j, 5.0 + 6.0 j,'
' ..., 11.0 + 12.0 j, -13.0 + 14.0 j, 15.7k- 16.0 j ]',
''
),
(
CVECTOR,
dict(limit=True, eng=True, frac_length=1, indent=20),
'[ -1.0 + 2.0 j, 3.0 + 4.0 j, 5.0 + 6.0 j,'
' ..., 11.0 + 12.0 j, -13.0 + 14.0 j, 15.7k- 16.0 j ]',
''
),
(
CVECTOR,
dict(width=22),
'[ -1+2j, 3+4j, 5+6j,\n'
' 7+8j, 9-10j, 11+12j,\n'
' -13+14j, 15678-16j ]',
''
),
(
CVECTOR,
dict(width=20),
'[ -1+2j, 3+4j, 5+6j,\n'
' 7+8j, 9-10j,\n'
' 11+12j, -13+14j,\n'
' 15678-16j ]',
''
),
(
CVECTOR,
dict(width=29, eng=True, frac_length=0),
'[ -1 + 2 j, 3 + 4 j,\n'
' 5 + 6 j, 7 + 8 j,\n'
' 9 - 10 j, 11 + 12 j,\n'
' -13 + 14 j, 16k- 16 j ]',
''
),
(
CVECTOR,
dict(width=37, eng=True, frac_length=1),
'[ -1.0 + 2.0 j, 3.0 + 4.0 j,\n'
' 5.0 + 6.0 j, 7.0 + 8.0 j,\n'
' 9.0 - 10.0 j, 11.0 + 12.0 j,\n'
' -13.0 + 14.0 j, 15.7k- 16.0 j ]',
''
),
(
CVECTOR,
dict(width=16, eng=True, frac_length=0),
'[ -1 + 2 j,\n'
' 3 + 4 j,\n'
' 5 + 6 j,\n'
' 7 + 8 j,\n'
' 9 - 10 j,\n'
' 11 + 12 j,\n'
' -13 + 14 j,\n'
' 16k- 16 j ]',
''
),
(
CVECTOR,
dict(width=16, eng=True, frac_length=0, limit=True),
'[ -1 + 2 j,\n'
' 3 + 4 j,\n'
' 5 + 6 j,\n'
' ...\n'
' 11 + 12 j,\n'
' -13 + 14 j,\n'
' 16k- 16 j ]',
''
),
(
CVECTOR,
dict(width=56, eng=True, frac_length=1, limit=True),
'[ -1.0 + 2.0 j, 3.0 + 4.0 j, 5.0 + 6.0 j,\n'
' ...\n'
' 11.0 + 12.0 j, -13.0 + 14.0 j, 15.7k- 16.0 j ]',
''
),
(
CVECTOR,
dict(width=64, eng=True, frac_length=1, limit=True, indent=8),
'Vector: [ -1.0 + 2.0 j, 3.0 + 4.0 j, 5.0 + 6.0 j,\n'
' ...\n'
' 11.0 + 12.0 j, -13.0 + 14.0 j, 15.7k- 16.0 j ]',
'Vector: '
),
(
CVECTOR,
dict(width=20, indent=8),
'Vector: [ -1+2j, 3+4j, 5+6j,\n'
' 7+8j, 9-10j,\n'
' 11+12j, -13+14j,\n'
' 15678-16j ]',
'Vector: '
),
(
CVECTOR,
dict(width=30, indent=8, limit=True),
'Vector: [ -1+2j, 3+4j, 5+6j,\n'
' ...\n'
' 11+12j, -13+14j, 15678-16j ]',
'Vector: '
),
(
CVECTOR,
dict(width=20, indent=8, limit=True),
'Vector: [ -1+2j,\n'
' 3+4j,\n'
' 5+6j,\n'
' ...\n'
' 11+12j,\n'
' -13+14j,\n'
' 15678-16j ]',
'Vector: '
),
(
array(
[
-0.10081675027325637-0.06910517142735251j,
0.018754229185649937+0.017142783560861786j,
0+18j
]
),
DFLT,
'[ -0.100816750273-0.0691051714274j, '
'0.0187542291856+0.0171427835609j, 18j ]',
''
),
(
array(
[
-0.10081675027325637-0.06910517142735251j,
0.018754229185649937+0.017142783560861786j,
0+18j
]
),
dict(width=60, limit=True, indent=20),
'Dependent variable: [ -0.100816750273-0.0691051714274j,\n'
' 0.0187542291856+0.0171427835609j, 18j ]',
'Dependent variable: '
),
(
array(
[
-0.10081675027325637-0.06910517142735251j,
0.018754229185649937+0.017142783560861786j,
0+18j,
0.118754229185649937+0.117142783560861786j,
0.218754229185649937+0.217142783560861786j,
0+28j,
10+2j,
]
),
dict(width=60),
'[ -0.100816750273-0.0691051714274j,\n'
' 0.0187542291856+0.0171427835609j, 18j,\n'
' 0.118754229186+0.117142783561j,\n'
' 0.218754229186+0.217142783561j, 28j, 10+2j ]',
''
),
(
array(
[
-0.10081675027325637-0.06910517142735251j,
0.018754229185649937+0.017142783560861786j,
0+18j,
0.118754229185649937+0.117142783560861786j,
0.218754229185649937+0.217142783560861786j,
0+28j,
10+2j,
]
),
dict(width=60, limit=True),
'[ -0.100816750273-0.0691051714274j,\n'
' 0.0187542291856+0.0171427835609j,\n'
' 18j,\n'
' ...\n'
' 0.218754229186+0.217142783561j,\n'
' 28j,\n'
' 10+2j ]',
''
),
]
)
def test_pprint_vector(vector, args, ref, header):
""" Test pprint_vector function behavior """
obj = putil.eng.pprint_vector
obj = obj if isdflt(args) else functools.partial(obj, **args)
CS(header+obj(vector), ref)
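The limit=True cases above always keep the first three and last three items with an ellipsis in between; that core rule can be sketched as follows (width-driven wrapping and the indent/eng options are left out):

```python
def limit_vector_sketch(items):
    # Keep only the first and last three items once there are more than six.
    shown = [str(item) for item in items]
    if len(shown) > 6:
        shown = shown[:3] + ["..."] + shown[-3:]
    return "[ " + ", ".join(shown) + " ]"


print(limit_vector_sketch([1, 2, 3, 4, 5, 6, 7, 8]))
# [ 1, 2, 3, ..., 6, 7, 8 ]
```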
@pytest.mark.parametrize(
'args', [
dict(
vector=[1e-3, 20e-6, 300e+6, 4e-12, 5.25e3, -6e-9, 700, 8, 9],
width=5, eng=True, frac_length=1, limit=True
),
dict(
vector=[-1+2j, 3, 5+6j, 7+8j, 9-10j, 11+12j, -13+14j, 15678-16j],
width=8, limit=True
)
]
)
@pytest.mark.eng
def test_pprint_vector_exceptions(args):
""" Test pprint_vector function exceptions """
msg = 'Argument `width` is too small'
AE(putil.eng.pprint_vector, ValueError, msg, **args)
@pytest.mark.parametrize(
'num, dec, ref', [
(None, DFLT, None),
(1.3333, 2, 1.33),
(1.5555E-12, 2, 1.56E-12),
(3, 2, 3),
(array([1.3333, 2.666666]), 2, array([1.33, 2.67])),
(array([1.3333E-12, 2.666666E-12]), 2, array([1.33E-12, 2.67E-12])),
(array([1, 3]), 2, array([1, 3])),
]
)
def test_round_mantissa(num, dec, ref):
""" Test round_mantissa function behavior """
obj = putil.eng.round_mantissa
obj = obj if isdflt(dec) else functools.partial(obj, decimals=dec)
test = obj(num) == ref
assert test.all() if isinstance(num, ndarray) else test
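round_mantissa rounds the mantissa while preserving the power-of-ten exponent, which is why 1.5555E-12 stays in the E-12 range instead of collapsing to zero as a plain round to two decimals would. A scalar-only sketch (the function under test also accepts ndarrays):

```python
import math


def round_mantissa_sketch(value, decimals):
    # Split value into mantissa * 10**exp and round only the mantissa.
    if value == 0:
        return 0.0
    exp = int(math.floor(math.log10(abs(value))))
    return round(value / 10.0 ** exp, decimals) * 10.0 ** exp


print(round_mantissa_sketch(1.3333, 2))  # 1.33
```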

# Source: case/scenario_test/test_02_register_login_update.py (repo: lzpsgh/AscTrio, license: MIT)
import allure
import pytest
from serv.zzzzz.user import register_user, login_user, get_one_user_info, update_user
from util.log_util import logger
@allure.step("Step 1 ==>> register user")
def step_1(username, password, telephone, sex, address):
    logger.info("Step 1 ==>> register user ==>> {}, {}, {}, {}, {}".format(username, password, telephone, sex, address))
@allure.step("Step 2 ==>> log in user")
def step_2(username):
    logger.info("Step 2 ==>> logging in admin user: {}".format(username))
@allure.step("Step 3 ==>> look up the new user's ID")
def step_3(id):
    logger.info("Step 3 ==>> new user's ID: {}".format(id))
@allure.step("Step 4 ==>> update user info by ID")
def step_4(id):
    logger.info("Step 4 ==>> updating user with ID: {}".format(id))
@allure.severity(allure.severity_level.BLOCKER)
@allure.epic("Tests for business scenarios")
@allure.feature("Scenario: user registration - user login - user update")
class TestRegLogUpdate:
    @allure.story("Case -- register/login/update -- expected success")
    @allure.description("This case covers the register-login-update scenario")
    @allure.issue("https://www.cnblogs.com/wintest", name="Click to open the linked bug report")
    @allure.testcase("https://www.cnblogs.com/wintest", name="Click to open the linked test case")
    @allure.title("User register/login/update -- expected success")
@pytest.mark.multiple
@pytest.mark.usefixtures("delete_register_user")
def test_user_register_login_update_success(self, testcase_data):
username = testcase_data["register"]["username"]
password = testcase_data["register"]["password"]
telephone = testcase_data["register"]["telephone"]
sex = testcase_data["register"]["sex"]
address = testcase_data["register"]["address"]
admin_user = testcase_data["login"]["admin_user"]
admin_pwd = testcase_data["login"]["admin_pwd"]
new_password = testcase_data["update"]["new_password"]
new_telephone = testcase_data["update"]["new_telephone"]
new_sex = testcase_data["update"]["new_sex"]
new_address = testcase_data["update"]["new_address"]
except_result = testcase_data["except_result"]
except_code = testcase_data["except_code"]
except_msg = testcase_data["except_msg"]
        logger.info("*************** Test case execution started ***************")
result = register_user(username, password, telephone, sex, address)
step_1(username, password, telephone, sex, address)
assert result.success is True, result.error
result = login_user(admin_user, admin_pwd)
step_2(admin_user)
assert result.success is True, result.error
admin_token = result.token
result = get_one_user_info(username)
id = result.response.json().get("data")[0].get("id")
step_3(id)
assert result.success is True, result.error
result = update_user(id, admin_user, new_password, new_telephone, admin_token, new_sex, new_address)
step_4(id)
assert result.success == except_result, result.error
        logger.info("code ==>> expected: {}, actual: [ {} ]".format(except_code, result.response.json().get("code")))
assert result.response.json().get("code") == except_code
assert except_msg in result.msg
        logger.info("*************** Test case execution finished ***************")
    @allure.story("Case -- register/login/update -- expected failure")
    @allure.description("This case covers the register-login-update scenario")
    @allure.issue("https://www.cnblogs.com/wintest", name="Click to open the linked bug report")
    @allure.testcase("https://www.cnblogs.com/wintest", name="Click to open the linked test case")
    @allure.title("User register/login/update -- expected failure")
@pytest.mark.multiple
@pytest.mark.usefixtures("delete_register_user")
def test_user_register_login_update_fail(self, testcase_data):
username = testcase_data["register"]["username"]
password = testcase_data["register"]["password"]
telephone = testcase_data["register"]["telephone"]
sex = testcase_data["register"]["sex"]
address = testcase_data["register"]["address"]
admin_user = testcase_data["login"]["admin_user"]
admin_pwd = testcase_data["login"]["admin_pwd"]
new_password = testcase_data["update"]["new_password"]
new_telephone = testcase_data["update"]["new_telephone"]
new_sex = testcase_data["update"]["new_sex"]
new_address = testcase_data["update"]["new_address"]
except_result = testcase_data["except_result"]
except_code = testcase_data["except_code"]
except_msg = testcase_data["except_msg"]
        logger.info("*************** Test case execution started ***************")
result = register_user(username, password, telephone, sex, address)
step_1(username, password, telephone, sex, address)
assert result.success is True, result.error
result = login_user(admin_user, admin_pwd)
step_2(admin_user)
assert result.success is True, result.error
admin_token = result.token
result = get_one_user_info(username)
id = result.response.json().get("data")[0].get("id")
step_3(id)
assert result.success is True, result.error
result = update_user(id + 1, admin_user, new_password, new_telephone, admin_token, new_sex, new_address)
step_4(id)
assert result.success == except_result, result.error
        logger.info("code ==>> expected: {}, actual: [ {} ]".format(except_code, result.response.json().get("code")))
assert result.response.json().get("code") == except_code
assert except_msg in result.msg
        logger.info("*************** Test case execution finished ***************")
@pytest.mark.negative
    @pytest.mark.skip(reason="skip example: this case cannot be run yet")
def test_user_register_login_update_fail_2(self):
pass
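The two scenario methods above repeat the same ten-field unpacking of testcase_data line by line; a small helper (hypothetical, not part of this project's serv or util packages) could centralize it:

```python
def unpack_case(testcase_data):
    # Hypothetical helper: return the three data sections plus the three
    # expectation fields of a testcase_data dict in one call.
    return (
        testcase_data["register"],
        testcase_data["login"],
        testcase_data["update"],
        testcase_data["except_result"],
        testcase_data["except_code"],
        testcase_data["except_msg"],
    )
```

Each test body could then start with a single `register, login, update, except_result, except_code, except_msg = unpack_case(testcase_data)` line.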
if __name__ == '__main__':
pytest.main(["-q", "-s", "test_02_register_login_update.py"])

# Source: cloudinis/policies/object_resources.py (repo: MosenzonTal/Cloudi, license: FSFAP)
import boto3
class Resource:
@staticmethod
def list_all_of_a_kind(user, region):
raise NotImplementedError
@staticmethod
def destroy_resource(resource_id, user, region):
raise NotImplementedError
@staticmethod
def list_tags(resource, user, region):
raise NotImplementedError
@staticmethod
def list_tags_by_id(resource_id, user, region):
raise NotImplementedError
class EC2(Resource):
@staticmethod
def list_all_of_a_kind(user, region):
client = boto3.client('ec2', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
return client.describe_instances()
@staticmethod
def destroy_resource(resource_id, user, region):
client = boto3.resource('ec2', aws_access_key_id=user.access_key,
aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
client.instances.filter(InstanceIds=[resource_id]).terminate()
@staticmethod
def list_tags(resource, user, region):
tags = {}
try:
for tag in resource["Tags"]:
tags[tag["Key"]] = tag["Value"]
        except (KeyError, TypeError):
return False
return tags
@staticmethod
def list_tags_by_id(resource_id, user, region):
client = boto3.client('ec2', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
resource = client.describe_instances(Filters=[{'Name': 'instance-id', 'Values': [resource_id]}])
tags = {}
try:
for tag in resource["Reservations"][0]["Instances"][0]["Tags"]:
tags[tag["Key"]] = tag["Value"]
        except (KeyError, IndexError):
return False
return tags
class Volume(Resource):
@staticmethod
def list_all_of_a_kind(user, region):
client = boto3.client('ec2', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
return client.describe_volumes()
@staticmethod
def destroy_resource(resource_id, user, region):
client = boto3.client('ec2', aws_access_key_id=user.access_key,
aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
client.delete_volume(VolumeId=resource_id)
@staticmethod
def list_tags(resource, user, region):
tags = {}
try:
for tag in resource["Tags"]:
tags[tag["Key"]] = tag["Value"]
        except (KeyError, TypeError):
return False
return tags
@staticmethod
def list_tags_by_id(resource_id, user, region):
client = boto3.client('ec2', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
resource = client.describe_volumes(Filters=[{'Name': 'volume-id', 'Values': [resource_id]}])
tags = {}
try:
for tag in resource["Volumes"][0]["Tags"]:
tags[tag["Key"]] = tag["Value"]
        except (KeyError, IndexError):
return False
return tags
class S3(Resource):
@staticmethod
def list_all_of_a_kind(user, region):
client = boto3.client('s3', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
return client.list_buckets()
@staticmethod
def destroy_resource(resource_id, user, region):
client = boto3.resource('s3', aws_access_key_id=user.access_key,
aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
bucket = client.Bucket(resource_id)
for key in bucket.objects.all():
key.delete()
bucket.delete()
@staticmethod
def list_tags(resource, user, region):
tags = {}
try:
client = boto3.client('s3', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
response = client.get_bucket_tagging(Bucket=resource["Name"])
try:
for tag in response["TagSet"]:
tags[tag["Key"]] = tag["Value"]
            except KeyError:
return False
        except Exception:  # e.g. ClientError when the bucket has no tag set
return False
return tags
@staticmethod
def list_tags_by_id(resource_id, user, region):
tags = {}
try:
client = boto3.client('s3', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
response = client.get_bucket_tagging(Bucket=resource_id)
try:
for tag in response["TagSet"]:
tags[tag["Key"]] = tag["Value"]
            except KeyError:
return False
        except Exception:  # e.g. ClientError when the bucket has no tag set
return False
return tags
class EIP(Resource):
@staticmethod
def list_all_of_a_kind(user, region):
client = boto3.client('ec2', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
return client.describe_addresses()
@staticmethod
def destroy_resource(resource_id, user, region):
client = boto3.client('ec2', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
client.release_address(AllocationId=resource_id)
@staticmethod
def list_tags(resource, user, region):
tags = {}
try:
for tag in resource["Tags"]:
tags[tag["Key"]] = tag["Value"]
except:
return False
return tags
@staticmethod
def list_tags_by_id(resource_id, user, region):
client = boto3.client('ec2', aws_access_key_id=user.access_key, aws_secret_access_key=user.secret_key,
aws_session_token=user.session_token, region_name=region)
resource = client.describe_addresses(Filters=[{'Name': 'allocation-id', 'Values': [resource_id]}])
tags = {}
try:
for tag in resource["Addresses"][0]["Tags"]:
tags[tag["Key"]] = tag["Value"]
except:
return False
return tags | 35.459184 | 113 | 0.603597 | 796 | 6,950 | 4.978643 | 0.080402 | 0.088569 | 0.071915 | 0.045925 | 0.908907 | 0.908907 | 0.895534 | 0.893515 | 0.860459 | 0.852637 | 0 | 0.006787 | 0.300432 | 6,950 | 196 | 114 | 35.459184 | 0.808309 | 0 | 0 | 0.821656 | 0 | 0 | 0.034384 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.127389 | false | 0 | 0.006369 | 0 | 0.305732 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
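The `TagSet`-to-dict loop repeated in each class above can be factored into a small pure helper, shown here as a sketch. `tagset_to_dict` is a hypothetical name, not an existing function in this module.

```python
def tagset_to_dict(tag_set):
    """Convert an AWS-style TagSet (a list of {'Key': ..., 'Value': ...} dicts)
    into a plain dict.

    Returns False on malformed input, mirroring the error convention used by
    the list_tags methods above.
    """
    try:
        return {tag["Key"]: tag["Value"] for tag in tag_set}
    except (TypeError, KeyError):
        return False
```

Each `list_tags` body could then reduce to a single call, e.g. `tagset_to_dict(response["TagSet"])`.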
from __future__ import absolute_import, division, print_function, unicode_literals
import torch

from tests.utils import jitVsGlow


def test_sub_basic():
    """Basic test of the PyTorch sub Node on Glow."""

    def test_f(a, b):
        c = a.sub(b)
        return c.sub(c)

    x = torch.randn(4)
    y = torch.randn(4)

    jitVsGlow(test_f, x, y, expected_fused_ops={"aten::sub"})


def test_sub_broadcast_1():
    """Test of the PyTorch sub Node on Glow with broadcasting."""

    def test_f(a, b):
        c = a.sub(b)
        return c.sub(c)

    x = torch.randn(8, 3, 4, 2)
    y = torch.randn(4, 2)

    jitVsGlow(test_f, x, y, expected_fused_ops={"aten::sub"})


def test_sub_broadcast_2():
    """Test of the PyTorch sub Node on Glow with broadcasting."""

    def test_f(a, b):
        c = a.sub(b)
        return c.sub(c)

    x = torch.randn(8, 3, 4, 2)
    y = torch.randn(1, 2)

    jitVsGlow(test_f, x, y, expected_fused_ops={"aten::sub"})


def test_sub_broadcast_3():
    """Test of the PyTorch sub Node on Glow with broadcasting."""

    def test_f(a, b):
        c = a.sub(b)
        return c.sub(c)

    x = torch.randn(4, 2)
    y = torch.randn(8, 3, 4, 2)

    jitVsGlow(test_f, x, y, expected_fused_ops={"aten::sub"})


def test_sub_float():
    """Test of the PyTorch aten::sub Node with a float argument"""

    def test_f(a):
        return (a * a).sub(3.9)

    x = torch.randn(4)

    jitVsGlow(test_f, x, expected_fused_ops={"aten::sub"})


def test_sub_int():
    """Test of the PyTorch aten::sub Node with an int argument"""

    def test_f(a):
        return (a * a).sub(20)

    x = torch.randn(4)

    jitVsGlow(test_f, x, expected_fused_ops={"aten::sub"})
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***

import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *

__all__ = ['HeatmapChartArgs', 'HeatmapChart']

@pulumi.input_type
class HeatmapChartArgs:
    def __init__(__self__, *,
                 program_text: pulumi.Input[str],
                 color_range: Optional[pulumi.Input['HeatmapChartColorRangeArgs']] = None,
                 color_scales: Optional[pulumi.Input[Sequence[pulumi.Input['HeatmapChartColorScaleArgs']]]] = None,
                 description: Optional[pulumi.Input[str]] = None,
                 disable_sampling: Optional[pulumi.Input[bool]] = None,
                 group_bies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 hide_timestamp: Optional[pulumi.Input[bool]] = None,
                 max_delay: Optional[pulumi.Input[int]] = None,
                 minimum_resolution: Optional[pulumi.Input[int]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 refresh_interval: Optional[pulumi.Input[int]] = None,
                 sort_by: Optional[pulumi.Input[str]] = None,
                 timezone: Optional[pulumi.Input[str]] = None,
                 unit_prefix: Optional[pulumi.Input[str]] = None):
        """
        The set of arguments for constructing a HeatmapChart resource.
        :param pulumi.Input[str] program_text: Signalflow program text for the chart. More info at <https://developers.signalfx.com/docs/signalflow-overview>.
        :param pulumi.Input['HeatmapChartColorRangeArgs'] color_range: Values and color for the color range. Example: `color_range : { min : 0, max : 100, color : "#0000ff" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        :param pulumi.Input[Sequence[pulumi.Input['HeatmapChartColorScaleArgs']]] color_scales: One to N blocks, each defining a single color range including both the color to display for that range and the borders of the range. Example: `color_scale { gt = 60, color = "blue" } color_scale { lte = 60, color = "yellow" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        :param pulumi.Input[str] description: Description of the chart.
        :param pulumi.Input[bool] disable_sampling: If `false`, samples a subset of the output MTS, which improves UI performance. `false` by default.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] group_bies: Properties to group by in the heatmap (in nesting order).
        :param pulumi.Input[bool] hide_timestamp: Whether to show the timestamp in the chart. `false` by default.
        :param pulumi.Input[int] max_delay: How long (in seconds) to wait for late datapoints.
        :param pulumi.Input[int] minimum_resolution: The minimum resolution (in seconds) to use for computing the underlying program.
        :param pulumi.Input[str] name: Name of the chart.
        :param pulumi.Input[int] refresh_interval: How often (in seconds) to refresh the values of the heatmap.
        :param pulumi.Input[str] sort_by: The property to use when sorting the elements. Must be prepended with `+` for ascending or `-` for descending (e.g. `-foo`).
        :param pulumi.Input[str] timezone: The property value is a string that denotes the geographic region associated with the time zone (default UTC).
        :param pulumi.Input[str] unit_prefix: Must be `"Metric"` or `"Binary"`. `"Metric"` by default.
        """
        pulumi.set(__self__, "program_text", program_text)
        if color_range is not None:
            pulumi.set(__self__, "color_range", color_range)
        if color_scales is not None:
            pulumi.set(__self__, "color_scales", color_scales)
        if description is not None:
            pulumi.set(__self__, "description", description)
        if disable_sampling is not None:
            pulumi.set(__self__, "disable_sampling", disable_sampling)
        if group_bies is not None:
            pulumi.set(__self__, "group_bies", group_bies)
        if hide_timestamp is not None:
            pulumi.set(__self__, "hide_timestamp", hide_timestamp)
        if max_delay is not None:
            pulumi.set(__self__, "max_delay", max_delay)
        if minimum_resolution is not None:
            pulumi.set(__self__, "minimum_resolution", minimum_resolution)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if refresh_interval is not None:
            pulumi.set(__self__, "refresh_interval", refresh_interval)
        if sort_by is not None:
            pulumi.set(__self__, "sort_by", sort_by)
        if timezone is not None:
            pulumi.set(__self__, "timezone", timezone)
        if unit_prefix is not None:
            pulumi.set(__self__, "unit_prefix", unit_prefix)

    @property
    @pulumi.getter(name="programText")
    def program_text(self) -> pulumi.Input[str]:
        """
        Signalflow program text for the chart. More info at <https://developers.signalfx.com/docs/signalflow-overview>.
        """
        return pulumi.get(self, "program_text")

    @program_text.setter
    def program_text(self, value: pulumi.Input[str]):
        pulumi.set(self, "program_text", value)

    @property
    @pulumi.getter(name="colorRange")
    def color_range(self) -> Optional[pulumi.Input['HeatmapChartColorRangeArgs']]:
        """
        Values and color for the color range. Example: `color_range : { min : 0, max : 100, color : "#0000ff" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        """
        return pulumi.get(self, "color_range")

    @color_range.setter
    def color_range(self, value: Optional[pulumi.Input['HeatmapChartColorRangeArgs']]):
        pulumi.set(self, "color_range", value)

    @property
    @pulumi.getter(name="colorScales")
    def color_scales(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['HeatmapChartColorScaleArgs']]]]:
        """
        One to N blocks, each defining a single color range including both the color to display for that range and the borders of the range. Example: `color_scale { gt = 60, color = "blue" } color_scale { lte = 60, color = "yellow" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        """
        return pulumi.get(self, "color_scales")

    @color_scales.setter
    def color_scales(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['HeatmapChartColorScaleArgs']]]]):
        pulumi.set(self, "color_scales", value)

    @property
    @pulumi.getter
    def description(self) -> Optional[pulumi.Input[str]]:
        """
        Description of the chart.
        """
        return pulumi.get(self, "description")

    @description.setter
    def description(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "description", value)

    @property
    @pulumi.getter(name="disableSampling")
    def disable_sampling(self) -> Optional[pulumi.Input[bool]]:
        """
        If `false`, samples a subset of the output MTS, which improves UI performance. `false` by default.
        """
        return pulumi.get(self, "disable_sampling")

    @disable_sampling.setter
    def disable_sampling(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "disable_sampling", value)

    @property
    @pulumi.getter(name="groupBies")
    def group_bies(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
        """
        Properties to group by in the heatmap (in nesting order).
        """
        return pulumi.get(self, "group_bies")

    @group_bies.setter
    def group_bies(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
        pulumi.set(self, "group_bies", value)

    @property
    @pulumi.getter(name="hideTimestamp")
    def hide_timestamp(self) -> Optional[pulumi.Input[bool]]:
        """
        Whether to show the timestamp in the chart. `false` by default.
        """
        return pulumi.get(self, "hide_timestamp")

    @hide_timestamp.setter
    def hide_timestamp(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "hide_timestamp", value)

    @property
    @pulumi.getter(name="maxDelay")
    def max_delay(self) -> Optional[pulumi.Input[int]]:
        """
        How long (in seconds) to wait for late datapoints.
        """
        return pulumi.get(self, "max_delay")

    @max_delay.setter
    def max_delay(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "max_delay", value)

    @property
    @pulumi.getter(name="minimumResolution")
    def minimum_resolution(self) -> Optional[pulumi.Input[int]]:
        """
        The minimum resolution (in seconds) to use for computing the underlying program.
        """
        return pulumi.get(self, "minimum_resolution")

    @minimum_resolution.setter
    def minimum_resolution(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "minimum_resolution", value)

    @property
    @pulumi.getter
    def name(self) -> Optional[pulumi.Input[str]]:
        """
        Name of the chart.
        """
        return pulumi.get(self, "name")

    @name.setter
    def name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "name", value)

    @property
    @pulumi.getter(name="refreshInterval")
    def refresh_interval(self) -> Optional[pulumi.Input[int]]:
        """
        How often (in seconds) to refresh the values of the heatmap.
        """
        return pulumi.get(self, "refresh_interval")

    @refresh_interval.setter
    def refresh_interval(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "refresh_interval", value)

    @property
    @pulumi.getter(name="sortBy")
    def sort_by(self) -> Optional[pulumi.Input[str]]:
        """
        The property to use when sorting the elements. Must be prepended with `+` for ascending or `-` for descending (e.g. `-foo`).
        """
        return pulumi.get(self, "sort_by")

    @sort_by.setter
    def sort_by(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "sort_by", value)

    @property
    @pulumi.getter
    def timezone(self) -> Optional[pulumi.Input[str]]:
        """
        The property value is a string that denotes the geographic region associated with the time zone (default UTC).
        """
        return pulumi.get(self, "timezone")

    @timezone.setter
    def timezone(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "timezone", value)

    @property
    @pulumi.getter(name="unitPrefix")
    def unit_prefix(self) -> Optional[pulumi.Input[str]]:
        """
        Must be `"Metric"` or `"Binary"`. `"Metric"` by default.
        """
        return pulumi.get(self, "unit_prefix")

    @unit_prefix.setter
    def unit_prefix(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "unit_prefix", value)

@pulumi.input_type
class _HeatmapChartState:
    def __init__(__self__, *,
                 color_range: Optional[pulumi.Input['HeatmapChartColorRangeArgs']] = None,
                 color_scales: Optional[pulumi.Input[Sequence[pulumi.Input['HeatmapChartColorScaleArgs']]]] = None,
                 description: Optional[pulumi.Input[str]] = None,
                 disable_sampling: Optional[pulumi.Input[bool]] = None,
                 group_bies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 hide_timestamp: Optional[pulumi.Input[bool]] = None,
                 max_delay: Optional[pulumi.Input[int]] = None,
                 minimum_resolution: Optional[pulumi.Input[int]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 program_text: Optional[pulumi.Input[str]] = None,
                 refresh_interval: Optional[pulumi.Input[int]] = None,
                 sort_by: Optional[pulumi.Input[str]] = None,
                 timezone: Optional[pulumi.Input[str]] = None,
                 unit_prefix: Optional[pulumi.Input[str]] = None,
                 url: Optional[pulumi.Input[str]] = None):
        """
        Input properties used for looking up and filtering HeatmapChart resources.
        :param pulumi.Input['HeatmapChartColorRangeArgs'] color_range: Values and color for the color range. Example: `color_range : { min : 0, max : 100, color : "#0000ff" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        :param pulumi.Input[Sequence[pulumi.Input['HeatmapChartColorScaleArgs']]] color_scales: One to N blocks, each defining a single color range including both the color to display for that range and the borders of the range. Example: `color_scale { gt = 60, color = "blue" } color_scale { lte = 60, color = "yellow" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        :param pulumi.Input[str] description: Description of the chart.
        :param pulumi.Input[bool] disable_sampling: If `false`, samples a subset of the output MTS, which improves UI performance. `false` by default.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] group_bies: Properties to group by in the heatmap (in nesting order).
        :param pulumi.Input[bool] hide_timestamp: Whether to show the timestamp in the chart. `false` by default.
        :param pulumi.Input[int] max_delay: How long (in seconds) to wait for late datapoints.
        :param pulumi.Input[int] minimum_resolution: The minimum resolution (in seconds) to use for computing the underlying program.
        :param pulumi.Input[str] name: Name of the chart.
        :param pulumi.Input[str] program_text: Signalflow program text for the chart. More info at <https://developers.signalfx.com/docs/signalflow-overview>.
        :param pulumi.Input[int] refresh_interval: How often (in seconds) to refresh the values of the heatmap.
        :param pulumi.Input[str] sort_by: The property to use when sorting the elements. Must be prepended with `+` for ascending or `-` for descending (e.g. `-foo`).
        :param pulumi.Input[str] timezone: The property value is a string that denotes the geographic region associated with the time zone (default UTC).
        :param pulumi.Input[str] unit_prefix: Must be `"Metric"` or `"Binary"`. `"Metric"` by default.
        :param pulumi.Input[str] url: The URL of the chart.
        """
        if color_range is not None:
            pulumi.set(__self__, "color_range", color_range)
        if color_scales is not None:
            pulumi.set(__self__, "color_scales", color_scales)
        if description is not None:
            pulumi.set(__self__, "description", description)
        if disable_sampling is not None:
            pulumi.set(__self__, "disable_sampling", disable_sampling)
        if group_bies is not None:
            pulumi.set(__self__, "group_bies", group_bies)
        if hide_timestamp is not None:
            pulumi.set(__self__, "hide_timestamp", hide_timestamp)
        if max_delay is not None:
            pulumi.set(__self__, "max_delay", max_delay)
        if minimum_resolution is not None:
            pulumi.set(__self__, "minimum_resolution", minimum_resolution)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if program_text is not None:
            pulumi.set(__self__, "program_text", program_text)
        if refresh_interval is not None:
            pulumi.set(__self__, "refresh_interval", refresh_interval)
        if sort_by is not None:
            pulumi.set(__self__, "sort_by", sort_by)
        if timezone is not None:
            pulumi.set(__self__, "timezone", timezone)
        if unit_prefix is not None:
            pulumi.set(__self__, "unit_prefix", unit_prefix)
        if url is not None:
            pulumi.set(__self__, "url", url)

    @property
    @pulumi.getter(name="colorRange")
    def color_range(self) -> Optional[pulumi.Input['HeatmapChartColorRangeArgs']]:
        """
        Values and color for the color range. Example: `color_range : { min : 0, max : 100, color : "#0000ff" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        """
        return pulumi.get(self, "color_range")

    @color_range.setter
    def color_range(self, value: Optional[pulumi.Input['HeatmapChartColorRangeArgs']]):
        pulumi.set(self, "color_range", value)

    @property
    @pulumi.getter(name="colorScales")
    def color_scales(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['HeatmapChartColorScaleArgs']]]]:
        """
        One to N blocks, each defining a single color range including both the color to display for that range and the borders of the range. Example: `color_scale { gt = 60, color = "blue" } color_scale { lte = 60, color = "yellow" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        """
        return pulumi.get(self, "color_scales")

    @color_scales.setter
    def color_scales(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['HeatmapChartColorScaleArgs']]]]):
        pulumi.set(self, "color_scales", value)

    @property
    @pulumi.getter
    def description(self) -> Optional[pulumi.Input[str]]:
        """
        Description of the chart.
        """
        return pulumi.get(self, "description")

    @description.setter
    def description(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "description", value)

    @property
    @pulumi.getter(name="disableSampling")
    def disable_sampling(self) -> Optional[pulumi.Input[bool]]:
        """
        If `false`, samples a subset of the output MTS, which improves UI performance. `false` by default.
        """
        return pulumi.get(self, "disable_sampling")

    @disable_sampling.setter
    def disable_sampling(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "disable_sampling", value)

    @property
    @pulumi.getter(name="groupBies")
    def group_bies(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
        """
        Properties to group by in the heatmap (in nesting order).
        """
        return pulumi.get(self, "group_bies")

    @group_bies.setter
    def group_bies(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
        pulumi.set(self, "group_bies", value)

    @property
    @pulumi.getter(name="hideTimestamp")
    def hide_timestamp(self) -> Optional[pulumi.Input[bool]]:
        """
        Whether to show the timestamp in the chart. `false` by default.
        """
        return pulumi.get(self, "hide_timestamp")

    @hide_timestamp.setter
    def hide_timestamp(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "hide_timestamp", value)

    @property
    @pulumi.getter(name="maxDelay")
    def max_delay(self) -> Optional[pulumi.Input[int]]:
        """
        How long (in seconds) to wait for late datapoints.
        """
        return pulumi.get(self, "max_delay")

    @max_delay.setter
    def max_delay(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "max_delay", value)

    @property
    @pulumi.getter(name="minimumResolution")
    def minimum_resolution(self) -> Optional[pulumi.Input[int]]:
        """
        The minimum resolution (in seconds) to use for computing the underlying program.
        """
        return pulumi.get(self, "minimum_resolution")

    @minimum_resolution.setter
    def minimum_resolution(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "minimum_resolution", value)

    @property
    @pulumi.getter
    def name(self) -> Optional[pulumi.Input[str]]:
        """
        Name of the chart.
        """
        return pulumi.get(self, "name")

    @name.setter
    def name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "name", value)

    @property
    @pulumi.getter(name="programText")
    def program_text(self) -> Optional[pulumi.Input[str]]:
        """
        Signalflow program text for the chart. More info at <https://developers.signalfx.com/docs/signalflow-overview>.
        """
        return pulumi.get(self, "program_text")

    @program_text.setter
    def program_text(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "program_text", value)

    @property
    @pulumi.getter(name="refreshInterval")
    def refresh_interval(self) -> Optional[pulumi.Input[int]]:
        """
        How often (in seconds) to refresh the values of the heatmap.
        """
        return pulumi.get(self, "refresh_interval")

    @refresh_interval.setter
    def refresh_interval(self, value: Optional[pulumi.Input[int]]):
        pulumi.set(self, "refresh_interval", value)

    @property
    @pulumi.getter(name="sortBy")
    def sort_by(self) -> Optional[pulumi.Input[str]]:
        """
        The property to use when sorting the elements. Must be prepended with `+` for ascending or `-` for descending (e.g. `-foo`).
        """
        return pulumi.get(self, "sort_by")

    @sort_by.setter
    def sort_by(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "sort_by", value)

    @property
    @pulumi.getter
    def timezone(self) -> Optional[pulumi.Input[str]]:
        """
        The property value is a string that denotes the geographic region associated with the time zone (default UTC).
        """
        return pulumi.get(self, "timezone")

    @timezone.setter
    def timezone(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "timezone", value)

    @property
    @pulumi.getter(name="unitPrefix")
    def unit_prefix(self) -> Optional[pulumi.Input[str]]:
        """
        Must be `"Metric"` or `"Binary"`. `"Metric"` by default.
        """
        return pulumi.get(self, "unit_prefix")

    @unit_prefix.setter
    def unit_prefix(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "unit_prefix", value)

    @property
    @pulumi.getter
    def url(self) -> Optional[pulumi.Input[str]]:
        """
        The URL of the chart.
        """
        return pulumi.get(self, "url")

    @url.setter
    def url(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "url", value)
class HeatmapChart(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
color_range: Optional[pulumi.Input[pulumi.InputType['HeatmapChartColorRangeArgs']]] = None,
color_scales: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['HeatmapChartColorScaleArgs']]]]] = None,
description: Optional[pulumi.Input[str]] = None,
disable_sampling: Optional[pulumi.Input[bool]] = None,
group_bies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
hide_timestamp: Optional[pulumi.Input[bool]] = None,
max_delay: Optional[pulumi.Input[int]] = None,
minimum_resolution: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
program_text: Optional[pulumi.Input[str]] = None,
refresh_interval: Optional[pulumi.Input[int]] = None,
sort_by: Optional[pulumi.Input[str]] = None,
timezone: Optional[pulumi.Input[str]] = None,
unit_prefix: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
This chart type displays the specified plot in a heatmap fashion. This format is similar to the [Infrastructure Navigator](https://signalfx-product-docs.readthedocs-hosted.com/en/latest/built-in-content/infra-nav.html#infra), with squares representing each source for the selected metric, and the color of each square representing the value range of the metric.
## Example Usage
```python
import pulumi
import pulumi_signalfx as signalfx
myheatmapchart0 = signalfx.HeatmapChart("myheatmapchart0",
color_range=signalfx.HeatmapChartColorRangeArgs(
color="#ff0000",
max_value=100,
min_value=0,
),
color_scales=[
signalfx.HeatmapChartColorScaleArgs(
color="green",
gte=99,
),
signalfx.HeatmapChartColorScaleArgs(
color="yellow",
gte=95,
lt=99,
),
signalfx.HeatmapChartColorScaleArgs(
color="red",
lt=95,
),
],
description="Very cool Heatmap",
disable_sampling=True,
group_bies=[
"hostname",
"host",
],
hide_timestamp=True,
program_text=\"\"\"myfilters = filter("cluster_name", "prod") and filter("role", "search")
data("cpu.total.idle", filter=myfilters).publish()
\"\"\",
sort_by="+host",
timezone="Europe/Paris")
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['HeatmapChartColorRangeArgs']] color_range: Values and color for the color range. Example: `color_range : { min : 0, max : 100, color : "#0000ff" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['HeatmapChartColorScaleArgs']]]] color_scales: One to N blocks, each defining a single color range including both the color to display for that range and the borders of the range. Example: `color_scale { gt = 60, color = "blue" } color_scale { lte = 60, color = "yellow" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
:param pulumi.Input[str] description: Description of the chart.
:param pulumi.Input[bool] disable_sampling: If `false`, samples a subset of the output MTS, which improves UI performance. `false` by default.
:param pulumi.Input[Sequence[pulumi.Input[str]]] group_bies: Properties to group by in the heatmap (in nesting order).
:param pulumi.Input[bool] hide_timestamp: Whether to show the timestamp in the chart. `false` by default.
:param pulumi.Input[int] max_delay: How long (in seconds) to wait for late datapoints.
:param pulumi.Input[int] minimum_resolution: The minimum resolution (in seconds) to use for computing the underlying program.
:param pulumi.Input[str] name: Name of the chart.
:param pulumi.Input[str] program_text: Signalflow program text for the chart. More info at <https://developers.signalfx.com/docs/signalflow-overview>.
:param pulumi.Input[int] refresh_interval: How often (in seconds) to refresh the values of the heatmap.
:param pulumi.Input[str] sort_by: The property to use when sorting the elements. Must be prepended with `+` for ascending or `-` for descending (e.g. `-foo`).
:param pulumi.Input[str] timezone: The property value is a string that denotes the geographic region associated with the time zone, (default UTC).
:param pulumi.Input[str] unit_prefix: Must be `"Metric"` or `"Binary`". `"Metric"` by default.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: HeatmapChartArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
This chart type displays the specified plot in a heatmap fashion. This format is similar to the [Infrastructure Navigator](https://signalfx-product-docs.readthedocs-hosted.com/en/latest/built-in-content/infra-nav.html#infra), with squares representing each source for the selected metric, and the color of each square representing the value range of the metric.
## Example Usage
```python
import pulumi
import pulumi_signalfx as signalfx
myheatmapchart0 = signalfx.HeatmapChart("myheatmapchart0",
color_range=signalfx.HeatmapChartColorRangeArgs(
color="#ff0000",
max_value=100,
min_value=0,
),
color_scales=[
signalfx.HeatmapChartColorScaleArgs(
color="green",
gte=99,
),
signalfx.HeatmapChartColorScaleArgs(
color="yellow",
gte=95,
lt=99,
),
signalfx.HeatmapChartColorScaleArgs(
color="red",
lt=95,
),
],
description="Very cool Heatmap",
disable_sampling=True,
group_bies=[
"hostname",
"host",
],
hide_timestamp=True,
program_text=\"\"\"myfilters = filter("cluster_name", "prod") and filter("role", "search")
data("cpu.total.idle", filter=myfilters).publish()
\"\"\",
sort_by="+host",
timezone="Europe/Paris")
```
:param str resource_name: The name of the resource.
:param HeatmapChartArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(HeatmapChartArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
color_range: Optional[pulumi.Input[pulumi.InputType['HeatmapChartColorRangeArgs']]] = None,
color_scales: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['HeatmapChartColorScaleArgs']]]]] = None,
description: Optional[pulumi.Input[str]] = None,
disable_sampling: Optional[pulumi.Input[bool]] = None,
group_bies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
hide_timestamp: Optional[pulumi.Input[bool]] = None,
max_delay: Optional[pulumi.Input[int]] = None,
minimum_resolution: Optional[pulumi.Input[int]] = None,
name: Optional[pulumi.Input[str]] = None,
program_text: Optional[pulumi.Input[str]] = None,
refresh_interval: Optional[pulumi.Input[int]] = None,
sort_by: Optional[pulumi.Input[str]] = None,
timezone: Optional[pulumi.Input[str]] = None,
unit_prefix: Optional[pulumi.Input[str]] = None,
__props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = HeatmapChartArgs.__new__(HeatmapChartArgs)

            __props__.__dict__["color_range"] = color_range
            __props__.__dict__["color_scales"] = color_scales
            __props__.__dict__["description"] = description
            __props__.__dict__["disable_sampling"] = disable_sampling
            __props__.__dict__["group_bies"] = group_bies
            __props__.__dict__["hide_timestamp"] = hide_timestamp
            __props__.__dict__["max_delay"] = max_delay
            __props__.__dict__["minimum_resolution"] = minimum_resolution
            __props__.__dict__["name"] = name
            if program_text is None and not opts.urn:
                raise TypeError("Missing required property 'program_text'")
            __props__.__dict__["program_text"] = program_text
            __props__.__dict__["refresh_interval"] = refresh_interval
            __props__.__dict__["sort_by"] = sort_by
            __props__.__dict__["timezone"] = timezone
            __props__.__dict__["unit_prefix"] = unit_prefix
            __props__.__dict__["url"] = None
        super(HeatmapChart, __self__).__init__(
            'signalfx:index/heatmapChart:HeatmapChart',
            resource_name,
            __props__,
            opts)

    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            color_range: Optional[pulumi.Input[pulumi.InputType['HeatmapChartColorRangeArgs']]] = None,
            color_scales: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['HeatmapChartColorScaleArgs']]]]] = None,
            description: Optional[pulumi.Input[str]] = None,
            disable_sampling: Optional[pulumi.Input[bool]] = None,
            group_bies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
            hide_timestamp: Optional[pulumi.Input[bool]] = None,
            max_delay: Optional[pulumi.Input[int]] = None,
            minimum_resolution: Optional[pulumi.Input[int]] = None,
            name: Optional[pulumi.Input[str]] = None,
            program_text: Optional[pulumi.Input[str]] = None,
            refresh_interval: Optional[pulumi.Input[int]] = None,
            sort_by: Optional[pulumi.Input[str]] = None,
            timezone: Optional[pulumi.Input[str]] = None,
            unit_prefix: Optional[pulumi.Input[str]] = None,
            url: Optional[pulumi.Input[str]] = None) -> 'HeatmapChart':
"""
Get an existing HeatmapChart resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['HeatmapChartColorRangeArgs']] color_range: Values and color for the color range. Example: `color_range : { min : 0, max : 100, color : "#0000ff" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['HeatmapChartColorScaleArgs']]]] color_scales: One to N blocks, each defining a single color range including both the color to display for that range and the borders of the range. Example: `color_scale { gt = 60, color = "blue" } color_scale { lte = 60, color = "yellow" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
:param pulumi.Input[str] description: Description of the chart.
:param pulumi.Input[bool] disable_sampling: If `false`, samples a subset of the output MTS, which improves UI performance. `false` by default.
:param pulumi.Input[Sequence[pulumi.Input[str]]] group_bies: Properties to group by in the heatmap (in nesting order).
:param pulumi.Input[bool] hide_timestamp: Whether to show the timestamp in the chart. `false` by default.
:param pulumi.Input[int] max_delay: How long (in seconds) to wait for late datapoints.
:param pulumi.Input[int] minimum_resolution: The minimum resolution (in seconds) to use for computing the underlying program.
:param pulumi.Input[str] name: Name of the chart.
:param pulumi.Input[str] program_text: Signalflow program text for the chart. More info at <https://developers.signalfx.com/docs/signalflow-overview>.
:param pulumi.Input[int] refresh_interval: How often (in seconds) to refresh the values of the heatmap.
:param pulumi.Input[str] sort_by: The property to use when sorting the elements. Must be prepended with `+` for ascending or `-` for descending (e.g. `-foo`).
:param pulumi.Input[str] timezone: The property value is a string that denotes the geographic region associated with the time zone, (default UTC).
:param pulumi.Input[str] unit_prefix: Must be `"Metric"` or `"Binary`". `"Metric"` by default.
:param pulumi.Input[str] url: The URL of the chart.
"""
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = _HeatmapChartState.__new__(_HeatmapChartState)

        __props__.__dict__["color_range"] = color_range
        __props__.__dict__["color_scales"] = color_scales
        __props__.__dict__["description"] = description
        __props__.__dict__["disable_sampling"] = disable_sampling
        __props__.__dict__["group_bies"] = group_bies
        __props__.__dict__["hide_timestamp"] = hide_timestamp
        __props__.__dict__["max_delay"] = max_delay
        __props__.__dict__["minimum_resolution"] = minimum_resolution
        __props__.__dict__["name"] = name
        __props__.__dict__["program_text"] = program_text
        __props__.__dict__["refresh_interval"] = refresh_interval
        __props__.__dict__["sort_by"] = sort_by
        __props__.__dict__["timezone"] = timezone
        __props__.__dict__["unit_prefix"] = unit_prefix
        __props__.__dict__["url"] = url
        return HeatmapChart(resource_name, opts=opts, __props__=__props__)

    @property
    @pulumi.getter(name="colorRange")
    def color_range(self) -> pulumi.Output[Optional['outputs.HeatmapChartColorRange']]:
        """
        Values and color for the color range. Example: `color_range : { min : 0, max : 100, color : "#0000ff" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        """
        return pulumi.get(self, "color_range")

    @property
    @pulumi.getter(name="colorScales")
    def color_scales(self) -> pulumi.Output[Optional[Sequence['outputs.HeatmapChartColorScale']]]:
        """
        One to N blocks, each defining a single color range including both the color to display for that range and the borders of the range. Example: `color_scale { gt = 60, color = "blue" } color_scale { lte = 60, color = "yellow" }`. Look at this [link](https://docs.signalfx.com/en/latest/charts/chart-options-tab.html).
        """
        return pulumi.get(self, "color_scales")

    @property
    @pulumi.getter
    def description(self) -> pulumi.Output[Optional[str]]:
        """
        Description of the chart.
        """
        return pulumi.get(self, "description")

    @property
    @pulumi.getter(name="disableSampling")
    def disable_sampling(self) -> pulumi.Output[Optional[bool]]:
        """
        If `false`, samples a subset of the output MTS, which improves UI performance. `false` by default.
        """
        return pulumi.get(self, "disable_sampling")

    @property
    @pulumi.getter(name="groupBies")
    def group_bies(self) -> pulumi.Output[Optional[Sequence[str]]]:
        """
        Properties to group by in the heatmap (in nesting order).
        """
        return pulumi.get(self, "group_bies")

    @property
    @pulumi.getter(name="hideTimestamp")
    def hide_timestamp(self) -> pulumi.Output[Optional[bool]]:
        """
        Whether to show the timestamp in the chart. `false` by default.
        """
        return pulumi.get(self, "hide_timestamp")

    @property
    @pulumi.getter(name="maxDelay")
    def max_delay(self) -> pulumi.Output[Optional[int]]:
        """
        How long (in seconds) to wait for late datapoints.
        """
        return pulumi.get(self, "max_delay")

    @property
    @pulumi.getter(name="minimumResolution")
    def minimum_resolution(self) -> pulumi.Output[Optional[int]]:
        """
        The minimum resolution (in seconds) to use for computing the underlying program.
        """
        return pulumi.get(self, "minimum_resolution")

    @property
    @pulumi.getter
    def name(self) -> pulumi.Output[str]:
        """
        Name of the chart.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter(name="programText")
    def program_text(self) -> pulumi.Output[str]:
        """
        Signalflow program text for the chart. More info at <https://developers.signalfx.com/docs/signalflow-overview>.
        """
        return pulumi.get(self, "program_text")

    @property
    @pulumi.getter(name="refreshInterval")
    def refresh_interval(self) -> pulumi.Output[Optional[int]]:
        """
        How often (in seconds) to refresh the values of the heatmap.
        """
        return pulumi.get(self, "refresh_interval")

    @property
    @pulumi.getter(name="sortBy")
    def sort_by(self) -> pulumi.Output[Optional[str]]:
        """
        The property to use when sorting the elements. Must be prepended with `+` for ascending or `-` for descending (e.g. `-foo`).
        """
        return pulumi.get(self, "sort_by")

    @property
    @pulumi.getter
    def timezone(self) -> pulumi.Output[Optional[str]]:
        """
        A string that denotes the geographic region associated with the time zone (default UTC).
        """
        return pulumi.get(self, "timezone")

    @property
    @pulumi.getter(name="unitPrefix")
    def unit_prefix(self) -> pulumi.Output[Optional[str]]:
        """
        Must be `"Metric"` or `"Binary"`. `"Metric"` by default.
        """
        return pulumi.get(self, "unit_prefix")

    @property
    @pulumi.getter
    def url(self) -> pulumi.Output[str]:
        """
        The URL of the chart.
        """
        return pulumi.get(self, "url")
| 48.450801 | 429 | 0.642469 | 5,006 | 42,346 | 5.257491 | 0.059728 | 0.091113 | 0.091683 | 0.045974 | 0.929519 | 0.917702 | 0.907215 | 0.902542 | 0.897944 | 0.874425 | 0 | 0.003767 | 0.241416 | 42,346 | 873 | 430 | 48.5063 | 0.815553 | 0.37869 | 0 | 0.844721 | 1 | 0 | 0.109437 | 0.023644 | 0 | 0 | 0 | 0 | 0 | 1 | 0.165631 | false | 0.00207 | 0.014493 | 0 | 0.279503 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
aa38eee1d74b74a40d9bbb669bda282dffcf63be | 146 | py | Python | tests/basics/int_divzero.py | learnforpractice/micropython-cpp | 004bc8382f74899e7b876cc29bfa6a9cc976ba10 | [
"MIT"
] | 13,648 | 2015-01-01T01:34:51.000Z | 2022-03-31T16:19:53.000Z | tests/basics/int_divzero.py | learnforpractice/micropython-cpp | 004bc8382f74899e7b876cc29bfa6a9cc976ba10 | [
"MIT"
] | 7,092 | 2015-01-01T07:59:11.000Z | 2022-03-31T23:52:18.000Z | tests/basics/int_divzero.py | learnforpractice/micropython-cpp | 004bc8382f74899e7b876cc29bfa6a9cc976ba10 | [
"MIT"
] | 4,942 | 2015-01-02T11:48:50.000Z | 2022-03-31T19:57:10.000Z | try:
    1 // 0
except ZeroDivisionError:
    print("ZeroDivisionError")

try:
    1 % 0
except ZeroDivisionError:
    print("ZeroDivisionError")
| 14.6 | 30 | 0.684932 | 14 | 146 | 7.142857 | 0.428571 | 0.08 | 0.1 | 0.22 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0.034783 | 0.212329 | 146 | 9 | 31 | 16.222222 | 0.834783 | 0 | 0 | 0.75 | 0 | 0 | 0.232877 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
aa3f27edc9a943f12ba448a10d633f14a93b1a0a | 186 | py | Python | movie_classifier/api/health.py | daniele21/Genre_Detection | c79c62e1c784a3bd89100b791dbe2d717ce2f72e | [
"MIT"
] | null | null | null | movie_classifier/api/health.py | daniele21/Genre_Detection | c79c62e1c784a3bd89100b791dbe2d717ce2f72e | [
"MIT"
] | null | null | null | movie_classifier/api/health.py | daniele21/Genre_Detection | c79c62e1c784a3bd89100b791dbe2d717ce2f72e | [
"MIT"
] | null | null | null | from flask import make_response, jsonify
from flask import current_app as app


@app.route('/health', methods=['GET'])
def health():
    return make_response(jsonify({'Status': 'OK'}))
| 20.666667 | 51 | 0.715054 | 26 | 186 | 5 | 0.653846 | 0.138462 | 0.230769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134409 | 186 | 8 | 52 | 23.25 | 0.807453 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
a4d00f95fddf53b77c9a706e186a5575e2b18e2f | 8,033 | py | Python | Resnet18_3D/model.py | aryaman4152/model-implementations-PyTorch | a748bb3661fb26f9230d7420182f170f812298d7 | [
"MIT"
] | 1 | 2021-07-03T08:50:16.000Z | 2021-07-03T08:50:16.000Z | Resnet18_3D/model.py | aryaman4152/model-implementations-pytorch | a748bb3661fb26f9230d7420182f170f812298d7 | [
"MIT"
] | null | null | null | Resnet18_3D/model.py | aryaman4152/model-implementations-pytorch | a748bb3661fb26f9230d7420182f170f812298d7 | [
"MIT"
] | null | null | null | __author__ = "Aryaman Sharma"

import torch


class Conv1(torch.nn.Module):
    def __init__(self, in_channels=1, out_channels=64):
        super(Conv1, self).__init__()
        self.mpool = torch.nn.MaxPool3d(kernel_size=3, stride=2, padding=3)
        # Stem convolution; use the constructor arguments rather than hard-coded channel counts.
        self.conv1 = torch.nn.Conv3d(in_channels=in_channels, out_channels=out_channels, stride=2, padding=3, kernel_size=7, bias=False)
        self.bn1 = torch.nn.BatchNorm3d(out_channels)
        self.relu = torch.nn.ReLU()

    def forward(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)
        return x


class Conv2_x(torch.nn.Module):
    def __init__(self, in_channels=64, out_channels=64):
        super(Conv2_x, self).__init__()
        self.mpool = torch.nn.MaxPool3d(kernel_size=3, stride=2, padding=1)
        self.conv2_1 = torch.nn.Conv3d(in_channels=in_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn1 = torch.nn.BatchNorm3d(out_channels)
        self.relu = torch.nn.ReLU()
        self.conv2_2 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn2 = torch.nn.BatchNorm3d(out_channels)
        self.conv2_3 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn3 = torch.nn.BatchNorm3d(out_channels)
        self.conv2_4 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn4 = torch.nn.BatchNorm3d(out_channels)
        self.identity_downsample = torch.nn.Sequential(
            torch.nn.Conv3d(in_channels, out_channels, kernel_size=1, stride=2, bias=False),
            torch.nn.BatchNorm3d(out_channels)
        )

    def forward(self, x):
        # Two residual blocks; the skip connection is downsampled to match the pooled input.
        idx = x.clone()
        idx = self.identity_downsample(idx)
        x = self.mpool(x)
        x = self.conv2_1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.conv2_2(x)
        x = self.bn2(x)
        x += idx
        x = self.relu(x)
        idx2 = x.clone()
        x = self.conv2_3(x)
        x = self.bn3(x)
        x = self.relu(x)
        x = self.conv2_4(x)
        x = self.bn4(x)
        x += idx2
        x = self.relu(x)
        return x


class Conv3_x(torch.nn.Module):
    def __init__(self, in_channels=64, out_channels=128):
        super(Conv3_x, self).__init__()
        self.conv3_1 = torch.nn.Conv3d(in_channels=in_channels, out_channels=out_channels, kernel_size=3, stride=2, padding=1, bias=False)
        self.bn1 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_2 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn2 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_3 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn3 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_4 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn4 = torch.nn.BatchNorm3d(out_channels)
        self.relu = torch.nn.ReLU()
        self.identity_downsample = torch.nn.Sequential(
            torch.nn.Conv3d(in_channels, out_channels, kernel_size=1, stride=2, bias=False),
            torch.nn.BatchNorm3d(out_channels)
        )

    def forward(self, x):
        idx = x.clone()
        idx = self.identity_downsample(idx)
        x = self.conv3_1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.conv3_2(x)
        x = self.bn2(x)
        x += idx
        x = self.relu(x)
        idx2 = x.clone()
        x = self.conv3_3(x)
        x = self.bn3(x)
        x = self.relu(x)
        x = self.conv3_4(x)
        x = self.bn4(x)
        x += idx2
        x = self.relu(x)
        return x


class Conv4_x(torch.nn.Module):
    def __init__(self, in_channels=128, out_channels=256):
        super(Conv4_x, self).__init__()
        self.conv3_1 = torch.nn.Conv3d(in_channels=in_channels, out_channels=out_channels, kernel_size=3, stride=2, padding=1, bias=False)
        self.bn1 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_2 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn2 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_3 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn3 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_4 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn4 = torch.nn.BatchNorm3d(out_channels)
        self.relu = torch.nn.ReLU()
        self.identity_downsample = torch.nn.Sequential(
            torch.nn.Conv3d(in_channels, out_channels, kernel_size=1, stride=2, bias=False),
            torch.nn.BatchNorm3d(out_channels)
        )

    def forward(self, x):
        idx = x.clone()
        idx = self.identity_downsample(idx)
        x = self.conv3_1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.conv3_2(x)
        x = self.bn2(x)
        x += idx
        x = self.relu(x)
        idx2 = x.clone()
        x = self.conv3_3(x)
        x = self.bn3(x)
        x = self.relu(x)
        x = self.conv3_4(x)
        x = self.bn4(x)
        x += idx2
        x = self.relu(x)
        return x


class Conv5_x(torch.nn.Module):
    def __init__(self, in_channels=256, out_channels=512):
        super(Conv5_x, self).__init__()
        self.conv3_1 = torch.nn.Conv3d(in_channels=in_channels, out_channels=out_channels, kernel_size=3, stride=2, padding=1, bias=False)
        self.bn1 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_2 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn2 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_3 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn3 = torch.nn.BatchNorm3d(out_channels)
        self.conv3_4 = torch.nn.Conv3d(in_channels=out_channels, out_channels=out_channels, kernel_size=3, stride=1, padding=1, bias=False)
        self.bn4 = torch.nn.BatchNorm3d(out_channels)
        self.relu = torch.nn.ReLU()
        self.identity_downsample = torch.nn.Sequential(
            torch.nn.Conv3d(in_channels, out_channels, kernel_size=1, stride=2, bias=False),
            torch.nn.BatchNorm3d(out_channels)
        )

    def forward(self, x):
        idx = x.clone()
        idx = self.identity_downsample(idx)
        x = self.conv3_1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.conv3_2(x)
        x = self.bn2(x)
        x += idx
        x = self.relu(x)
        idx2 = x.clone()
        x = self.conv3_3(x)
        x = self.bn3(x)
        x = self.relu(x)
        x = self.conv3_4(x)
        x = self.bn4(x)
        x += idx2
        x = self.relu(x)
        return x


class Res183d(torch.nn.Module):
    def __init__(self, in_channels=1, num_classes=1):
        super(Res183d, self).__init__()
        # Pass the configured input channel count through to the stem.
        self.conv1 = Conv1(in_channels=in_channels, out_channels=64)
        self.conv2 = Conv2_x(in_channels=64, out_channels=64)
        self.conv3 = Conv3_x(in_channels=64, out_channels=128)
        self.conv4 = Conv4_x(in_channels=128, out_channels=256)
        self.conv5 = Conv5_x(in_channels=256, out_channels=512)
        self.apool = torch.nn.AdaptiveAvgPool3d((1, 1, 1))
        self.fc = torch.nn.Linear(512, num_classes)

    def forward(self, x):
        x = self.conv1(x)
        x = self.conv2(x)
        x = self.conv3(x)
        x = self.conv4(x)
        x = self.conv5(x)
        x = self.apool(x)
        x = torch.flatten(x, 1)
        x = self.fc(x)
        return x
| 40.366834 | 140 | 0.622432 | 1,192 | 8,033 | 3.989933 | 0.050336 | 0.182717 | 0.191758 | 0.129521 | 0.909378 | 0.902649 | 0.859125 | 0.84714 | 0.845038 | 0.812658 | 0 | 0.051635 | 0.250218 | 8,033 | 198 | 141 | 40.570707 | 0.738004 | 0 | 0 | 0.714286 | 0 | 0 | 0.001743 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068571 | false | 0 | 0 | 0 | 0.137143 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a4e6119a73def9701ccc30a479e667b7a8e5e7b4 | 3,295 | py | Python | src/apps/surveys19/tests/setup.py | travishen/alss-dev | 226e8c4f933de39615775a504191428591962c9f | [
"MIT"
] | null | null | null | src/apps/surveys19/tests/setup.py | travishen/alss-dev | 226e8c4f933de39615775a504191428591962c9f | [
"MIT"
] | null | null | null | src/apps/surveys19/tests/setup.py | travishen/alss-dev | 226e8c4f933de39615775a504191428591962c9f | [
"MIT"
] | null | null | null | from django.core.management import call_command
def setup_fixtures():
    call_command("loaddata", "fixtures/surveys19/age-scope.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/city-town-code.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/contract.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/education-level.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/farm-related-business.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/farmer-work-day.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/gender.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/income-range.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/lack.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/land-status.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/product-type.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/unit.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/land-type.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/life-style.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/loss.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/management-type.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/market-type.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/month.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/product.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/refuse-reason.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/relationship.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/work-type.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/survey.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/addressmatch.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/annualincome.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/business.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/cropmarketing.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/landarea.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/livestockmarketing.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/longtermhire.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/longtermlack.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/nosalaryhire.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/numberworkers.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/phone.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/population.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/subsidy.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/refuse.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/shorttermhire.yaml", verbosity=0)
    call_command("loaddata", "fixtures/surveys19/test/shorttermlack.yaml", verbosity=0)
| 71.630435 | 92 | 0.760243 | 395 | 3,295 | 6.237975 | 0.156962 | 0.178571 | 0.300731 | 0.427354 | 0.836851 | 0.82224 | 0.82224 | 0.82224 | 0.578734 | 0 | 0 | 0.038845 | 0.085888 | 3,295 | 45 | 93 | 73.222222 | 0.779216 | 0 | 0 | 0 | 0 | 0 | 0.527769 | 0.43308 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02439 | true | 0 | 0.02439 | 0 | 0.04878 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a4ecf5346c3410b532a070b6450b22a104a703ca | 152 | py | Python | models/__init__.py | zhangtianer521/GAT | 9697c57127caacd824312289f9b9bd7db71b74a2 | [
"MIT"
] | null | null | null | models/__init__.py | zhangtianer521/GAT | 9697c57127caacd824312289f9b9bd7db71b74a2 | [
"MIT"
] | null | null | null | models/__init__.py | zhangtianer521/GAT | 9697c57127caacd824312289f9b9bd7db71b74a2 | [
"MIT"
] | null | null | null | from .gat import GAT_BNF, GAT_bi_BNF, Brainnetcnn, GAT_AE_BNF, GAT_AE_BNF_single, GAT_FC_BNF, GAT_Node2Vec_BNF, GAT_AE_BNF_V2
from .sp_gat import SpGAT
| 50.666667 | 125 | 0.842105 | 31 | 152 | 3.612903 | 0.419355 | 0.214286 | 0.214286 | 0.196429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014599 | 0.098684 | 152 | 2 | 126 | 76 | 0.80292 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a4f737225bf5f7d72e48efd8fb393145a3b07d97 | 5,123 | py | Python | tests/search_test.py | mtcolman/django-DefectDojo | 76175aca446e077884bdb5e1d8e2a671a0840775 | [
"BSD-3-Clause"
] | 249 | 2016-09-06T21:04:40.000Z | 2018-01-19T15:59:44.000Z | tests/search_test.py | mtcolman/django-DefectDojo | 76175aca446e077884bdb5e1d8e2a671a0840775 | [
"BSD-3-Clause"
] | 275 | 2021-02-19T15:16:15.000Z | 2022-03-31T21:09:29.000Z | tests/search_test.py | mtcolman/django-DefectDojo | 76175aca446e077884bdb5e1d8e2a671a0840775 | [
"BSD-3-Clause"
] | 152 | 2016-09-06T21:04:54.000Z | 2018-01-18T08:52:24.000Z | import unittest
import sys
from base_test_class import BaseTestCase
from selenium.webdriver.common.by import By


class SearchTests(BaseTestCase):

    def test_login(self):
        driver = self.driver

    def test_search(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('finding')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_cve(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('cve:CVE-2020-12345')
        driver.find_element(By.ID, "simple_search_submit").click()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('CVE-2020-12345')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_tag(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('tag:magento')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_product_tag(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('product-tag:java')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_engagement_tag(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('engagement-tag:php')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_test_tag(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('test-tag:go')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_tags(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('tags:php')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_product_tags(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('product-tags:java')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_engagement_tags(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('engagement-tags:php')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_test_tags(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('test-tags:go')
        driver.find_element(By.ID, "simple_search_submit").click()

    def test_search_id(self):
        # very basic search test to see if it doesn't 500
        driver = self.goto_some_page()
        driver.find_element(By.ID, "simple_search").clear()
        driver.find_element(By.ID, "simple_search").send_keys('id:1')
        driver.find_element(By.ID, "simple_search_submit").click()


def suite():
    suite = unittest.TestSuite()
    suite.addTest(BaseTestCase('test_login'))
    suite.addTest(BaseTestCase('disable_block_execution'))
    suite.addTest(SearchTests('test_search'))
    suite.addTest(SearchTests('test_search_cve'))
    suite.addTest(SearchTests('test_search_tag'))
    suite.addTest(SearchTests('test_search_product_tag'))
    suite.addTest(SearchTests('test_search_engagement_tag'))
    suite.addTest(SearchTests('test_search_test_tag'))
    suite.addTest(SearchTests('test_search_tags'))
    suite.addTest(SearchTests('test_search_product_tags'))
    suite.addTest(SearchTests('test_search_engagement_tags'))
    suite.addTest(SearchTests('test_search_test_tags'))
    suite.addTest(SearchTests('test_search_id'))
    return suite
if __name__ == "__main__":
runner = unittest.TextTestRunner(descriptions=True, failfast=True, verbosity=2)
ret = not runner.run(suite()).wasSuccessful()
BaseTestCase.tearDownDriver()
sys.exit(ret)
| 43.786325 | 84 | 0.695881 | 719 | 5,123 | 4.699583 | 0.102921 | 0.10654 | 0.181119 | 0.202427 | 0.85913 | 0.829831 | 0.730986 | 0.730986 | 0.730986 | 0.730986 | 0 | 0.012601 | 0.178997 | 5,123 | 116 | 85 | 44.163793 | 0.790775 | 0.102869 | 0 | 0.406977 | 0 | 0 | 0.209424 | 0.031414 | 0 | 0 | 0 | 0 | 0 | 1 | 0.151163 | false | 0 | 0.046512 | 0 | 0.22093 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
3537defa75a6d268dce3867c43b441b2a31aa013 | 31,116 | py | Python | tests/unit_tests/test_michelson/test_micheline.py | m-kus/pytezos | dfb7e34a4ca24b5cf40541900c5f761c61571996 | [
"MIT"
] | null | null | null | tests/unit_tests/test_michelson/test_micheline.py | m-kus/pytezos | dfb7e34a4ca24b5cf40541900c5f761c61571996 | [
"MIT"
] | null | null | null | tests/unit_tests/test_michelson/test_micheline.py | m-kus/pytezos | dfb7e34a4ca24b5cf40541900c5f761c61571996 | [
"MIT"
] | null | null | null | from unittest import TestCase
from parameterized import parameterized
from pytezos.michelson.micheline import blind_unpack
from pytezos.michelson.types.base import MichelsonType
from pytezos.michelson.forge import forge_script_expr, forge_micheline, unforge_micheline
from pytezos.operation.forge import forge_operation_group
unknown_data = [
    '0501000000056f776e6572',
    '050a000000160000e8b36c80efb51ec85a14562426049aa182a3ce38',
    '050100000006706175736564',
    '050303',
    '05010000000866616c6c6261636b',
    '0502000000270316031607430368010000001655706172616d4e6f53756368456e747279506f696e7403420327',
    '0501000000086e65774f776e6572',
    '050306',
    '0501000000096f70657261746f7273',
    '050200000000',
    '050100000009746f6b656e436f6465',
    '050100000005545a425443',
    '050100000009746f6b656e4e616d65',
    '050100000005545a425443',
    '05010000000b746f74616c4275726e6564',
    '050000',
    '05010000000b746f74616c4d696e746564',
    '050000',
    '05010000000b746f74616c537570706c79',
    '050000',
    '05010000000d72656465656d41646472657373',
    '050a000000160000e8b36c80efb51ec85a14562426049aa182a3ce38',
    '0507070100000004636f6465010000000863616c6c4275726e',
'05020000054903210316051f02000000020317050d0362072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f02000000020317051f02000000c20321074303690a0000000f0501000000096f70657261746f72730329072f020000002507430368010000001a5553746f72653a206e6f206669656c64206f70657261746f727303270200000000050d0566036e072f020000002d0743036801000000225553746f72653a206661696c656420746f20756e7061636b206f70657261746f72730327020000000003480339072c02000000000200000026074307650368036c0707010000001353656e64657249734e6f744f70657261746f72030b0327051f02000000960321074303690a0000001305010000000d72656465656d416464726573730329072f020000002907430368010000001e5553746f72653a206e6f206669656c642072656465656d4164647265737303270200000000050d036e072f02000000310743036801000000265553746f72653a206661696c656420746f20756e7061636b2072656465656d4164647265737303270200000000034203210316051f020000000203170321051f02000002a8034c0342051f02000000020321034c051f02000000020321034c03160743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f02000000290317074303620000034c03420743036801000000104e6f74456e6f75676842616c616e636503420327020000000003210316071f000202000000020321057000020317034c034b0356072f020000002e0316051f02000000020321034c031703420743036801000000104e6f74456e6f75676842616c616e6365034203270200000000051f020000000d0321051f020000000203170316034c03200342051f02000000020321034c051f02000000700321031603300325072c020000002603210317034503300325072c020000000e0320053e076503620760036e03620200000002034602000000020346034c03160743036801000000066c65646765720342030c051f0200000014072f0200000004053e03690200000004030c034603500317033b051f02000000900321074303690a0000001105010000000b746f74616c537570706c790329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c537570706c790327020000000
0050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e7061636b20746f74616c537570706c790327020000000003120356072f020000002a07430368010000001f496e7465726e616c3a204e6567617469766520746f74616c20737570706c7903270200000000030c0346074303690a0000001105010000000b746f74616c537570706c790350051f02000000900321074303690a0000001105010000000b746f74616c4275726e65640329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c4275726e656403270200000000050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e7061636b20746f74616c4275726e6564032702000000000312030c0346074303690a0000001105010000000b746f74616c4275726e65640350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000000863616c6c4d696e74',
'05020000042d03210316051f02000000020317050d0765036e0362072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f02000000020317051f02000000c20321074303690a0000000f0501000000096f70657261746f72730329072f020000002507430368010000001a5553746f72653a206e6f206669656c64206f70657261746f727303270200000000050d0566036e072f020000002d0743036801000000225553746f72653a206661696c656420746f20756e7061636b206f70657261746f72730327020000000003480339072c02000000000200000026074307650368036c0707010000001353656e64657249734e6f744f70657261746f72030b03270321051f0200000232051f02000000020321034c051f02000000020321034c03160743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f02000000350321031703300325072c020000000c053e076503620760036e0362020000001503210317051f02000000060723036e0362034203460200000036051f02000000020321034c0317051f0200000004032103160312051f020000000d0321051f020000000203170316034c032003420346034c0321051f020000003203160743036801000000066c65646765720342030c051f0200000014072f0200000004053e03690200000004030c0346035003170330051f02000000900321074303690a0000001105010000000b746f74616c537570706c790329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c537570706c7903270200000000050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e7061636b20746f74616c537570706c790327020000000003120356072f020000002a07430368010000001f496e7465726e616c3a204e6567617469766520746f74616c20737570706c7903270200000000030c0346074303690a0000001105010000000b746f74616c537570706c7903500317051f02000000900321074303690a0000001105010000000b746f74616c4d696e7465640329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c4d696e74656403270200000000050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e70616
36b20746f74616c4d696e746564032702000000000312030c0346074303690a0000001105010000000b746f74616c4d696e7465640350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000000963616c6c5061757365',
'05020000014803210316051f02000000020317050d036c072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316032003170321074303690a0000000f0501000000096f70657261746f72730329072f020000002507430368010000001a5553746f72653a206e6f206669656c64206f70657261746f727303270200000000050d0566036e072f020000002d0743036801000000225553746f72653a206661696c656420746f20756e7061636b206f70657261746f72730327020000000003480339072c02000000000200000026074307650368036c0707010000001353656e64657249734e6f744f70657261746f72030b032707430359030a030c0346074303690a0000000c0501000000067061757365640350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000000b63616c6c417070726f7665',
'05020000037103210316051f02000000020317050d0765036e0362072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703480342051f02000000b40321074303690a0000000c0501000000067061757365640329072f02000000220743036801000000175553746f72653a206e6f206669656c642070617573656403270200000000050d0359072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b2070617573656403270200000000072c0200000027034f074303680100000018546f6b656e4f7065726174696f6e73417265506175736564034203270200000000051f02000000020321034c051f02000000020321034c0321051f020000008703160743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f02000000060723036e036202000000020317031703160329072f02000000060743036200000200000000032103300325072c020000000203200200000043051f02000000020321034c0317031703300325072c020000000203200200000022074303680100000015556e73616665416c6c6f77616e63654368616e676503420327051f02000000020321034c051f020000000403210316034c0743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f02000000140723036e036207430362000003420723036e0362020000000403210317071f0002020000000203210570000203170317032103300325072c02000000060320053e036202000000020346071f00030200000002032105700003031703160350051f020000000d0321051f020000000203160317034c0320034c0342034c03160743036801000000066c65646765720342030c051f0200000004030c03460350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000000b63616c6c556e7061757365',
'05020000013503210316051f02000000020317050d036c072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316032003170321074303690a0000000b0501000000056f776e65720329072f02000000210743036801000000165553746f72653a206e6f206669656c64206f776e657203270200000000050d036e072f020000002907430368010000001e5553746f72653a206661696c656420746f20756e7061636b206f776e657203270200000000034803190325072c0200000000020000001f034f07430368010000001053656e64657249734e6f744f776e657203420327074303590303030c0346074303690a0000000c0501000000067061757365640350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000000c63616c6c4765744f776e6572',
'05020000016703210316051f02000000020317050d0765036c036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f02000000350555036e072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c03420317074303690a0000000b0501000000056f776e65720329072f02000000210743036801000000165553746f72653a206e6f206669656c64206f776e657203270200000000050d036e072f020000002907430368010000001e5553746f72653a206661696c656420746f20756e7061636b206f776e657203270200000000051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000000c63616c6c5472616e73666572',
'0502000008d203210316051f02000000020317050d0765036e0765036e0362072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f02000000020317051f02000000b40321074303690a0000000c0501000000067061757365640329072f02000000220743036801000000175553746f72653a206e6f206669656c642070617573656403270200000000050d0359072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b2070617573656403270200000000072c0200000027034f074303680100000018546f6b656e4f7065726174696f6e734172655061757365640342032702000000000321032103170316051f0200000002031603190325072c02000000020320020000078203210316034803190325072c020000000002000002790321051f02000002700321051f02000000b5051f020000000203210316034803420321051f020000008703170743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f02000000060723036e03620200000002031703160329072f0200000006074303620000020000000003210316051f020000006b0348051f0200000060032103170317071f00020200000002032105700002034b0356072f020000003b051f02000000020321034c051f02000000020321034c0317031703420743036801000000124e6f74456e6f756768416c6c6f77616e636503420327020000000003420342051f020000000403200320051f02000000020321034c051f020000000403210316034c0743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f02000000140723036e036207430362000003420723036e0362020000000403210317071f0002020000000203210570000203170317032103300325072c02000000060320053e036202000000020346071f00030200000002032105700003031703160350051f020000000d0321051f020000000203160317034c0320034c0342034c03160743036801000000066c65646765720342030c051f0200000004030c03460350051f02000000020321034c051f02000000020321034c031
703160743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f020000003903210317031703300325072c020000000c053e076503620760036e03620200000017032103170317051f02000000060723036e0362034203460200000038051f02000000020321034c03170317051f0200000004032103160312051f020000000d0321051f020000000203170316034c032003420346034c0321051f0200000034031703160743036801000000066c65646765720342030c051f0200000014072f0200000004053e03690200000004030c034603500321051f02000000f7031703170330051f02000000900321074303690a0000001105010000000b746f74616c537570706c790329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c537570706c7903270200000000050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e7061636b20746f74616c537570706c790327020000000003120356072f020000002a07430368010000001f496e7465726e616c3a204e6567617469766520746f74616c20737570706c7903270200000000030c0346074303690a0000001105010000000b746f74616c537570706c790350051f02000000020321034c051f02000000020321034c03160743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f020000002b03170317074303620000034c03420743036801000000104e6f74456e6f75676842616c616e636503420327020000000003210316071f0002020000000203210570000203170317034c034b0356072f02000000300316051f02000000020321034c0317031703420743036801000000104e6f74456e6f75676842616c616e6365034203270200000000051f020000000d0321051f020000000203170316034c03200342051f02000000020321034c051f02000000700321031603300325072c020000002603210317034503300325072c020000000e0320053e076503620760036e03620200000002034602000000020346034c03160743036801000000066c65646765720342030c051f0200000014072f0200000004053e03690200000004030c034603500317031
7033b051f02000000900321074303690a0000001105010000000b746f74616c537570706c790329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c537570706c7903270200000000050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e7061636b20746f74616c537570706c790327020000000003120356072f020000002a07430368010000001f496e7465726e616c3a204e6567617469766520746f74616c20737570706c7903270200000000030c0346074303690a0000001105010000000b746f74616c537570706c790350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000000e63616c6c47657442616c616e6365',
'05020000017b03210316051f02000000020317050d0765036e036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f020000003505550362072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c034203210316051f020000000203170743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f020000000607430362000002000000020316051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000000f63616c6c4164644f70657261746f72',
'0502000001d903210316051f02000000020317050d036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f02000000020317051f02000000af0321074303690a0000000b0501000000056f776e65720329072f02000000210743036801000000165553746f72653a206e6f206669656c64206f776e657203270200000000050d036e072f020000002907430368010000001e5553746f72653a206661696c656420746f20756e7061636b206f776e657203270200000000034803190325072c0200000000020000001f034f07430368010000001053656e64657249734e6f744f776e657203420327051f02000000920321074303690a0000000f0501000000096f70657261746f72730329072f020000002507430368010000001a5553746f72653a206e6f206669656c64206f70657261746f727303270200000000050d0566036e072f020000002d0743036801000000225553746f72653a206661696c656420746f20756e7061636b206f70657261746f72730327020000000007430359030a0350030c0346074303690a0000000f0501000000096f70657261746f72730350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000001063616c6c476574416c6c6f77616e6365',
'0502000001a003210316051f02000000020317050d07650765036e036e036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f020000003505550362072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c034203210316051f020000000203170321051f020000008703160743036801000000066c65646765720342030c0329072f020000000c053e076503620760036e03620200000044050d076503620760036e0362072f020000002a07430368010000001f5553746f72653a206661696c656420746f20756e7061636b206c6564676572032702000000000346072f02000000060723036e03620200000002031703170329072f02000000060743036200000200000000051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000001063616c6c476574546f6b656e436f6465',
'05020000017303210316051f02000000020317050d0765036c036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f020000003505550368072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c03420317074303690a0000000f050100000009746f6b656e436f64650329072f020000002507430368010000001a5553746f72653a206e6f206669656c6420746f6b656e436f646503270200000000050d0368072f020000002d0743036801000000225553746f72653a206661696c656420746f20756e7061636b20746f6b656e436f646503270200000000051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000001063616c6c476574546f6b656e4e616d65',
'05020000017303210316051f02000000020317050d0765036c036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f020000003505550368072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c03420317074303690a0000000f050100000009746f6b656e4e616d650329072f020000002507430368010000001a5553746f72653a206e6f206669656c6420746f6b656e4e616d6503270200000000050d0368072f020000002d0743036801000000225553746f72653a206661696c656420746f20756e7061636b20746f6b656e4e616d6503270200000000051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000001263616c6c476574546f74616c4275726e6564',
'05020000017903210316051f02000000020317050d0765036c036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f020000003505550362072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c03420317074303690a0000001105010000000b746f74616c4275726e65640329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c4275726e656403270200000000050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e7061636b20746f74616c4275726e656403270200000000051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000001263616c6c476574546f74616c4d696e746564',
'05020000017903210316051f02000000020317050d0765036c036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f020000003505550362072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c03420317074303690a0000001105010000000b746f74616c4d696e7465640329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c4d696e74656403270200000000050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e7061636b20746f74616c4d696e74656403270200000000051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000001263616c6c476574546f74616c537570706c79',
'05020000017903210316051f02000000020317050d0765036c036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f020000003505550362072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c03420317074303690a0000001105010000000b746f74616c537570706c790329072f020000002707430368010000001c5553746f72653a206e6f206669656c6420746f74616c537570706c7903270200000000050d0362072f020000002f0743036801000000245553746f72653a206661696c656420746f20756e7061636b20746f74616c537570706c7903270200000000051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000001263616c6c52656d6f76654f70657261746f72',
'0502000001d903210316051f02000000020317050d036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f02000000020317051f02000000af0321074303690a0000000b0501000000056f776e65720329072f02000000210743036801000000165553746f72653a206e6f206669656c64206f776e657203270200000000050d036e072f020000002907430368010000001e5553746f72653a206661696c656420746f20756e7061636b206f776e657203270200000000034803190325072c0200000000020000001f034f07430368010000001053656e64657249734e6f744f776e657203420327051f02000000920321074303690a0000000f0501000000096f70657261746f72730329072f020000002507430368010000001a5553746f72653a206e6f206669656c64206f70657261746f727303270200000000050d0566036e072f020000002d0743036801000000225553746f72653a206661696c656420746f20756e7061636b206f70657261746f7273032702000000000743035903030350030c0346074303690a0000000f0501000000096f70657261746f72730350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000001363616c6c4163636570744f776e657273686970',
'05020000025003210316051f02000000020317050d036c072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316032003170321074303690a0000000e0501000000086e65774f776e65720329072f02000000240743036801000000195553746f72653a206e6f206669656c64206e65774f776e657203270200000000050d0563036e072f020000002c0743036801000000215553746f72653a206661696c656420746f20756e7061636b206e65774f776e657203270200000000072f0200000029034f07430368010000001a4e6f74496e5472616e736665724f776e6572736869704d6f6465034203270200000034034803190325072c02000000000200000022034f07430368010000001353656e64657249734e6f744e65774f776e6572034203270321074303690a0000000e0501000000086e65774f776e65720329072f02000000240743036801000000195553746f72653a206e6f206669656c64206e65774f776e657203270200000000050d0563036e072f020000002c0743036801000000215553746f72653a206661696c656420746f20756e7061636b206e65774f776e657203270200000000072f0200000029034f07430368010000001a4e6f74496e5472616e736665724f776e6572736869704d6f646503420327020000003b030c0346074303690a0000000b0501000000056f776e65720350053e036e030c0346074303690a0000000e0501000000086e65774f776e65720350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000001463616c6c47657452656465656d41646472657373',
'05020000017f03210316051f02000000020317050d0765036c036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f0200000002031703210316051f02000000020317051f02000000350555036e072f0200000025034f074303680100000016556e6578706563746564436f6e747261637454797065034203270200000000034203210316051f02000000020317051f020000000b051f02000000020321034c03420317074303690a0000001305010000000d72656465656d416464726573730329072f020000002907430368010000001e5553746f72653a206e6f206669656c642072656465656d4164647265737303270200000000050d036e072f02000000310743036801000000265553746f72653a206661696c656420746f20756e7061636b2072656465656d4164647265737303270200000000051f02000000020313034d053d036d034c031b034203210316051f020000000203170342',
'0507070100000004636f6465010000001463616c6c53657452656465656d41646472657373',
'05020000014203210316051f02000000020317050d036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f02000000020317051f02000000af0321074303690a0000000b0501000000056f776e65720329072f02000000210743036801000000165553746f72653a206e6f206669656c64206f776e657203270200000000050d036e072f020000002907430368010000001e5553746f72653a206661696c656420746f20756e7061636b206f776e657203270200000000034803190325072c0200000000020000001f034f07430368010000001053656e64657249734e6f744f776e657203420327030c0346074303690a0000001305010000000d72656465656d416464726573730350053d036d034203210316051f020000000203170342',
'0507070100000004636f6465010000001563616c6c5472616e736665724f776e657273686970',
'05020000013f03210316051f02000000020317050d036e072f0200000029034f07430368010000001a55706172616d417267756d656e74556e7061636b4661696c6564034203270200000000034203210316051f02000000020317051f02000000af0321074303690a0000000b0501000000056f776e65720329072f02000000210743036801000000165553746f72653a206e6f206669656c64206f776e657203270200000000050d036e072f020000002907430368010000001e5553746f72653a206661696c656420746f20756e7061636b206f776e657203270200000000034803190325072c0200000000020000001f034f07430368010000001053656e64657249734e6f744f776e6572034203270346030c0346074303690a0000000e0501000000086e65774f776e65720350053d036d034203210316051f020000000203170342']
class TestPacking(TestCase):
@parameterized.expand([
({"bytes": "000018896fcfc6690baefa9aedc6d759f9bf05727e8c"},
{"prim": "address"},
"expru2YV8AanTTUSV4K21P7X4DzbuWQFVk7NewDuP1A5uamffiiFA3"),
({"string": "tz1MsmYzmqxHs9trE1qQugZxxcLPqAXdQaX9"},
{"prim": "address"},
"expru2YV8AanTTUSV4K21P7X4DzbuWQFVk7NewDuP1A5uamffiiFA3"),
({"string": "Game one!"},
{"prim": "string"},
"exprtiRSZkLKYRess9GZ3ryb4cVQD36WLo2oysZBFxKTZ2jXqcHWGj"),
({"int": "505506"},
{"prim": "int"},
"exprufzwVGdAX7zG91UpiAkR2yVxEDE75tHD5YgSBmYMUx22teZTCM"),
([{"int": "1"}, {"int": "1"}, {"int": "1"}, {"int": "1"}],
{"prim": "pair", "args": [{"prim": "int"}, {"prim": "int"}, {"prim": "int"}, {"prim": "int"}]},
"expruN32WETsB2Dx1AynDmMufVr1As9qdnjRxKQ82rk2qZ4uxuKVMK")
])
def test_get_key_hash(self, val_expr, type_expr, expected):
ty = MichelsonType.match(type_expr)
key = ty.from_micheline_value(val_expr).pack(legacy=True)
self.assertEqual(expected, forge_script_expr(key))
@parameterized.expand([(x,) for x in unknown_data])
def test_blind_unpack(self, data):
data = bytes.fromhex(data)
res = blind_unpack(data)
self.assertNotEqual(data, res)
def test_regr_local_remote_diff(self):
opg = {'branch': 'BKpLvH3E3bUa5Z2nb3RkH2p6EKLfymvxUAEgtRJnu4m9UX1TWUb',
'contents': [{'amount': '0',
'counter': '446245',
'destination': 'KT1VYUxhLoSvouozCaDGL1XcswnagNfwr3yi',
'fee': '104274',
'gas_limit': '1040000',
'kind': 'transaction',
'parameters': {'entrypoint': 'default',
'value': {'prim': 'Unit'}},
'source': 'tz1grSQDByRpnVs7sPtaprNZRp531ZKz6Jmm',
'storage_limit': '60000'}],
'protocol': 'PsCARTHAGazKbHtnKfLzQg3kms52kSRpgnDY982a9oYsSXRLQEb',
'signature': None}
local = forge_operation_group(opg).hex()
remote = "0dc397b7865779d87bd47d406e8b4eee84498f22ab01dff124433c7f057af5ae6c00e8b36c80efb51ec85a1456" \
"2426049aa182a3ce38d2ae06a59e1b80bd3fe0d4030001e5ebf2dcc7dcc9d13c2c45cd76823dd604740c7f0000"
self.assertEqual(remote, local)
def test_forge_combs(self):
expr = {'prim': 'Pair', 'args': [{'int': '1'}, {'int': '2'}, {'int': '3'}, {'int': '4'}]}
self.assertEqual(expr, unforge_micheline(forge_micheline(expr)))
def test_prim_sequence_three_args(self):
packed = "0502000000f003200743036e0a00000016010cd84cb6f78f1e146e5e86b3648327edfd45618e0007430368010000000c63616c6c6261636b2d343034037706550765096500000031046e0000000625766f746572045d0000000a2563616e64696461746504590000000f25657865637574655f766f74696e670000000c25766f74655f706172616d73046e00000007256275636b657400000010256c61756e63685f63616c6c6261636b072f020000000203270200000004034c03200743036a00000521000307430359030a0743035d0a00000015002523250b271e153be6c2668954114be101d04d3d05700005054200030342034d"
data = bytes.fromhex(packed)
result = unforge_micheline(data[1:])
expected_result = [{'prim': 'DROP'}, {'prim': 'PUSH', 'args': [{'prim': 'address'}, {'bytes': '010cd84cb6f78f1e146e5e86b3648327edfd45618e00'}]}, {'prim': 'PUSH', 'args': [{'prim': 'string'}, {'string': 'callback-404'}]}, {'prim': 'SELF_ADDRESS'}, {'prim': 'CONTRACT', 'args': [{'prim': 'pair', 'args': [{'prim': 'pair', 'args': [{'prim': 'address', 'annots': ['%voter']}, {'prim': 'key_hash', 'annots': ['%candidate']}, {'prim': 'bool', 'annots': ['%execute_voting']}], 'annots': ['%vote_params']}, {'prim': 'address', 'annots': ['%bucket']}]}], 'annots': ['%launch_callback']}, {'prim': 'IF_NONE', 'args': [[{'prim': 'FAILWITH'}], [{'prim': 'SWAP'}, {'prim': 'DROP'}]]}, {'prim': 'PUSH', 'args': [{'prim': 'mutez'}, {'int': '0'}]}, {'prim': 'DUP', 'args': [{'int': '3'}]}, {'prim': 'PUSH', 'args': [{'prim': 'bool'}, {'prim': 'True'}]}, {'prim': 'PUSH', 'args': [{'prim': 'key_hash'}, {'bytes': '002523250b271e153be6c2668954114be101d04d3d'}]}, {'prim': 'DIG', 'args': [{'int': '5'}]}, {'prim': 'PAIR', 'args': [{'int': '3'}]}, {'prim': 'PAIR'}, {'prim': 'TRANSFER_TOKENS'}]
self.assertListEqual(result, expected_result)
self.assertListEqual(expected_result, unforge_micheline(forge_micheline(expected_result))) | 225.478261 | 4,535 | 0.926919 | 428 | 31,116 | 67.261682 | 0.450935 | 0.002779 | 0.002084 | 0.002779 | 0.004307 | 0.003196 | 0 | 0 | 0 | 0 | 0 | 0.816791 | 0.039947 | 31,116 | 138 | 4,536 | 225.478261 | 0.146888 | 0 | 0 | 0.08871 | 0 | 0 | 0.89321 | 0.866215 | 0 | 1 | 0 | 0 | 0.048387 | 1 | 0.040323 | false | 0 | 0.048387 | 0 | 0.096774 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
103ce768ea1a39547f491c702fcbb3fc50cc85c0 | 3,064 | py | Python | python_modules/dagster/dagster_tests/utils_tests/test_typing_api.py | JPeer264/dagster-fork | 32cc87a36134be7c442fa85d6867eb1d3301aea0 | [
"Apache-2.0"
] | 3 | 2020-09-09T04:10:23.000Z | 2021-11-08T02:10:42.000Z | python_modules/dagster/dagster_tests/utils_tests/test_typing_api.py | JPeer264/dagster-fork | 32cc87a36134be7c442fa85d6867eb1d3301aea0 | [
"Apache-2.0"
] | 2 | 2021-05-11T13:36:27.000Z | 2021-09-03T01:53:11.000Z | python_modules/dagster/dagster_tests/utils_tests/test_typing_api.py | JPeer264/dagster-fork | 32cc87a36134be7c442fa85d6867eb1d3301aea0 | [
"Apache-2.0"
] | 1 | 2021-02-21T12:16:47.000Z | 2021-02-21T12:16:47.000Z | import typing
from dagster.utils.typing_api import (
get_optional_inner_type,
is_closed_python_dict_type,
is_closed_python_list_type,
is_closed_python_optional_type,
is_closed_python_set_type,
is_closed_python_tuple_type,
)
def test_closed_python_dict():
assert is_closed_python_dict_type(typing.Dict[str, int]) is True
assert is_closed_python_dict_type(dict) is False
assert is_closed_python_dict_type(typing.Dict) is False
assert is_closed_python_dict_type(None) is False
assert is_closed_python_dict_type(1) is False
assert is_closed_python_dict_type('foobar') is False
assert is_closed_python_dict_type(typing.Optional) is False
assert is_closed_python_dict_type(typing.List) is False
def test_is_typing_optional_py_3():
assert is_closed_python_optional_type(typing.Optional[int])
assert not is_closed_python_optional_type(typing.Optional)
assert not is_closed_python_optional_type(None)
assert not is_closed_python_optional_type(int)
assert not is_closed_python_optional_type(list)
assert not is_closed_python_optional_type('foobar')
def test_get_inner_optional_py_3():
assert get_optional_inner_type(typing.Optional[int]) is int
def test_closed_tuple_type():
assert is_closed_python_tuple_type(typing.Tuple[int, str]) is True
assert is_closed_python_tuple_type(tuple) is False
assert is_closed_python_tuple_type(typing.Tuple) is False
assert is_closed_python_tuple_type(1) is False
assert is_closed_python_tuple_type('foobar') is False
assert is_closed_python_tuple_type(typing.Optional) is False
assert is_closed_python_tuple_type(typing.List) is False
def test_closed_set_type():
assert is_closed_python_set_type(typing.Set[int]) is True
assert is_closed_python_set_type(set) is False
assert is_closed_python_set_type(typing.Set) is False
assert is_closed_python_set_type(1) is False
assert is_closed_python_set_type('foobar') is False
assert is_closed_python_set_type(typing.Optional) is False
assert is_closed_python_set_type(typing.List) is False
assert is_closed_python_set_type(typing.Dict) is False
assert is_closed_python_set_type(typing.Dict[int, str]) is False
assert is_closed_python_set_type(typing.Tuple) is False
assert is_closed_python_set_type(typing.Tuple[int, str]) is False
def test_closed_list_type():
assert is_closed_python_list_type(typing.List[int]) is True
assert is_closed_python_list_type(typing.List) is False
assert is_closed_python_list_type(list) is False
assert is_closed_python_list_type(None) is False
assert is_closed_python_list_type(1) is False
assert is_closed_python_list_type('foobar') is False
assert is_closed_python_list_type(typing.Optional) is False
assert is_closed_python_list_type(typing.Dict) is False
assert is_closed_python_list_type(typing.Dict[int, str]) is False
assert is_closed_python_list_type(typing.Tuple) is False
assert is_closed_python_list_type(typing.Tuple[int, str]) is False
| 40.315789 | 70 | 0.806136 | 501 | 3,064 | 4.489022 | 0.05988 | 0.26145 | 0.298799 | 0.337928 | 0.899066 | 0.834593 | 0.809693 | 0.681636 | 0.460205 | 0.165407 | 0 | 0.002266 | 0.13577 | 3,064 | 75 | 71 | 40.853333 | 0.847054 | 0 | 0 | 0 | 0 | 0 | 0.009791 | 0 | 0 | 0 | 0 | 0 | 0.745763 | 1 | 0.101695 | true | 0 | 0.033898 | 0 | 0.135593 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
106ca0d11cdea860c706101d543a3f23c7263f7b | 6,370 | py | Python | graphtheory/traversing/bfs.py | gitter-badger/graphs-dict | 2be1a5b140feb050eec799d6cadf6de5eef01745 | [
"BSD-3-Clause"
] | 36 | 2015-09-20T20:55:39.000Z | 2021-09-20T05:49:03.000Z | graphtheory/traversing/bfs.py | gitter-badger/graphs-dict | 2be1a5b140feb050eec799d6cadf6de5eef01745 | [
"BSD-3-Clause"
] | 6 | 2016-03-25T21:41:46.000Z | 2020-02-12T03:18:59.000Z | graphtheory/traversing/bfs.py | gitter-badger/graphs-dict | 2be1a5b140feb050eec799d6cadf6de5eef01745 | [
"BSD-3-Clause"
] | 9 | 2016-09-12T07:57:27.000Z | 2022-03-21T16:15:39.000Z | #!/usr/bin/python
try:
from Queue import Queue
except ImportError: # Python 3
from queue import Queue
class BFSWithQueue:
"""Breadth-First Search.
Attributes
----------
graph : input graph
color : dict with nodes, private
distance : dict with nodes (distances to source node)
parent : dict (BFS tree)
dag : graph (BFS tree)
Examples
--------
>>> from graphtheory.structures.edges import Edge
>>> from graphtheory.structures.graphs import Graph
>>> from graphtheory.traversing.bfs import BFSWithQueue
>>> G = Graph(n=10, directed=False) # an exemplary undirected graph
# Add nodes and edges here.
>>> order = list()
>>> algorithm = BFSWithQueue(G)
>>> algorithm.run(source=0, pre_action=lambda node: order.append(node))
>>> order # visited nodes
>>> algorithm.distance[target] # distance from source to target
>>> algorithm.parent # BFS tree as a dict
>>> algorithm.dag # BFS tree as a directed graph
>>> algorithm.path(source, target)
Notes
-----
Based on:
Cormen, T. H., Leiserson, C. E., Rivest, R. L., and Stein, C., 2009,
Introduction to Algorithms, third edition, The MIT Press,
Cambridge, London.
https://en.wikipedia.org/wiki/Breadth-first_search
"""
def __init__(self, graph):
"""The algorithm initialization."""
self.graph = graph
self.color = dict(((node, "WHITE") for node in self.graph.iternodes()))
self.distance = dict(((node, float("inf")) for node in self.graph.iternodes()))
self.parent = dict(((node, None) for node in self.graph.iternodes()))
self.dag = self.graph.__class__(self.graph.v(), directed=True)
for node in self.graph.iternodes(): # isolated nodes are possible
self.dag.add_node(node)
def run(self, source=None, pre_action=None, post_action=None):
"""Executable pseudocode."""
if source is not None:
self._visit(source, pre_action, post_action)
else:
for node in self.graph.iternodes():
if self.color[node] == "WHITE":
self._visit(node, pre_action, post_action)
def _visit(self, node, pre_action=None, post_action=None):
"""Explore the connected component."""
self.color[node] = "GREY"
self.distance[node] = 0
self.parent[node] = None
Q = Queue()
Q.put(node) # node is GREY
if pre_action: # when Q.put
pre_action(node)
while not Q.empty():
source = Q.get()
for edge in self.graph.iteroutedges(source):
if self.color[edge.target] == "WHITE":
self.color[edge.target] = "GREY"
self.distance[edge.target] = self.distance[source] + 1
self.parent[edge.target] = source
self.dag.add_edge(edge)
Q.put(edge.target) # target is GREY
if pre_action: # when Q.put
pre_action(edge.target)
self.color[source] = "BLACK"
if post_action: # source became BLACK
post_action(source)
def path(self, source, target):
"""Construct a path from source to target."""
if source == target:
return [source]
elif self.parent[target] is None:
raise ValueError("no path to target")
else:
return self.path(source, self.parent[target]) + [target]
class SimpleBFS:
"""Breadth-First Search.
Attributes
----------
graph : input graph
parent : dict (BFS tree)
dag : graph (BFS tree)
Examples
--------
>>> from graphtheory.structures.edges import Edge
>>> from graphtheory.structures.graphs import Graph
>>> from graphtheory.traversing.bfs import SimpleBFS
>>> G = Graph(n=10, directed=False) # an exemplary undirected graph
# Add nodes and edges here.
>>> order = list()
>>> algorithm = SimpleBFS(G)
>>> algorithm.run(source=0, pre_action=lambda node: order.append(node))
>>> order # visited nodes
>>> algorithm.parent # BFS tree as a dict
>>> algorithm.dag # BFS tree as a directed graph
>>> algorithm.path(source, target)
Notes
-----
Based on:
Cormen, T. H., Leiserson, C. E., Rivest, R. L., and Stein, C., 2009,
Introduction to Algorithms, third edition, The MIT Press,
Cambridge, London.
https://en.wikipedia.org/wiki/Breadth-first_search
"""
def __init__(self, graph):
"""The algorithm initialization."""
self.graph = graph
self.parent = dict()
self.dag = self.graph.__class__(self.graph.v(), directed=True)
for node in self.graph.iternodes(): # isolated nodes are possible
self.dag.add_node(node)
def run(self, source=None, pre_action=None, post_action=None):
"""Executable pseudocode."""
if source is not None:
self._visit(source, pre_action, post_action)
else:
for node in self.graph.iternodes():
if node not in self.parent:
self._visit(node, pre_action, post_action)
def _visit(self, node, pre_action=None, post_action=None):
"""Explore the connected component."""
Q = Queue()
self.parent[node] = None # before Q.put
Q.put(node)
if pre_action: # when Q.put
pre_action(node)
while not Q.empty():
source = Q.get()
for edge in self.graph.iteroutedges(source):
if edge.target not in self.parent:
self.parent[edge.target] = source # before Q.put
self.dag.add_edge(edge)
Q.put(edge.target)
if pre_action: # when Q.put
pre_action(edge.target)
if post_action:
post_action(source)
def path(self, source, target):
"""Construct a path from source to target."""
if source == target:
return [source]
elif self.parent[target] is None:
raise ValueError("no path to target")
else:
return self.path(source, self.parent[target]) + [target]
# EOF
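The `BFSWithQueue`/`SimpleBFS` classes above depend on the graphtheory package's `Graph`/`Edge` structures, which are not included here. The core technique they share — a FIFO queue, a parent map recorded *before* each enqueue, and path reconstruction by walking parent pointers — can be sketched self-contained with only the standard library (the adjacency-dict representation below is an assumption for illustration, not the package's API):

```python
from collections import deque


def bfs_parents(adj, source):
    """BFS over an adjacency dict {node: [neighbors]}.
    Returns the parent map (the BFS tree), as in SimpleBFS."""
    parent = {source: None}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nbr in adj.get(node, ()):
            if nbr not in parent:   # mark before enqueueing, so each
                parent[nbr] = node  # node enters the queue at most once
                queue.append(nbr)
    return parent


def path(parent, source, target):
    """Walk parent pointers from target back to source."""
    if target not in parent:
        raise ValueError("no path to target")
    out = [target]
    while out[-1] != source:
        out.append(parent[out[-1]])
    return out[::-1]
```

Recording the parent at enqueue time (rather than at dequeue time) is what guarantees linear-time behavior; it is the same invariant the `# before Q.put` comments in `SimpleBFS._visit` point at.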
# Source: tests/test_goals.py from iafisher/khaganate-snapshot
# (MIT), commit 796b2c0c8053a2ef0acb852efaea8a06baad38c4
import unittest
from datetime import date
from base.goals import get_start_of_quarter
class GoalsTests(unittest.TestCase):
def test_get_start_of_quarter(self):
self.assertEqual(get_start_of_quarter(date(2022, 1, 1)), date(2022, 1, 1))
self.assertEqual(get_start_of_quarter(date(2022, 2, 1)), date(2022, 1, 1))
self.assertEqual(get_start_of_quarter(date(2022, 3, 1)), date(2022, 1, 1))
self.assertEqual(get_start_of_quarter(date(2022, 4, 1)), date(2022, 4, 1))
self.assertEqual(get_start_of_quarter(date(2022, 5, 1)), date(2022, 4, 1))
self.assertEqual(get_start_of_quarter(date(2022, 6, 1)), date(2022, 4, 1))
self.assertEqual(get_start_of_quarter(date(2022, 7, 1)), date(2022, 7, 1))
self.assertEqual(get_start_of_quarter(date(2022, 8, 1)), date(2022, 7, 1))
self.assertEqual(get_start_of_quarter(date(2022, 9, 1)), date(2022, 7, 1))
self.assertEqual(get_start_of_quarter(date(2022, 10, 1)), date(2022, 10, 1))
self.assertEqual(get_start_of_quarter(date(2022, 11, 1)), date(2022, 10, 1))
self.assertEqual(get_start_of_quarter(date(2022, 12, 1)), date(2022, 10, 1))
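The assertions above fully determine `get_start_of_quarter` on month boundaries: months 1-3 map to January 1, months 4-6 to April 1, and so on. The helper lives in `base.goals`, which is not shown here; a minimal implementation consistent with these tests (an assumption, not the module's actual code) would be:

```python
from datetime import date


def get_start_of_quarter(d):
    # Quarters start in months 1, 4, 7, 10; map any month to the
    # first day of the first month of its quarter.
    quarter_start_month = 3 * ((d.month - 1) // 3) + 1
    return date(d.year, quarter_start_month, 1)
```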
# Source: usaspending_api/agency/tests/integration/test_agency_budget_function.py
# from ststuck/usaspending-api (CC0-1.0), commit b13bd5bcba0369ff8512f61a34745626c3969391
import pytest
from rest_framework import status
from usaspending_api.common.helpers.fiscal_year_helpers import current_fiscal_year
url = "/api/v2/agency/{code}/budget_function/{query_params}"
@pytest.mark.django_db
def test_budget_function_list_success(client, monkeypatch, agency_account_data, helpers):
helpers.mock_current_fiscal_year(monkeypatch)
resp = client.get(url.format(code="007", query_params=""))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 3,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 11100000.0,
"name": "NAME 1",
"obligated_amount": 111.0,
"children": [{"gross_outlay_amount": 11100000.0, "name": "NAME 1A", "obligated_amount": 111.0}],
},
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
},
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
query_params = "?fiscal_year=2017"
resp = client.get(url.format(code="008", query_params=query_params))
expected_result = {
"fiscal_year": 2017,
"toptier_code": "008",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 1,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 10000.0,
"name": "NAME 2",
"obligated_amount": 1000.0,
"children": [{"gross_outlay_amount": 10000.0, "name": "NAME 2A", "obligated_amount": 1000.0}],
}
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
# this agency has a record but the amounts are both 0, so we expect this to return no results
query_params = "?fiscal_year=2016"
resp = client.get(url.format(code="010", query_params=query_params))
expected_result = {
"fiscal_year": 2016,
"toptier_code": "010",
"messages": [
"Account data powering this endpoint were first collected in "
"FY2017 Q2 under the DATA Act; as such, there are no data "
"available for prior fiscal years."
],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 0,
"limit": 10,
},
"results": [],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
@pytest.mark.django_db
def test_budget_function_list_too_early(client, agency_account_data):
query_params = "?fiscal_year=2007"
resp = client.get(url.format(code="007", query_params=query_params))
assert resp.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY
@pytest.mark.django_db
def test_budget_function_list_future(client, agency_account_data):
query_params = "?fiscal_year=" + str(current_fiscal_year() + 1)
resp = client.get(url.format(code="007", query_params=query_params))
assert resp.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY
@pytest.mark.django_db
def test_budget_function_list_bad_sort(client, agency_account_data):
query_params = "?sort=not valid"
resp = client.get(url.format(code="007", query_params=query_params))
assert resp.status_code == status.HTTP_400_BAD_REQUEST
@pytest.mark.django_db
def test_budget_function_list_bad_order(client, agency_account_data):
query_params = "?order=not valid"
resp = client.get(url.format(code="007", query_params=query_params))
assert resp.status_code == status.HTTP_400_BAD_REQUEST
@pytest.mark.django_db
def test_budget_function_list_sort_by_name(client, monkeypatch, agency_account_data, helpers):
helpers.mock_current_fiscal_year(monkeypatch)
query_params = f"?fiscal_year={helpers.get_mocked_current_fiscal_year()}&order=asc&sort=name"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 3,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 11100000.0,
"name": "NAME 1",
"obligated_amount": 111.0,
"children": [{"gross_outlay_amount": 11100000.0, "name": "NAME 1A", "obligated_amount": 111.0}],
},
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
},
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
query_params = f"?fiscal_year={helpers.get_mocked_current_fiscal_year()}&order=desc&sort=name"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 3,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
},
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
},
{
"gross_outlay_amount": 11100000.0,
"name": "NAME 1",
"obligated_amount": 111.0,
"children": [{"gross_outlay_amount": 11100000.0, "name": "NAME 1A", "obligated_amount": 111.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
@pytest.mark.django_db
def test_budget_function_list_sort_by_obligated_amount(client, monkeypatch, agency_account_data, helpers):
helpers.mock_current_fiscal_year(monkeypatch)
query_params = f"?fiscal_year={helpers.get_mocked_current_fiscal_year()}&order=asc&sort=obligated_amount"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 3,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
},
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
},
{
"gross_outlay_amount": 11100000.0,
"name": "NAME 1",
"obligated_amount": 111.0,
"children": [{"gross_outlay_amount": 11100000.0, "name": "NAME 1A", "obligated_amount": 111.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
query_params = f"?fiscal_year={helpers.get_mocked_current_fiscal_year()}&order=desc&sort=obligated_amount"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 3,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 11100000.0,
"name": "NAME 1",
"obligated_amount": 111.0,
"children": [{"gross_outlay_amount": 11100000.0, "name": "NAME 1A", "obligated_amount": 111.0}],
},
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
},
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
@pytest.mark.django_db
def test_budget_function_list_sort_by_gross_outlay_amount(client, monkeypatch, agency_account_data, helpers):
helpers.mock_current_fiscal_year(monkeypatch)
query_params = f"?fiscal_year={helpers.get_mocked_current_fiscal_year()}&order=asc&sort=gross_outlay_amount"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 3,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
},
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
},
{
"gross_outlay_amount": 11100000.0,
"name": "NAME 1",
"obligated_amount": 111.0,
"children": [{"gross_outlay_amount": 11100000.0, "name": "NAME 1A", "obligated_amount": 111.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
query_params = f"?fiscal_year={helpers.get_mocked_current_fiscal_year()}&order=desc&sort=gross_outlay_amount"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 3,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 11100000.0,
"name": "NAME 1",
"obligated_amount": 111.0,
"children": [{"gross_outlay_amount": 11100000.0, "name": "NAME 1A", "obligated_amount": 111.0}],
},
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
},
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
@pytest.mark.django_db
def test_budget_function_list_search(client, monkeypatch, agency_account_data, helpers):
helpers.mock_current_fiscal_year(monkeypatch)
query_params = f"?fiscal_year={helpers.get_mocked_current_fiscal_year()}&filter=NAME 6"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 1,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
}
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
query_params = f"?fiscal_year={helpers.get_mocked_current_fiscal_year()}&filter=AME 5"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": helpers.get_mocked_current_fiscal_year(),
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": False,
"next": None,
"page": 1,
"previous": None,
"total": 1,
"limit": 10,
},
"results": [
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
}
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
@pytest.mark.django_db
def test_budget_function_list_pagination(client, agency_account_data):
query_params = "?fiscal_year=2020&limit=2&page=1"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": 2020,
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": True,
"hasPrevious": False,
"next": 2,
"page": 1,
"previous": None,
"total": 3,
"limit": 2,
},
"results": [
{
"gross_outlay_amount": 11100000.0,
"name": "NAME 1",
"obligated_amount": 111.0,
"children": [{"gross_outlay_amount": 11100000.0, "name": "NAME 1A", "obligated_amount": 111.0}],
},
{
"gross_outlay_amount": 100000.0,
"name": "NAME 6",
"obligated_amount": 100.0,
"children": [{"gross_outlay_amount": 100000.0, "name": "NAME 6A", "obligated_amount": 100.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
query_params = "?fiscal_year=2020&limit=2&page=2"
resp = client.get(url.format(code="007", query_params=query_params))
expected_result = {
"fiscal_year": 2020,
"toptier_code": "007",
"messages": [],
"page_metadata": {
"hasNext": False,
"hasPrevious": True,
"next": None,
"page": 2,
"previous": 1,
"total": 3,
"limit": 2,
},
"results": [
{
"gross_outlay_amount": 1000000.0,
"name": "NAME 5",
"obligated_amount": 10.0,
"children": [{"gross_outlay_amount": 1000000.0, "name": "NAME 5A", "obligated_amount": 10.0}],
},
],
}
assert resp.status_code == status.HTTP_200_OK
assert resp.json() == expected_result
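The `page_metadata` blocks in the expected results above all follow one fixed pattern: given a total count, a page size, and a page number, the endpoint reports `hasNext`/`hasPrevious` and the neighboring page numbers. A small helper reproducing that shape (an illustration for auditing the expected values, not the USAspending implementation) could look like:

```python
def page_metadata(total, limit, page):
    """Pagination metadata in the shape asserted by the tests above."""
    has_next = page * limit < total      # more items beyond this page?
    has_previous = page > 1
    return {
        "hasNext": has_next,
        "hasPrevious": has_previous,
        "next": page + 1 if has_next else None,
        "page": page,
        "previous": page - 1 if has_previous else None,
        "total": total,
        "limit": limit,
    }
```

For example, `page_metadata(total=3, limit=2, page=1)` matches the metadata asserted in `test_budget_function_list_pagination` for the first page.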
# Source: summarize/nn/beam_search/__init__.py from danieldeutsch/summarize
# (Apache-2.0), commit f36a86d58f381ff1f607f356dad3d6ef7b0e0224
from summarize.nn.beam_search.beam_search import BeamSearch
from summarize.nn.beam_search.relaxed import RelaxedBeamSearch
# Source: tests/test_constraints.py from cyrilbois/PFNET.py
# (BSD-2-Clause), commit 81d2fd911c6e6aae4c5de0d1739c6f5361799ce2
#***************************************************#
# This file is part of PFNET. #
# #
# Copyright (c) 2015, Tomas Tinoco De Rubira. #
# #
# PFNET is released under the BSD 2-clause license. #
#***************************************************#
import os
import unittest
import pfnet as pf
import numpy as np
from . import test_cases
from numpy.linalg import norm
from scipy.sparse import coo_matrix,triu,tril,eye
NUM_TRIALS = 25
EPS = 5.0 # %
TOL = 1e-4
class TestConstraints(unittest.TestCase):
def setUp(self):
# Network
self.T = 2
# Random
np.random.seed(0)
def test_constr_FACTS_EQ(self):
# Constants
h = 1e-8
# Multiperiods
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude', 'voltage angle'])
net.set_flags('facts',
'variable',
'any',
'all')
self.assertEqual(net.num_vars, (2*net.num_buses+9*net.num_facts)*self.T)
x0 = net.get_var_values()+1e-4*np.random.randn(net.num_vars)
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('FACTS equations',net)
self.assertEqual(constr.name,'FACTS equations')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
num_statcom = len([f for f in net.facts if f.is_STATCOM()])
num_SSSC = len([f for f in net.facts if f.is_SSSC()])
num_UPFC = len([f for f in net.facts if f.is_UPFC()])
num_seriesenabled = len([f for f in net.facts if f.is_in_normal_series_mode()])
num_seriesdisabled = len([f for f in net.facts if f.is_series_link_disabled()])
# Verify analyze
Jnnz = 28*num_seriesenabled
rowsJ = 4*num_seriesenabled
rowsA = 2*net.num_facts
Annz = 7*net.num_facts
for facts in net.facts:
if not facts.is_regulator():
rowsA = rowsA+1
Annz = Annz+1
if facts.P_max_dc == 0 or facts.is_series_link_disabled():
rowsA = rowsA+1
Annz = Annz+1
if facts.is_series_link_disabled():
rowsA = rowsA+5
Annz = Annz+5
constr.analyze()
self.assertEqual(constr.J_nnz, Jnnz*self.T)
self.assertEqual(constr.A_nnz, Annz*self.T)
self.assertEqual(constr.J_row, rowsJ*self.T)
self.assertEqual(constr.A_row, rowsA*self.T)
y_init = constr.init_extra_vars
self.assertEqual(y_init.size,constr.num_extra_vars)
self.assertTrue(np.all(y_init == 0.))
y0 = np.random.rand(constr.num_extra_vars)
constr.eval(x0,y0)
self.assertEqual(constr.J_nnz,Jnnz*self.T)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,rowsJ*self.T)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(rowsJ*self.T,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(rowsA*self.T,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(rowsJ*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(J.nnz,Jnnz*self.T)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(rowsA*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(A.nnz,Annz*self.T)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
# Ax=b check
for k in range(self.T):
for facts in net.facts:
self.assertTrue(facts.has_flags('variable',
['active power',
'reactive power',
'series voltage magnitude',
'series voltage angle']))
index_Pk = np.where(A.col == facts.index_P_k[k])[0]
index_Pm = np.where(A.col == facts.index_P_m[k])[0]
index_Pdc = np.where(A.col == facts.index_P_dc[k])[0]
index_Qk = np.where(A.col == facts.index_Q_k[k])[0]
index_Qm = np.where(A.col == facts.index_Q_m[k])[0]
index_Qsh = np.where(A.col == facts.index_Q_sh[k])[0]
index_Qs = np.where(A.col == facts.index_Q_s[k])[0]
index_vmags = np.where(A.col == facts.index_v_mag_s[k])[0]
index_vangs = np.where(A.col == facts.index_v_ang_s[k])[0]
self.assertEqual(index_Pk.size,1)
self.assertEqual(index_Qk.size,1)
self.assertEqual(A.data[index_Pk],1.)
self.assertEqual(A.data[index_Qk],1.)
self.assertEqual(b[A.row[index_Pk]],0.)
self.assertEqual(b[A.row[index_Qk]],0.)
if not facts.is_regulator():
self.assertEqual(index_Qsh.size,2)
self.assertEqual(A.data[index_Qsh[0]],1.)
self.assertEqual(A.data[index_Qsh[1]],-1.)
self.assertEqual(b[A.row[index_Qsh[0]]],0.)
self.assertEqual(b[A.row[index_Qsh[1]]],0.)
if facts.P_max_dc == 0 or facts.is_series_link_disabled():
self.assertEqual(index_Pdc.size,2)
self.assertEqual(A.data[index_Pdc[0]],-1.)
self.assertEqual(A.data[index_Pdc[1]],1.)
self.assertEqual(b[A.row[index_Pdc[0]]],0.)
self.assertEqual(b[A.row[index_Pdc[1]]],0.)
else:
self.assertEqual(index_Pdc.size,1)
self.assertEqual(A.data[index_Pdc],-1.)
self.assertEqual(b[A.row[index_Pdc]],0.)
if facts.is_series_link_disabled():
self.assertEqual(index_Pm.size,2)
for index in index_Pm:
self.assertEqual(A.data[index],1.)
self.assertEqual(b[A.row[index]],0.)
self.assertEqual(index_Qm.size,2)
for index in index_Qm:
self.assertEqual(A.data[index],1.)
self.assertEqual(b[A.row[index]],0.)
self.assertEqual(index_Qs.size,2)
self.assertEqual(index_vmags.size,1)
self.assertEqual(index_vangs.size,1)
self.assertEqual(A.data[index_Qs[0]],-1.)
self.assertEqual(A.data[index_Qs[1]],1.)
self.assertEqual(A.data[index_vmags],1.)
self.assertEqual(A.data[index_vangs],1.)
self.assertEqual(b[A.row[index_Qs[0]]],0.)
self.assertEqual(b[A.row[index_Qs[1]]],0.)
self.assertEqual(b[A.row[index_vmags]],0.)
self.assertEqual(b[A.row[index_vangs]],0.)
# f check
flags = {}
for t in range(self.T):
for bus in net.buses:
flags[(t,bus.index)] = False
J_row = 0
for t in range(self.T):
for branch in net.branches:
for bus in [branch.bus_k, branch.bus_m]:
if not flags[(t, bus.index)]:
facts_onthisbus = [facts for facts in net.facts if ((facts.bus_k == bus) and (facts.is_in_normal_series_mode()))]
for facts in facts_onthisbus:
busk = facts.bus_k
busm = facts.bus_m
vmag_k = x0[busk.index_v_mag[t]]
vang_k = x0[busk.index_v_ang[t]]
vmag_m = x0[busm.index_v_mag[t]]
vang_m = x0[busm.index_v_ang[t]]
vmag_s = x0[facts.index_v_mag_s[t]]
vang_s = x0[facts.index_v_ang_s[t]]
P_m = x0[facts.index_P_m[t]]
P_dc = x0[facts.index_P_dc[t]]
Q_m = x0[facts.index_Q_m[t]]
Q_s = x0[facts.index_Q_s[t]]
f1 = -vmag_k*np.cos(vang_k)+vmag_m*np.cos(vang_m)-vmag_s*np.cos(vang_s)
f2 = -vmag_k*np.sin(vang_k)+vmag_m*np.sin(vang_m)-vmag_s*np.sin(vang_s)
f3 = vmag_s*P_m*np.cos(vang_s)-vmag_s*Q_m*np.sin(vang_s)-vmag_m*P_dc*np.cos(vang_m)+vmag_m*Q_s*np.sin(vang_m)
f4 = vmag_s*P_m*np.sin(vang_s)+vmag_s*Q_m*np.cos(vang_s)-vmag_m*P_dc*np.sin(vang_m)-vmag_m*Q_s*np.cos(vang_m)
self.assertAlmostEqual(f1,f[J_row])
self.assertAlmostEqual(f2,f[J_row+1])
self.assertAlmostEqual(f3,f[J_row+2])
self.assertAlmostEqual(f4,f[J_row+3])
J_row += 4
flags[(t,bus.index)] = True
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
def test_constr_FACTS_PSET_SWITCH(self):
# Multiperiods
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('facts',
'variable',
'any',
'active power')
self.assertEqual(net.num_vars, 3*net.num_facts*self.T)
# Constraint
constr = pf.Constraint('switching FACTS active power control',net)
self.assertEqual(constr.name,'switching FACTS active power control')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Verify analyze
constr.analyze()
num = len([f for f in net.facts if f.is_in_normal_series_mode() and f.P_max_dc > 0.])
Annz = num*self.T
Arow = Annz
self.assertEqual(constr.A_nnz,Annz)
self.assertEqual(constr.A_row,Arow)
# Verify evaluation
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
b = constr.b
A = constr.A
# After
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
# Ax = b Check
for t in range(self.T):
for fac in net.facts: # avoid shadowing the constraint vector f
self.assertTrue(fac.has_flags('variable', 'active power'))
if fac.is_in_normal_series_mode() and fac.P_max_dc > 0.:
indexP = np.where(A.col == fac.index_P_m[t])[0]
self.assertEqual(indexP.size,1)
self.assertEqual(A.data[indexP],1)
self.assertEqual(b[A.row[indexP]],fac.P_set[t])
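# The np.where(A.col == j)[0] pattern used throughout these checks locates
# entries of a scipy COO matrix by column index. A toy sketch (hypothetical
# matrix, unrelated to any network constraint):

```python
import numpy as np
from scipy.sparse import coo_matrix

A = coo_matrix((np.array([1., -1.]),          # data
                (np.array([0, 1]),            # rows
                 np.array([3, 5]))),          # cols
               shape=(2, 6))
idx = np.where(A.col == 5)[0]  # positions in A.data touching column 5
# idx.size == 1, and A.data[idx[0]]/A.row[idx[0]] give the entry and its row
```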
def test_constr_FACTS_QSET_SWITCH(self):
# Multiperiods
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Skip cases without FACTS devices
num_facts = net.num_facts
if num_facts == 0:
continue
# Vars
net.set_flags('facts',
'variable',
'any',
'reactive power')
self.assertEqual(net.num_vars, 4*net.num_facts*self.T)
# Constraint
constr = pf.Constraint('switching FACTS reactive power control',net)
self.assertEqual(constr.name,'switching FACTS reactive power control')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Verify analyze
constr.analyze()
num = len([f for f in net.facts if f.is_in_normal_series_mode()])
Annz = num*self.T
Arow = Annz
self.assertEqual(constr.A_nnz,Annz)
self.assertEqual(constr.A_row,Arow)
# Verify evaluation
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
b = constr.b
A = constr.A
# After
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
# Ax = b Check
for t in range(self.T):
for fac in net.facts: # avoid shadowing the constraint vector f
self.assertTrue(fac.has_flags('variable', 'reactive power'))
if fac.is_in_normal_series_mode():
indexQ = np.where(A.col == fac.index_Q_m[t])[0]
self.assertEqual(indexQ.size,1)
self.assertEqual(A.data[indexQ],1)
self.assertEqual(b[A.row[indexQ]],fac.Q_set[t])
def test_constr_REG_PF(self):
# Constants
h = 1e-8
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('vsc converter',
'variable',
'any',
['active power', 'reactive power'])
self.assertEqual(net.num_vars, 2*net.get_num_vsc_converters()*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# PF
for vsc in net.vsc_converters:
if vsc.is_in_f_ac_mode():
vsc.target_power_factor = np.sign(np.random.randn())*np.minimum(np.random.rand(), 0.2)
# Constraint
constr = pf.Constraint('power factor regulation',net)
self.assertEqual(constr.name,'power factor regulation')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
Jnnz = 0
for i in range(net.num_buses):
bus = net.get_bus(i)
for vsc in bus.vsc_converters:
if vsc.is_in_f_ac_mode():
Jnnz += 4
Annz = 4*net.get_num_vsc_converters_in_f_ac_mode()
rowsJ = 2*net.get_num_vsc_converters_in_f_ac_mode()
rowsA = net.get_num_vsc_converters_in_f_ac_mode()
constr.analyze()
self.assertEqual(constr.J_nnz, Jnnz*self.T)
self.assertEqual(constr.A_nnz, Annz*self.T)
self.assertEqual(constr.J_row, rowsJ*self.T)
self.assertEqual(constr.A_row, rowsA*self.T)
self.assertEqual(constr.num_extra_vars, rowsJ*self.T)
y_init = constr.init_extra_vars
self.assertEqual(y_init.size,constr.num_extra_vars)
self.assertTrue(np.all(y_init == 0.))
y0 = np.random.rand(constr.num_extra_vars)
constr.eval(x0,y0)
self.assertEqual(constr.J_nnz,Jnnz*self.T)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,rowsJ*self.T)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(rowsJ*self.T,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(rowsA*self.T,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(rowsJ*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(J.nnz,Jnnz*self.T)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(rowsA*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(A.nnz,Annz*self.T)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
# Ax=b check
for k in range(J.shape[0]//2):
index1 = np.where(A.col == net.num_vars+2*k)[0]
index2 = np.where(A.col == net.num_vars+2*k+1)[0]
self.assertEqual(index1.size,1)
self.assertEqual(index2.size,1)
self.assertEqual(A.row[index1[0]],A.row[index2[0]])
index3 = np.where(A.row == A.row[index1[0]])[0]
self.assertEqual(index3.size,4)
for vsc in net.vsc_converters:
if vsc.is_in_f_ac_mode():
gamma = vsc.target_power_factor
factor = np.sqrt((1-gamma**2.)/(gamma**2.))
for t in range(self.T):
iQ = vsc.index_Q[t]
iP = vsc.index_P[t]
k = np.where(A.col == iQ)[0]
self.assertEqual(k.size, 1)
k = np.where(A.row == A.row[k])[0]
self.assertEqual(k.size, 4)
for kk in k:
if A.col[kk] == iQ:
self.assertEqual(A.data[kk], 1.)
elif A.col[kk] == iP:
if vsc.target_power_factor >= 0:
self.assertAlmostEqual(A.data[kk], -factor)
else:
self.assertAlmostEqual(A.data[kk], factor)
else:
if (A.col[kk]-net.num_vars) % 2 == 0:
self.assertAlmostEqual(A.data[kk], -factor) # y
else:
self.assertAlmostEqual(A.data[kk], factor) # z
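# The coefficient "factor" above comes from the identity pf = P/sqrt(P^2+Q^2):
# if Q = sqrt((1-g^2)/g^2)*P, the resulting power factor is |g|. A quick
# numeric sketch (hypothetical target power factor):

```python
import numpy as np

gamma = 0.95                       # target power factor
factor = np.sqrt((1-gamma**2.)/(gamma**2.))
P = 1.7                            # arbitrary active power injection
Q = factor*P                       # reactive power implied by the constraint
pf_val = P/np.sqrt(P**2. + Q**2.)  # recovered power factor
# pf_val equals gamma up to round-off
```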
# f check
eps = 1e-8
J_row = 0
for t in range(self.T):
for bus in net.buses:
for vsc in bus.vsc_converters:
if vsc.is_in_f_ac_mode():
self.assertTrue(vsc.has_flags('variable', ['active power', 'reactive power']))
y = y0[J_row]
z = y0[J_row+1]
Q = vsc.Q[t]
Qmax = vsc.Q_max
Qmin = vsc.Q_min
CompY = (Q-Qmin)+y-np.sqrt((Q-Qmin)**2.+y**2.+2*eps)
CompZ = (Qmax-Q)+z-np.sqrt((Qmax-Q)**2.+z**2.+2*eps)
self.assertAlmostEqual(CompY,f[J_row])
self.assertAlmostEqual(CompZ,f[J_row+1])
J_row += 2
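# CompY/CompZ above are smoothed Fischer-Burmeister-style complementarity
# residuals: C(a, y) = a + y - sqrt(a^2 + y^2 + 2*eps) is near zero exactly
# when a >= 0, y >= 0 and a*y is near zero. A standalone sketch (toy numbers):

```python
import numpy as np

eps = 1e-8
C = lambda a, y: a + y - np.sqrt(a**2. + y**2. + 2.*eps)
# One argument at (near) zero -> residual ~ 0 (complementarity holds)
r0 = C(0., 5.)
# Both arguments strictly positive -> residual is strictly positive (3+4-5)
r1 = C(3., 4.)
```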
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
def test_constr_REG_PF_SWITCH(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
self.assertEqual(net.num_vars,0)
# Vars
net.set_flags('vsc converter',
'variable',
'any',
['active power', 'reactive power'])
self.assertEqual(net.num_vars, 2*net.get_num_vsc_converters()*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# PF
for vsc in net.vsc_converters:
if vsc.is_in_f_ac_mode():
vsc.target_power_factor = np.sign(np.random.randn())*np.minimum(np.random.rand(), 0.2)
# Constraint
constr = pf.Constraint('switching power factor regulation',net)
self.assertEqual(constr.name,'switching power factor regulation')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
# Manual count
nnz = 0
num_constr = 0
for vsc in net.vsc_converters:
if vsc.is_in_f_ac_mode() and vsc.has_flags('variable', ['active power', 'reactive power']):
num_constr += 1
nnz += 2
constr.analyze()
self.assertEqual(constr.A.shape[0],num_constr*self.T)
self.assertEqual(nnz*self.T,constr.A_nnz)
constr.eval(x0)
self.assertEqual(0,constr.A_nnz)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(num_constr*self.T,))
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(num_constr*self.T,net.num_vars))
self.assertEqual(A.nnz,nnz*self.T)
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
# Detailed check
Ai = A.row
Aj = A.col
Ad = A.data
self.assertEqual(Ai.size,nnz*self.T)
self.assertEqual(Aj.size,nnz*self.T)
self.assertEqual(Ad.size,nnz*self.T)
nnz = 0
row = 0
for t in range(self.T):
for bus in net.buses:
for vsc in bus.vsc_converters:
if vsc.is_in_f_ac_mode():
gamma = vsc.target_power_factor
factor = np.sqrt(1-gamma**2.)/np.abs(gamma)
self.assertEqual(b[row], 0.)
self.assertEqual(Ai[nnz], row)
self.assertEqual(Aj[nnz], vsc.index_P[t])
if gamma >= 0.:
self.assertAlmostEqual(Ad[nnz], -factor)
else:
self.assertAlmostEqual(Ad[nnz], factor)
nnz += 1
self.assertEqual(Ai[nnz], row)
self.assertEqual(Aj[nnz], vsc.index_Q[t])
self.assertEqual(Ad[nnz], 1.)
nnz += 1
row += 1
self.assertEqual(row,A.shape[0])
self.assertEqual(nnz,A.nnz)
def test_constr_HVDCPF(self):
# Multiperiods
for case in test_cases.CASES:
net = pf.Parser(case).parse(case, self.T)
self.assertEqual(net.num_periods, self.T)
# Vars
net.set_flags('dc bus',
'variable',
'any',
'voltage')
# Vars
net.set_flags('vsc converter',
'variable',
'any',
'dc power')
net.set_flags('csc converter',
'variable',
'any',
'dc power')
self.assertEqual(net.num_vars, (net.num_dc_buses +
2*net.num_vsc_converters +
2*net.num_csc_converters)*self.T)
# Constraint
constr = pf.Constraint('HVDC power balance',net)
self.assertEqual(constr.name,'HVDC power balance')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.H_combined.nnz, 0)
self.assertTupleEqual(constr.H_combined.shape, (0,0))
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.num_extra_vars,0)
x0 = net.get_var_values()+1e-1*np.random.randn(net.num_vars)
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
flags = np.zeros(net.num_dc_buses*self.T, dtype=int)
for t in range(self.T):
for bus in net.dc_buses:
flags[bus.index_t[t]] = 1
self.assertEqual(np.sum(flags), flags.size)
constr.analyze()
A = constr.A
b = constr.b
self.assertTupleEqual(constr.J.shape, (0, net.num_vars))
self.assertTupleEqual(constr.G.shape, (0, net.num_vars))
self.assertEqual(constr.J.nnz, 0)
self.assertEqual(constr.G.nnz, 0)
self.assertEqual(constr.l.size, 0)
self.assertEqual(constr.u.size, 0)
self.assertEqual(constr.f.size, 0)
self.assertTupleEqual(constr.A.shape, (net.num_dc_buses*self.T, net.num_vars))
self.assertEqual(constr.A.nnz, (net.num_vsc_converters +
net.num_csc_converters +
4*net.num_dc_branches)*self.T)
self.assertEqual(constr.b.size, net.num_dc_buses*self.T)
i_mis_manual = np.zeros(net.num_dc_buses*self.T)
i_mis = A*x0-b
for t in range(self.T):
for bus in net.dc_buses:
for branch in bus.branches:
self.assertTrue(branch.bus_k.has_flags('variable', 'voltage'))
self.assertTrue(branch.bus_m.has_flags('variable', 'voltage'))
ikm = (x0[branch.bus_k.index_v[t]]-x0[branch.bus_m.index_v[t]])/branch.r
self.assertEqual(ikm, branch.get_i_km(x0)[t])
if bus.is_equal(branch.bus_k):
i_out = ikm
else:
i_out = -ikm
i_mis_manual[bus.index_t[t]] -= i_out
for conv in net.vsc_converters:
self.assertTrue(conv.has_flags('variable', 'dc power'))
i_in = x0[conv.index_i_dc[t]]
i_mis_manual[conv.dc_bus.index_t[t]] += i_in
for conv in net.csc_converters:
self.assertTrue(conv.has_flags('variable', 'dc power'))
i_in = x0[conv.index_i_dc[t]]
i_mis_manual[conv.dc_bus.index_t[t]] += i_in
if not i_mis.size:
self.assertTrue(np.all(i_mis_manual == i_mis))
else:
self.assertLessEqual(np.max(np.abs(i_mis_manual-i_mis)), 1e-10)
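# The mismatch assembled above is Kirchhoff's current law at each dc bus:
# branch currents i_km = (v_k - v_m)/r leave a bus, converter currents enter
# it. A two-bus sketch (hypothetical data, no pf objects involved):

```python
import numpy as np

v = np.array([1.01, 0.99])   # dc bus voltages [p.u.]
r = 0.1                      # branch resistance between bus 0 and bus 1
i_01 = (v[0]-v[1])/r         # current flowing from bus 0 to bus 1
i_conv = np.array([i_01, -i_01])  # converter injections balancing each bus
mis = np.array([i_conv[0]-i_01, i_conv[1]+i_01])  # per-bus KCL mismatch
# mis is [0., 0.] when injections balance the branch flow
```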
net.set_var_values(x0)
for t in range(self.T):
for bus in net.dc_buses:
self.assertNotEqual(bus.v[t], 0.)
self.assertNotEqual(bus.v[t], 1.)
for conv in net.vsc_converters:
self.assertNotEqual(conv.P_dc[t], 0.)
self.assertNotEqual(conv.i_dc[t], 0.)
# Test with no variables
net.clear_flags()
self.assertEqual(net.num_vars, 0)
constr.analyze()
A = constr.A
b = constr.b
self.assertTupleEqual(constr.J.shape, (0, net.num_vars))
self.assertTupleEqual(constr.G.shape, (0, net.num_vars))
self.assertEqual(constr.J.nnz, 0)
self.assertEqual(constr.G.nnz, 0)
self.assertEqual(constr.l.size, 0)
self.assertEqual(constr.u.size, 0)
self.assertEqual(constr.f.size, 0)
self.assertTupleEqual(constr.A.shape, (net.num_dc_buses*self.T, 0))
self.assertEqual(constr.A.nnz, 0)
self.assertEqual(constr.b.size, net.num_dc_buses*self.T)
x0 = net.get_var_values()
self.assertEqual(x0.size, 0)
i_mis_manual = np.zeros(net.num_dc_buses*self.T)
i_mis = A*x0-b
for t in range(self.T):
for bus in net.dc_buses:
for branch in bus.branches:
self.assertFalse(branch.bus_k.has_flags('variable', 'voltage'))
self.assertFalse(branch.bus_m.has_flags('variable', 'voltage'))
ikm = branch.i_km[t]
if bus.is_equal(branch.bus_k):
i_out = ikm
else:
i_out = -ikm
i_mis_manual[bus.index_t[t]] -= i_out
for conv in net.vsc_converters:
self.assertFalse(conv.has_flags('variable', 'dc power'))
i_in = conv.i_dc[t]
i_mis_manual[conv.dc_bus.index_t[t]] += i_in
for conv in net.csc_converters:
self.assertFalse(conv.has_flags('variable', 'dc power'))
i_in = conv.i_dc[t]
i_mis_manual[conv.dc_bus.index_t[t]] += i_in
if not i_mis.size:
self.assertTrue(np.all(i_mis_manual == i_mis))
else:
self.assertLessEqual(np.max(np.abs(i_mis_manual-i_mis)), 1e-10)
def test_constr_VSC_EQ(self):
# Constants
h = 1e-10
# Multiperiods
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('vsc converter',
'variable',
'any',
['dc power', 'active power'])
net.set_flags('dc bus',
'variable',
'any',
'voltage')
# Check that dc bus indices are set to unique values
busindicest = [bus.index_t for bus in net.dc_buses]
self.assertEqual(len(np.unique(busindicest)), net.num_dc_buses*self.T)
busindicesv = [bus.index_v for bus in net.dc_buses]
self.assertEqual(len(np.unique(busindicesv)), net.num_dc_buses*self.T)
# Check that the different vsc variable indices are set to unique values
vscindicesPac = [vsc.index_P for vsc in net.vsc_converters]
self.assertEqual(len(np.unique(vscindicesPac)), net.num_vsc_converters*self.T)
vscindicesPdc = [vsc.index_P_dc for vsc in net.vsc_converters]
self.assertEqual(len(np.unique(vscindicesPdc)), net.num_vsc_converters*self.T)
vscindicesidc = [vsc.index_i_dc for vsc in net.vsc_converters]
self.assertEqual(len(np.unique(vscindicesidc)), net.num_vsc_converters*self.T)
self.assertEqual(net.num_vars, (3*net.num_vsc_converters+net.num_dc_buses)*self.T)
# Constraint
constr = pf.Constraint('VSC converter equations',net)
self.assertEqual(constr.name,'VSC converter equations')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
x0 = net.get_var_values()+1e-4*np.random.randn(net.num_vars)
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Verify analyze
constr.analyze()
Annz = 3*net.num_vsc_converters*self.T
Jnnz = 3*net.num_vsc_converters*self.T
Arow = net.num_vsc_converters*self.T
Jrow = net.num_vsc_converters*self.T
self.assertEqual(constr.A_nnz,Annz)
self.assertEqual(constr.A_row,Arow)
self.assertEqual(constr.J_nnz,Jnnz)
self.assertEqual(constr.J_row,Jrow)
# Verify evaluation
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.J_nnz,Jnnz)
self.assertEqual(constr.J_row,Jrow)
f = constr.f
J = constr.J
b = constr.b
A = constr.A
# After
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTupleEqual(b.shape,(Arow,))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTupleEqual(f.shape,(Jrow,))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTupleEqual(J.shape,(Jrow,net.num_vars))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
self.assertTupleEqual(A.shape,(Arow,net.num_vars))
# Ax = b check
coeffB = [vsc.loss_coeff_B for vsc in net.vsc_converters]
sumcoefB = np.sum(np.abs(coeffB))
self.assertAlmostEqual(norm(A.data,1),2*Arow+sumcoefB*self.T) # assertAlmostEqual to tolerate floating-point round-off
for k in range(self.T):
for vsc in net.vsc_converters:
self.assertTrue(vsc.has_flags('variable',['dc power','active power']))
indexP = np.where(A.col == vsc.index_P[k])[0]
indexPdc = np.where(A.col == vsc.index_P_dc[k])[0]
indexidc = np.where(A.col == vsc.index_i_dc[k])[0]
self.assertEqual(indexP.size,1)
self.assertEqual(indexPdc.size,1)
self.assertEqual(indexidc.size,1)
self.assertEqual(A.data[indexP],1.)
self.assertEqual(A.data[indexPdc],1.)
if vsc.P_dc_set[k] <= 0:
self.assertEqual(A.data[indexidc],-vsc.loss_coeff_B)
else:
self.assertEqual(A.data[indexidc],vsc.loss_coeff_B)
self.assertEqual(b[A.row[indexP]],-1.*vsc.loss_coeff_A)
# f check
J_row = 0
for t in range(self.T):
for bus in net.dc_buses:
vsc_onthisbus = [vsc for vsc in net.vsc_converters if vsc.dc_bus == bus]
for vsc in vsc_onthisbus:
indexPdc = np.where(J.col == vsc.index_P_dc[t])[0]
indexidc = np.where(J.col == vsc.index_i_dc[t])[0]
indexv = np.where(J.col == bus.index_v[t])[0]
dP = x0[J.col[indexPdc]] - x0[J.col[indexidc]]*x0[J.col[indexv]]
self.assertAlmostEqual(f[J_row],dP)
J_row += 1
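# The nonlinear residual above is P_dc - i_dc*v = 0, whose Jacobian row with
# respect to (P_dc, i_dc, v) is [1, -v, -i_dc]. A finite-difference sanity
# sketch (hypothetical operating point):

```python
import numpy as np

g = lambda z: z[0] - z[1]*z[2]       # z = (P_dc, i_dc, v)
z0 = np.array([0.5, 0.49, 1.02])
jac = np.array([1., -z0[2], -z0[1]]) # analytic Jacobian row
h = 1e-7
fd = np.array([(g(z0+h*e)-g(z0-h*e))/(2*h) for e in np.eye(3)])
# fd matches jac to finite-difference accuracy
```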
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
def test_constr_VSC_DC_PSET(self):
# Multiperiods
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Check that dc bus indices are set to unique values
busindices = [bus.index_t for bus in net.dc_buses]
self.assertEqual(len(np.unique(busindices)), net.num_dc_buses*self.T)
# Vars
net.set_flags('vsc converter',
'variable',
'any',
'active power')
self.assertEqual(net.num_vars, net.num_vsc_converters*self.T)
# Constraint
constr = pf.Constraint('VSC DC power control',net)
self.assertEqual(constr.name,'VSC DC power control')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Verify analyze
constr.analyze()
# Check that vsc indices are set to unique values
vscindices = [vsc.index_P for vsc in net.vsc_converters]
self.assertEqual(len(np.unique(vscindices)), net.num_vsc_converters*self.T)
dcmodevsc = [vsc for vsc in net.vsc_converters if vsc.is_in_P_dc_mode()]
Annz = len(dcmodevsc)*self.T
Arow = Annz
self.assertEqual(constr.A_nnz,Annz)
self.assertEqual(constr.A_row,Arow)
# Verify evaluation
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
b = constr.b
A = constr.A
# After
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
# Ax = b Check
for k in range(self.T):
for vsc in dcmodevsc:
self.assertTrue(vsc.has_flags('variable', ['active power']))
indexP = np.where(A.col == vsc.index_P[k])[0]
self.assertEqual(indexP.size,1)
self.assertEqual(A.data[indexP],-1)
self.assertEqual(b[A.row[indexP]],vsc.P_dc_set[k])
def test_constr_VSC_DC_VSET(self):
# Multiperiods
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Check that dc bus indices are set to unique values
busindices = [bus.index_t for bus in net.dc_buses]
self.assertEqual(len(np.unique(busindices)), net.num_dc_buses*self.T)
# Vars
net.set_flags('dc bus',
'variable',
'any',
'voltage')
self.assertEqual(net.num_vars, net.num_dc_buses*self.T)
# Constraint
constr = pf.Constraint('VSC DC voltage control',net)
self.assertEqual(constr.name,'VSC DC voltage control')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Verify analyze
constr.analyze()
dcmodevsc = [vsc for vsc in net.vsc_converters if vsc.is_in_v_dc_mode()]
Annz = len(dcmodevsc)*self.T
Arow = Annz
self.assertEqual(constr.A_nnz,Annz)
self.assertEqual(constr.A_row,Arow)
# Verify evaluation
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
# Verify A matrix
self.assertTrue(np.all(A.data == 1))
n = len(dcmodevsc)
for t in range(self.T):
indices = [vsc.dc_bus.index_v[t] for vsc in dcmodevsc]
self.assertTrue(np.all(A.col[t*n:(t+1)*n] == indices))
# Verify b vector
for t in range(self.T):
setpoints = [vsc.v_dc_set[t] for vsc in dcmodevsc]
self.assertTrue(np.all(b[t*n:(t+1)*n] == setpoints))
def test_constr_LOAD_VDEP(self):
# Constants
h = 1e-10
# Multiperiods
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'any',
'voltage magnitude')
net.set_flags('load',
'variable',
'any',
['active power', 'reactive power'])
self.assertEqual(net.num_vars, (2*net.num_loads+net.num_buses)*self.T)
x0 = net.get_var_values()+1e-5*np.random.randn(net.num_vars)
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Loads comps
for load in net.loads:
load.comp_ci = np.random.randn(self.T)
load.comp_cj = np.random.randn(self.T)
load.comp_cg = np.random.randn()
load.comp_cb = np.random.randn()
load.comp_cp = load.P
load.comp_cq = load.Q
# Constraint
constr = pf.Constraint('load voltage dependence',net)
self.assertEqual(constr.name,'load voltage dependence')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
Jnnz = 4*net.num_loads*self.T
rowsJ = 2*net.num_loads*self.T
constr.analyze()
self.assertEqual(constr.J_nnz,Jnnz)
self.assertEqual(constr.J_row,rowsJ)
self.assertEqual(constr.num_extra_vars,0)
self.assertLessEqual(constr.J_row, constr.H_nnz.size)
self.assertLessEqual(2*net.num_loads*net.num_periods, constr.H_nnz.size)
self.assertTrue(np.all(constr.H_nnz[:2*net.num_loads*net.num_periods] == 1))
for i in range(rowsJ):
H = constr.get_H_single(i)
self.assertEqual(H.shape[0], net.num_vars)
self.assertEqual(H.shape[1], net.num_vars)
self.assertEqual(H.nnz, 1)
H = constr.H_combined
self.assertEqual(H.shape[0], net.num_vars)
self.assertEqual(H.shape[1], net.num_vars)
self.assertEqual(H.nnz, rowsJ)
constr.eval(x0)
self.assertEqual(constr.J_nnz,Jnnz)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,rowsJ)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(rowsJ,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(rowsJ,net.num_vars))
self.assertEqual(J.nnz,Jnnz)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
# f check
J_row = 0
for t in range(self.T):
for bus in net.buses:
for load in bus.loads:
Sp = (x0[load.index_P[t]] -
load.comp_cp[t] -
load.comp_ci[t]*x0[bus.index_v_mag[t]] -
load.comp_cg*(x0[bus.index_v_mag[t]])**2.)
Sq = (x0[load.index_Q[t]] -
load.comp_cq[t] -
load.comp_cj[t]*x0[bus.index_v_mag[t]] +
load.comp_cb*(x0[bus.index_v_mag[t]])**2.)
self.assertAlmostEqual(Sp,f[J_row])
self.assertAlmostEqual(Sq,f[J_row+1])
J_row += 2
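# Sp/Sq above are residuals of the ZIP load model: the consumed active power
# is P(v) = cp + ci*v + cg*v^2 (constant power, current, and impedance terms).
# A standalone sketch (hypothetical coefficients):

```python
import numpy as np

cp, ci, cg = 0.8, 0.1, 0.05  # ZIP coefficients
v = 1.03                     # voltage magnitude
P = cp + ci*v + cg*v**2.     # load power consistent with the model
Sp = P - cp - ci*v - cg*v**2.
# Sp is zero for a consistent (P, v) pair
```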
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
def test_constr_CFUNC(self):
h = 1e-9
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
net.set_flags('bus',
'variable',
'any',
'voltage magnitude')
net.set_flags('generator',
'variable',
'any',
'active power')
self.assertEqual(net.num_vars, (net.num_buses+net.num_generators)*net.num_periods)
x = net.get_var_values() + 1e-2*np.random.rand(net.num_vars)
func = pf.Function('generation cost', 1., net)
constr = pf.Constraint('constrained function', net)
rhs = 100.
constr.set_parameter("rhs", rhs)
constr.set_parameter("func", func)
# Equality
constr.set_parameter("op", "=")
constr.analyze()
self.assertEqual(constr.num_extra_vars, 0)
self.assertEqual(constr.G.nnz, 0)
self.assertTupleEqual(constr.G.shape, (0, net.num_vars))
self.assertEqual(constr.l.size, 0)
self.assertEqual(constr.u.size, 0)
self.assertEqual(constr.l_extra_vars.size, 0)
self.assertEqual(constr.u_extra_vars.size, 0)
self.assertEqual(constr.init_extra_vars.size, 0)
self.assertEqual(constr.f.size, 1)
self.assertEqual(constr.J.nnz, net.num_vars)
self.assertTupleEqual(constr.J.shape, (1, net.num_vars))
H = constr.get_H_single(0)
self.assertEqual(H.nnz, func.Hphi.nnz)
self.assertTupleEqual(H.shape, (net.num_vars, net.num_vars))
self.assertEqual(func.phi, 0.)
constr.eval(x)
net.update_properties(x)
self.assertNotEqual(func.phi, 0.)
self.assertLess(np.abs(func.phi-np.sum(net.gen_P_cost)), 1e-12*(np.abs(func.phi)+1.))
self.assertEqual(constr.f[0], func.phi - rhs) # equality case: no extra variable y
pf.tests.utils.check_constraint_Jacobian(self, constr, x, np.zeros(0), NUM_TRIALS, TOL, EPS, h, quiet=True)
pf.tests.utils.check_constraint_single_Hessian(self, constr, x, np.zeros(0), NUM_TRIALS, TOL, EPS, h, quiet=True)
pf.tests.utils.check_constraint_combined_Hessian(self, constr, x, np.zeros(0), NUM_TRIALS, TOL, EPS, h, quiet=True)
# Inequality >=
constr.set_parameter("op", ">=")
constr.analyze()
self.assertEqual(constr.num_extra_vars, 1)
self.assertEqual(constr.G.nnz, 1)
self.assertEqual(constr.G.row[0], 0)
self.assertEqual(constr.G.col[0], net.num_vars)
self.assertEqual(constr.G.data[0], 1.)
self.assertTupleEqual(constr.G.shape, (1, net.num_vars+1))
self.assertEqual(constr.l.size, 1)
self.assertEqual(constr.u.size, 1)
self.assertEqual(constr.l_extra_vars.size, 1)
self.assertEqual(constr.u_extra_vars.size, 1)
self.assertEqual(constr.init_extra_vars.size, 1)
self.assertEqual(constr.l[0],0)
self.assertEqual(constr.l_extra_vars[0],0)
self.assertEqual(constr.u[0],1e8)
self.assertEqual(constr.u_extra_vars[0],1e8)
self.assertEqual(constr.f.size, 1)
self.assertEqual(constr.J.nnz, net.num_vars+1)
self.assertTupleEqual(constr.J.shape, (1, net.num_vars+1))
H = constr.get_H_single(0)
self.assertEqual(H.nnz, func.Hphi.nnz)
self.assertTupleEqual(H.shape, (net.num_vars+1, net.num_vars+1))
self.assertEqual(func.phi, 0.)
y = np.random.randn(1)
constr.eval(x,y)
net.update_properties(x)
self.assertNotEqual(func.phi, 0.)
self.assertLess(np.abs(func.phi-np.sum(net.gen_P_cost)), 1e-12*(np.abs(func.phi)+1.))
self.assertEqual(constr.f[0], func.phi - rhs - y[0])
pf.tests.utils.check_constraint_Jacobian(self, constr, x, y, NUM_TRIALS, TOL, EPS, h, quiet=True)
pf.tests.utils.check_constraint_single_Hessian(self, constr, x, y, NUM_TRIALS, TOL, EPS, h, quiet=True)
pf.tests.utils.check_constraint_combined_Hessian(self, constr, x, y, NUM_TRIALS, TOL, EPS, h, quiet=True)
self.assertEqual((constr.G*np.hstack((x,y)))[0],y[0])
# Inequality <=
constr.set_parameter("op", "<=")
constr.analyze()
self.assertEqual(constr.num_extra_vars, 1)
self.assertEqual(constr.G.nnz, 1)
self.assertEqual(constr.G.row[0], 0)
self.assertEqual(constr.G.col[0], net.num_vars)
self.assertEqual(constr.G.data[0], 1.)
self.assertTupleEqual(constr.G.shape, (1, net.num_vars+1))
self.assertEqual(constr.l.size, 1)
self.assertEqual(constr.u.size, 1)
self.assertEqual(constr.l_extra_vars.size, 1)
self.assertEqual(constr.u_extra_vars.size, 1)
self.assertEqual(constr.init_extra_vars.size, 1)
self.assertEqual(constr.l[0],-1e8)
self.assertEqual(constr.l_extra_vars[0],-1e8)
self.assertEqual(constr.u[0],0)
self.assertEqual(constr.u_extra_vars[0],0)
self.assertEqual(constr.f.size, 1)
self.assertEqual(constr.J.nnz, net.num_vars+1)
self.assertTupleEqual(constr.J.shape, (1, net.num_vars+1))
H = constr.get_H_single(0)
self.assertEqual(H.nnz, func.Hphi.nnz)
self.assertTupleEqual(H.shape, (net.num_vars+1, net.num_vars+1))
self.assertEqual(func.phi, 0.)
y = np.random.randn(1)
constr.eval(x,y)
net.update_properties(x)
self.assertNotEqual(func.phi, 0.)
self.assertLess(np.abs(func.phi-np.sum(net.gen_P_cost)), 1e-12*(np.abs(func.phi)+1.))
self.assertEqual(constr.f[0], func.phi - rhs - y[0])
pf.tests.utils.check_constraint_Jacobian(self, constr, x, y, NUM_TRIALS, TOL, EPS, h, quiet=True)
pf.tests.utils.check_constraint_single_Hessian(self, constr, x, y, NUM_TRIALS, TOL, EPS, h, quiet=True)
pf.tests.utils.check_constraint_combined_Hessian(self, constr, x, y, NUM_TRIALS, TOL, EPS, h, quiet=True)
self.assertEqual((constr.G*np.hstack((x,y)))[0],y[0])
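# All three operator cases above evaluate the same residual phi(x) - rhs - y,
# differing only in the bounds on the extra variable y (pinned to 0 for '=',
# nonnegative for '>=', nonpositive for '<='). Below is a minimal standalone
# NumPy sketch of this slack-variable reformulation; the function names and the
# 1e8 bound are illustrative, chosen to mirror the values asserted above, and
# are not part of the pf API.

```python
import numpy as np

def constrained_function(phi, grad_phi, rhs, op):
    """Reformulate op(phi(x), rhs) as f(x, y) = phi(x) - rhs - y = 0
    with bounds on the extra variable y ('=' needs no slack range)."""
    if op == '=':
        bounds = (0., 0.)        # y pinned at zero
    elif op == '>=':
        bounds = (0., 1e8)       # y >= 0 enforces phi(x) >= rhs
    elif op == '<=':
        bounds = (-1e8, 0.)      # y <= 0 enforces phi(x) <= rhs
    else:
        raise ValueError(op)
    def f(x, y):
        return np.array([phi(x) - rhs - y[0]])
    def J(x, y):
        # Jacobian row with respect to (x, y): [grad_phi(x), -1]
        return np.concatenate((grad_phi(x), [-1.]))
    return f, J, bounds

# Toy usage: phi(x) = sum(x**2) constrained by phi(x) <= 1.
phi = lambda x: float(np.sum(x**2))
gphi = lambda x: 2.*x
f, J, (lo, hi) = constrained_function(phi, gphi, 1., '<=')
x = np.array([0.5, 0.5])
y = np.array([-0.5])   # feasible slack: phi(x) - rhs = -0.5
```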
def test_constr_FIX(self):
# Single period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case)
self.assertEqual(net.num_periods,1)
# add vargens
net.add_var_generators_from_parameters(net.get_load_buses(),80.,50.,30.,5,0.05)
for vargen in net.var_generators:
vargen.P = vargen.index*1.5
vargen.Q = vargen.index*2.5
self.assertGreater(net.num_var_generators,0)
self.assertEqual(net.num_vars,0)
self.assertEqual(net.num_fixed,0)
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'slack',
'active power')
net.set_flags('generator',
'variable',
'regulator',
'reactive power')
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
net.set_flags('variable generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
net.set_flags('load',
'variable',
'any',
'active power')
self.assertGreater(net.num_vars,0)
self.assertEqual(net.num_fixed,0)
self.assertEqual(net.num_vars,
2*net.num_buses +
net.get_num_slack_gens() +
net.get_num_reg_gens() +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
net.num_var_generators*2+
3*net.num_batteries+
net.num_loads)
# Fixed
net.set_flags('bus',
'fixed',
'slack',
['voltage magnitude','voltage angle'])
net.set_flags('bus',
'fixed',
'regulated by generator',
'voltage magnitude')
net.set_flags('generator',
'fixed',
'regulator',
'reactive power')
net.set_flags('branch',
'fixed',
'tap changer',
'tap ratio')
net.set_flags('branch',
'fixed',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'fixed',
'switching - v',
'susceptance')
net.set_flags('variable generator',
'fixed',
'any',
['active power','reactive power'])
net.set_flags('battery',
'fixed',
'any',
['charging power','energy level'])
net.set_flags('load',
'fixed',
'any',
'active power')
self.assertGreater(net.num_fixed,0)
self.assertEqual(net.num_fixed,
2*(net.get_num_slack_buses()) +
(net.get_num_buses_reg_by_gen()-net.get_num_slack_buses()) +
net.get_num_reg_gens() +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
net.num_var_generators*2+
3*net.num_batteries+
net.num_loads)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
constr = pf.Constraint('variable fixing',net)
self.assertEqual(constr.name,'variable fixing')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
A_nnz = net.num_fixed
constr.analyze()
self.assertEqual(A_nnz,constr.A_nnz)
constr.eval(x0)
self.assertEqual(0,constr.A_nnz)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# After
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(net.num_fixed,))
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(net.num_fixed,net.num_vars))
self.assertEqual(A.nnz,net.num_fixed)
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,net.num_vars))
self.assertEqual(G.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
# Vargen
for vargen in net.var_generators:
ar = np.where(A.col == vargen.index_P)[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],vargen.index_P)
self.assertEqual(b[A.row[ar[0]]],vargen.P)
self.assertEqual(b[A.row[ar[0]]],vargen.index*1.5)
for vargen in net.var_generators:
ar = np.where(A.col == vargen.index_Q)[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],vargen.index_Q)
self.assertEqual(b[A.row[ar[0]]],vargen.Q)
self.assertEqual(b[A.row[ar[0]]],vargen.index*2.5)
# Batteries
for bat in net.batteries:
ar = np.where(A.col == bat.index_Pc)[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],bat.index_Pc)
self.assertEqual(b[A.row[ar[0]]],max(bat.P,0))
for bat in net.batteries:
ar = np.where(A.col == bat.index_Pd)[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],bat.index_Pd)
self.assertEqual(b[A.row[ar[0]]],max(-bat.P,0))
for bat in net.batteries:
ar = np.where(A.col == bat.index_E)[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],bat.index_E)
self.assertEqual(b[A.row[ar[0]]],bat.E)
# Load
for load in net.loads:
self.assertTrue(load.has_flags('variable','active power'))
self.assertTrue(load.has_flags('fixed','active power'))
ar = np.where(A.col == load.index_P)[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],load.index_P)
self.assertEqual(b[A.row[ar[0]]],load.P)
# Projections
P1 = constr.get_var_projection()
P2 = constr.get_extra_var_projection()
self.assertTrue(isinstance(P1,coo_matrix))
self.assertTrue(isinstance(P2,coo_matrix))
self.assertEqual(P1.shape[0],net.num_vars)
self.assertEqual(P2.shape[0],0)
self.assertEqual(P1.shape[1],net.num_vars)
self.assertEqual(P2.shape[1],net.num_vars)
self.assertEqual(P1.nnz,net.num_vars)
self.assertEqual(P2.nnz,0)
self.assertLess(np.linalg.norm(x0-P1*x0),1e-12)
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# add vargens
net.add_var_generators_from_parameters(net.get_load_buses(),80.,50.,30.,5,0.05)
for vargen in net.var_generators:
vargen.P = np.random.rand(self.T)*10
vargen.Q = np.random.rand(self.T)*10
self.assertEqual(vargen.num_periods,self.T)
self.assertGreater(net.num_var_generators,0)
self.assertEqual(net.num_vars,0)
self.assertEqual(net.num_fixed,0)
# Vars
net.set_flags('bus',
['variable','fixed'],
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
['variable','fixed'],
'slack',
'active power')
net.set_flags('generator',
['variable','fixed'],
'regulator',
'reactive power')
net.set_flags('branch',
['variable','fixed'],
'tap changer',
'tap ratio')
net.set_flags('branch',
['variable','fixed'],
'phase shifter',
'phase shift')
net.set_flags('shunt',
['variable','fixed'],
'switching - v',
'susceptance')
net.set_flags('variable generator',
['variable','fixed'],
'any',
['active power','reactive power'])
net.set_flags('battery',
['variable','fixed'],
'any',
['charging power','energy level'])
net.set_flags('load',
['variable','fixed'],
'any',
'active power')
self.assertGreater(net.num_vars,0)
self.assertEqual(net.num_fixed,net.num_vars)
self.assertEqual(net.num_vars,
(2*net.num_buses +
net.get_num_slack_gens() +
net.get_num_reg_gens() +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
net.num_var_generators*2+
3*net.num_batteries+
net.num_loads)*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
constr = pf.Constraint('variable fixing',net)
self.assertEqual(constr.name,'variable fixing')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
A_nnz = net.num_fixed
constr.analyze()
self.assertEqual(A_nnz,constr.A_nnz)
constr.eval(x0)
self.assertEqual(0,constr.A_nnz)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# After
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(net.num_fixed,))
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(net.num_fixed,net.num_vars))
self.assertEqual(A.nnz,net.num_fixed)
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,net.num_vars))
self.assertEqual(G.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
# Time loop
for t in range(self.T):
# bus
for bus in net.buses:
ar = np.where(A.col == bus.index_v_mag[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],bus.index_v_mag[t])
if bus.is_regulated_by_gen():
self.assertEqual(b[A.row[ar[0]]],bus.v_set[t])
else:
self.assertEqual(b[A.row[ar[0]]],bus.v_mag[t])
ar = np.where(A.col == bus.index_v_ang[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],bus.index_v_ang[t])
self.assertEqual(b[A.row[ar[0]]],bus.v_ang[t])
# Gens
for gen in net.generators:
if gen.is_slack():
ar = np.where(A.col == gen.index_P[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],gen.index_P[t])
self.assertEqual(b[A.row[ar[0]]],gen.P[t])
if gen.is_regulator():
ar = np.where(A.col == gen.index_Q[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],gen.index_Q[t])
self.assertEqual(A.data[ar[0]],1.)
self.assertEqual(b[A.row[ar[0]]],gen.Q[t])
# Shunts
for shunt in net.shunts:
if shunt.is_switched_v():
ar = np.where(A.col == shunt.index_b[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],shunt.index_b[t])
self.assertEqual(b[A.row[ar[0]]],shunt.b[t])
# Branch
for branch in net.branches:
if branch.is_tap_changer():
ar = np.where(A.col == branch.index_ratio[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],branch.index_ratio[t])
self.assertEqual(b[A.row[ar[0]]],branch.ratio[t])
if branch.is_phase_shifter():
ar = np.where(A.col == branch.index_phase[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],branch.index_phase[t])
self.assertEqual(b[A.row[ar[0]]],branch.phase[t])
# Vargen
for vargen in net.var_generators:
ar = np.where(A.col == vargen.index_P[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],vargen.index_P[t])
self.assertEqual(b[A.row[ar[0]]],vargen.P[t])
ar = np.where(A.col == vargen.index_Q[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],vargen.index_Q[t])
self.assertEqual(b[A.row[ar[0]]],vargen.Q[t])
# Batteries
for bat in net.batteries:
ar = np.where(A.col == bat.index_Pc[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],bat.index_Pc[t])
self.assertEqual(b[A.row[ar[0]]],max(bat.P[t],0))
ar = np.where(A.col == bat.index_Pd[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],bat.index_Pd[t])
self.assertEqual(b[A.row[ar[0]]],max(-bat.P[t],0))
ar = np.where(A.col == bat.index_E[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],bat.index_E[t])
self.assertEqual(b[A.row[ar[0]]],bat.E[t])
# Load
for load in net.loads:
self.assertTrue(load.has_flags('variable','active power'))
self.assertTrue(load.has_flags('fixed','active power'))
ar = np.where(A.col == load.index_P[t])[0]
self.assertEqual(ar.size,1)
self.assertEqual(A.col[ar[0]],load.index_P[t])
self.assertEqual(b[A.row[ar[0]]],load.P[t])
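# The per-variable checks above all repeat one pattern: find the single COO
# entry of A whose column matches a variable index, then compare the matching
# entry of b against the fixed value. A small standalone helper sketching that
# lookup is shown below; fixed_value is a hypothetical name for illustration,
# not part of the pf API.

```python
import numpy as np
from scipy.sparse import coo_matrix

def fixed_value(A, b, j):
    """Return the value that variable j is fixed to by A x = b,
    assuming each fixed variable appears in exactly one row with
    coefficient 1 (as the 'variable fixing' constraint builds it)."""
    k = np.where(A.col == j)[0]
    assert k.size == 1 and A.data[k[0]] == 1.
    return b[A.row[k[0]]]

# Toy example: fix x0 = 3.0 and x2 = -1.5 out of three variables.
A = coo_matrix(([1., 1.], ([0, 1], [0, 2])), shape=(2, 3))
b = np.array([3.0, -1.5])
```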
def test_constr_FIX_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
net.clear_outages()
gen = net.get_generator(0)
branch = net.get_branch(0)
gen.outage = True
branch.outage = True
self.assertTrue(gen.is_on_outage())
self.assertTrue(branch.is_on_outage())
gen.P = np.random.rand(self.T)
gen.Q = np.random.rand(self.T)
branch.ratio = np.random.randn(self.T)
branch.phase = np.random.randn(self.T)
net.set_flags('generator',
['variable','fixed'],
'any',
['active power', 'reactive power'])
net.set_flags('branch',
['variable','fixed'],
'any',
['tap ratio', 'phase shift'])
self.assertEqual(net.num_vars,
self.T*(2*net.num_generators + 2*net.num_branches))
self.assertEqual(net.num_vars, net.num_fixed)
constr = pf.Constraint('variable fixing', net)
constr.analyze()
A = constr.A
b = constr.b
for t in range(self.T):
# gen P
k = np.where(A.col == gen.index_P[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
i = A.row[k]
self.assertEqual(A.data[k], 1.)
self.assertEqual(b[i], gen.P[t])
# gen Q
k = np.where(A.col == gen.index_Q[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
i = A.row[k]
self.assertEqual(A.data[k], 1.)
self.assertEqual(b[i], gen.Q[t])
# branch ratio
k = np.where(A.col == branch.index_ratio[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
i = A.row[k]
self.assertEqual(A.data[k], 1.)
self.assertEqual(b[i], branch.ratio[t])
# branch phase
k = np.where(A.col == branch.index_phase[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
i = A.row[k]
self.assertEqual(A.data[k], 1.)
self.assertEqual(b[i], branch.phase[t])
# Disconnect
net.clear_outages()
net.clear_flags()
self.assertEqual(net.num_vars, 0)
for bus in net.buses:
if bus.degree == 1:
self.assertEqual(len(bus.branches), 1)
bus.branches[0].outage = True
self.assertTrue(bus.branches[0].is_on_outage())
net.set_flags_of_component(bus,
['variable', 'fixed'],
['voltage magnitude', 'voltage angle'])
self.assertEqual(net.num_vars, 2*self.T)
self.assertEqual(net.num_vars, net.num_fixed)
self.assertTrue(bus.has_flags('variable', ['voltage magnitude',
'voltage angle']))
self.assertTrue(bus.has_flags('fixed', ['voltage magnitude',
'voltage angle']))
constr = pf.Constraint('variable fixing', net)
constr.analyze()
A = constr.A
b = constr.b
self.assertEqual(A.shape[0], 2*self.T)
for t in range(self.T):
# bus v mag
k = np.where(A.col == bus.index_v_mag[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
self.assertEqual(A.data[k], 1.)
self.assertEqual(b[A.row[k]], bus.v_mag[t])
# bus v ang
k = np.where(A.col == bus.index_v_ang[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
self.assertEqual(A.data[k], 1.)
self.assertEqual(b[A.row[k]], bus.v_ang[t])
break
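# The 'variable bounds' constraint exercised next represents box constraints
# l <= x <= u in the general form l <= G x <= u with G equal to the identity
# (one unit entry per variable), which is what the G.row/G.col/G.data checks
# below verify. A standalone sketch of that representation; the helper name
# is illustrative, not part of the pf API.

```python
import numpy as np
from scipy.sparse import eye

def bound_constraint(l, u):
    """Express box constraints l <= x <= u as l <= G x <= u with G = I."""
    n = l.size
    G = eye(n, format='coo')   # diagonal of ones, nnz == n
    return G, l, u

l = np.array([0., -np.pi])
u = np.array([2., np.pi])
G, l_, u_ = bound_constraint(l, u)
x = np.array([1., 0.])
inside = np.all((l_ <= G @ x) & (G @ x <= u_))
```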
def test_constr_BOUND(self):
# Single period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case)
self.assertEqual(net.num_periods,1)
# add vargens
net.add_var_generators_from_parameters(net.get_load_buses(),80.,50.,30.,5,0.05)
for vargen in net.var_generators:
vargen.P = vargen.index*1.5
vargen.Q = vargen.index*2.5
vargen.P_ava = vargen.index*3.
vargen.P_max = 100.
vargen.P_min = 0.
vargen.Q_max = 50.
vargen.Q_min = -50.
self.assertGreater(net.num_var_generators,0)
self.assertEqual(net.num_bounded,0)
self.assertEqual(net.num_vars,0)
self.assertEqual(net.num_fixed,0)
# loads
for load in net.loads:
load.P_min = -2.4*(load.index+1)
load.P_max = 3.3*(load.index+1)
load.Q_min = 1.2*(load.index+2.)
load.Q_max = 5.8*(load.index+3.)
# Vars
net.set_flags('bus',
'variable',
'regulated by generator',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'regulator',
['active power','reactive power'])
net.set_flags('load',
'variable',
'adjustable active power',
['active power','reactive power'])
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
net.set_flags('variable generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
net.set_flags('vsc converter',
'variable',
'any',
['dc power', 'active power', 'reactive power'])
net.set_flags('facts',
'variable',
'any',
['series voltage magnitude','series voltage angle',
'active power', 'reactive power'])
net.set_flags('dc bus',
'variable',
'any',
'voltage')
net.set_flags('csc converter',
'variable',
'any',
'all')
num_vars_saved = net.num_vars
self.assertGreater(net.num_vars,0)
self.assertEqual(net.num_fixed,0)
self.assertEqual(net.num_vars,
(net.get_num_buses_reg_by_gen()*2 +
net.get_num_reg_gens()*2 +
2*net.get_num_P_adjust_loads() +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
net.num_var_generators*2+
3*net.num_batteries+
4*net.num_vsc_converters+
9*net.num_facts +
net.num_dc_buses +
6*net.num_csc_converters))
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
constr = pf.Constraint('variable bounds',net)
self.assertEqual(constr.name,'variable bounds')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
constr.analyze()
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
constr.eval(x0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# After
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,net.num_vars))
self.assertEqual(A.nnz,0)
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(net.num_vars,net.num_vars))
self.assertEqual(G.nnz,net.num_vars)
self.assertTrue(np.all(G.row == np.arange(net.num_vars)))
self.assertTrue(np.all(G.col == np.arange(net.num_vars)))
self.assertTrue(np.all(G.data == np.ones(net.num_vars)))
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(net.num_vars,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(net.num_vars,))
E = G-eye(net.num_vars)
self.assertGreater(G.nnz,0)
self.assertGreater(norm(G.data,np.inf),0.5)
self.assertEqual(E.nnz,0)
self.assertTrue(not np.any(np.isinf(l)))
self.assertTrue(not np.any(np.isnan(l)))
self.assertTrue(not np.any(np.isinf(u)))
self.assertTrue(not np.any(np.isnan(u)))
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
# Bounds
for bus in net.buses:
if bus.is_regulated_by_gen():
self.assertTrue(bus.has_flags('variable',
['voltage magnitude',
'voltage angle']))
self.assertEqual(u[bus.index_v_mag],pf.BUS_INF_V_MAG)
self.assertEqual(u[bus.index_v_ang],pf.BUS_INF_V_ANG)
self.assertEqual(l[bus.index_v_mag],0.)
self.assertEqual(l[bus.index_v_ang],-pf.BUS_INF_V_ANG)
else:
self.assertFalse(bus.has_flags('variable',
['voltage magnitude',
'voltage angle']))
for branch in net.branches:
if branch.is_tap_changer():
self.assertTrue(branch.has_flags('variable','tap ratio'))
self.assertEqual(u[branch.index_ratio],pf.BRANCH_INF_RATIO)
self.assertEqual(l[branch.index_ratio],0.)
else:
self.assertFalse(branch.has_flags('variable','tap ratio'))
if branch.is_phase_shifter():
self.assertTrue(branch.has_flags('variable','phase shift'))
self.assertLess(np.abs(u[branch.index_phase]-np.pi*2.),1e-10)
self.assertLess(np.abs(l[branch.index_phase]+np.pi*2.),1e-10)
else:
self.assertFalse(branch.has_flags('variable','phase shift'))
for gen in net.generators:
if gen.is_regulator():
self.assertTrue(gen.has_flags('variable',['active power','reactive power']))
self.assertEqual(u[gen.index_P],pf.GEN_INF_P)
self.assertEqual(u[gen.index_Q],pf.GEN_INF_Q)
self.assertEqual(l[gen.index_P],-pf.GEN_INF_P)
self.assertEqual(l[gen.index_Q],-pf.GEN_INF_Q)
else:
self.assertFalse(gen.has_flags('variable',
['active power','reactive power']))
for load in net.loads:
self.assertTrue(load.has_flags('variable','active power'))
self.assertTrue(load.has_flags('variable','reactive power'))
self.assertTrue(load.has_flags('variable',['active power','reactive power']))
self.assertEqual(u[load.index_P],pf.LOAD_INF_P)
self.assertEqual(l[load.index_P],-pf.LOAD_INF_P)
self.assertEqual(u[load.index_Q],pf.LOAD_INF_Q)
self.assertEqual(l[load.index_Q],-pf.LOAD_INF_Q)
for vargen in net.var_generators:
self.assertTrue(vargen.has_flags('variable',
['active power','reactive power']))
self.assertEqual(u[vargen.index_P],pf.VARGEN_INF_P)
self.assertEqual(u[vargen.index_Q],pf.VARGEN_INF_Q)
self.assertEqual(l[vargen.index_P],-pf.VARGEN_INF_P)
self.assertEqual(l[vargen.index_Q],-pf.VARGEN_INF_Q)
for shunt in net.shunts:
if shunt.is_switched_v():
self.assertTrue(shunt.has_flags('variable','susceptance'))
self.assertEqual(u[shunt.index_b],pf.SHUNT_INF_SUSC)
self.assertEqual(l[shunt.index_b],-pf.SHUNT_INF_SUSC)
else:
self.assertFalse(shunt.has_flags('variable','susceptance'))
for bat in net.batteries:
self.assertTrue(bat.has_flags('variable','charging power'))
self.assertTrue(bat.has_flags('variable','energy level'))
self.assertEqual(u[bat.index_Pc],pf.BAT_INF_P)
self.assertEqual(l[bat.index_Pc],0.)
self.assertEqual(u[bat.index_Pd],pf.BAT_INF_P)
self.assertEqual(l[bat.index_Pd],0.)
self.assertEqual(u[bat.index_E],pf.BAT_INF_E)
self.assertEqual(l[bat.index_E],0.)
for vsc_conv in net.vsc_converters:
self.assertTrue(vsc_conv.has_flags('variable','active power'))
self.assertTrue(vsc_conv.has_flags('variable','reactive power'))
self.assertTrue(vsc_conv.has_flags('variable','dc power'))
self.assertEqual(u[vsc_conv.index_P],pf.CONVVSC_INF_P)
self.assertEqual(l[vsc_conv.index_P],-pf.CONVVSC_INF_P)
self.assertEqual(u[vsc_conv.index_Q],pf.CONVVSC_INF_Q)
self.assertEqual(l[vsc_conv.index_Q],-pf.CONVVSC_INF_Q)
self.assertEqual(u[vsc_conv.index_P_dc],pf.CONVVSC_INF_PDC)
self.assertEqual(l[vsc_conv.index_P_dc],-pf.CONVVSC_INF_PDC)
self.assertEqual(u[vsc_conv.index_i_dc],pf.CONVVSC_INF_PDC)
self.assertEqual(l[vsc_conv.index_i_dc],-pf.CONVVSC_INF_PDC)
for facts in net.facts:
self.assertTrue(facts.has_flags('variable','series voltage magnitude'))
self.assertTrue(facts.has_flags('variable','series voltage angle'))
self.assertTrue(facts.has_flags('variable','active power'))
self.assertTrue(facts.has_flags('variable','reactive power'))
self.assertEqual(u[facts.index_v_mag_s],pf.FACTS_INF_VMAG_S)
self.assertEqual(l[facts.index_v_mag_s],0.)
self.assertEqual(u[facts.index_v_ang_s],pf.FACTS_INF_VANG_S)
self.assertEqual(l[facts.index_v_ang_s],-pf.FACTS_INF_VANG_S)
self.assertEqual(u[facts.index_P_k],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_k],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_P_m],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_m],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_P_dc],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_dc],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_Q_k],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_k],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_m],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_m],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_s],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_s],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_sh],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_sh],-pf.FACTS_INF_Q)
for bus in net.dc_buses:
self.assertEqual(u[bus.index_v], pf.BUSDC_INF_V)
self.assertEqual(l[bus.index_v], -pf.BUSDC_INF_V)
for csc in net.csc_converters:
self.assertEqual(u[csc.index_P], pf.CONVCSC_INF_P)
self.assertEqual(l[csc.index_P], -pf.CONVCSC_INF_P)
self.assertEqual(u[csc.index_Q], pf.CONVCSC_INF_Q)
self.assertEqual(l[csc.index_Q], -pf.CONVCSC_INF_Q)
self.assertEqual(u[csc.index_P_dc], pf.CONVCSC_INF_PDC)
self.assertEqual(l[csc.index_P_dc], -pf.CONVCSC_INF_PDC)
self.assertEqual(u[csc.index_i_dc], pf.CONVCSC_INF_PDC)
self.assertEqual(l[csc.index_i_dc], -pf.CONVCSC_INF_PDC)
self.assertEqual(u[csc.index_angle], pf.CONVCSC_INF_ANGLE)
self.assertEqual(l[csc.index_angle], -pf.CONVCSC_INF_ANGLE)
self.assertEqual(u[csc.index_ratio], pf.CONVCSC_INF_RATIO)
self.assertEqual(l[csc.index_ratio], -pf.CONVCSC_INF_RATIO)
# Add bounded flags
net.set_flags('bus',
'bounded',
'regulated by generator',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'bounded',
'regulator',
['active power','reactive power'])
net.set_flags('load',
'bounded',
'adjustable active power',
['active power','reactive power'])
net.set_flags('branch',
'bounded',
'tap changer',
'tap ratio')
net.set_flags('branch',
'bounded',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'bounded',
'switching - v',
'susceptance')
net.set_flags('variable generator',
'bounded',
'any',
['active power','reactive power'])
net.set_flags('battery',
'bounded',
'any',
['charging power','energy level'])
net.set_flags('vsc converter',
'bounded',
'any',
['dc power', 'active power','reactive power'])
net.set_flags('facts',
'bounded',
'any',
['series voltage magnitude','series voltage angle',
'active power','reactive power'])
net.set_flags('dc bus',
'bounded',
'any',
'voltage')
net.set_flags('csc converter',
'bounded',
'any',
'all')
self.assertEqual(net.num_vars,num_vars_saved)
self.assertEqual(net.num_fixed,0)
self.assertEqual(net.num_bounded,net.num_vars)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
constr = pf.Constraint('variable bounds',net)
self.assertEqual(constr.name,'variable bounds')
constr.analyze()
G = constr.G
l = constr.l
u = constr.u
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(net.num_vars,net.num_vars))
self.assertEqual(G.nnz,net.num_vars)
self.assertTrue(np.all(G.row == np.arange(net.num_vars)))
self.assertTrue(np.all(G.col == np.arange(net.num_vars)))
self.assertTrue(np.all(G.data == np.ones(net.num_vars)))
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(net.num_vars,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(net.num_vars,))
E = G-eye(net.num_vars)
self.assertGreater(G.nnz,0)
self.assertGreater(norm(G.data,np.inf),0.5)
self.assertEqual(E.nnz,0)
# Bounds
for bus in net.buses:
if bus.is_regulated_by_gen():
self.assertTrue(bus.has_flags('bounded',
['voltage magnitude',
'voltage angle']))
self.assertTrue(bus.has_flags('variable',
['voltage magnitude',
'voltage angle']))
self.assertEqual(u[bus.index_v_mag],bus.v_max)
self.assertEqual(u[bus.index_v_ang],pf.BUS_INF_V_ANG)
self.assertEqual(l[bus.index_v_mag],bus.v_min)
self.assertEqual(l[bus.index_v_ang],-pf.BUS_INF_V_ANG)
else:
self.assertFalse(bus.has_flags('bounded',
['voltage magnitude',
'voltage angle']))
for branch in net.branches:
if branch.is_tap_changer():
self.assertTrue(branch.has_flags('bounded','tap ratio'))
self.assertEqual(u[branch.index_ratio],branch.ratio_max)
self.assertEqual(l[branch.index_ratio],branch.ratio_min)
else:
self.assertFalse(branch.has_flags('bounded','tap ratio'))
if branch.is_phase_shifter():
self.assertTrue(branch.has_flags('bounded','phase shift'))
self.assertEqual(u[branch.index_phase],branch.phase_max)
self.assertEqual(l[branch.index_phase],branch.phase_min)
else:
self.assertFalse(branch.has_flags('bounded','phase shift'))
for gen in net.generators:
if gen.is_regulator():
self.assertTrue(gen.has_flags('bounded',['active power','reactive power']))
self.assertEqual(u[gen.index_P],gen.P_max)
self.assertEqual(u[gen.index_Q],gen.Q_max)
self.assertEqual(l[gen.index_P],gen.P_min)
self.assertEqual(l[gen.index_Q],gen.Q_min)
else:
self.assertFalse(gen.has_flags('bounded',['active power','reactive power']))
for load in net.loads:
self.assertTrue(load.has_flags('bounded',['active power','reactive power']))
self.assertEqual(u[load.index_P],load.P_max)
self.assertEqual(l[load.index_P],load.P_min)
self.assertEqual(u[load.index_Q],load.Q_max)
self.assertEqual(l[load.index_Q],load.Q_min)
for vargen in net.var_generators:
self.assertTrue(vargen.has_flags('bounded',['active power','reactive power']))
self.assertEqual(u[vargen.index_P],vargen.P_ava)
self.assertEqual(u[vargen.index_Q],vargen.Q_max)
self.assertEqual(l[vargen.index_P],vargen.P_min)
self.assertEqual(l[vargen.index_Q],vargen.Q_min)
for shunt in net.shunts:
if shunt.is_switched_v():
self.assertTrue(shunt.has_flags('bounded','susceptance'))
self.assertEqual(u[shunt.index_b],shunt.b_max)
self.assertEqual(l[shunt.index_b],shunt.b_min)
else:
self.assertFalse(shunt.has_flags('bounded','susceptance'))
for bat in net.batteries:
self.assertTrue(bat.has_flags('bounded','charging power'))
self.assertTrue(bat.has_flags('bounded','energy level'))
self.assertEqual(u[bat.index_Pc],bat.P_max)
self.assertEqual(l[bat.index_Pc],0.)
self.assertEqual(u[bat.index_Pd],-bat.P_min)
self.assertEqual(l[bat.index_Pd],0.)
self.assertEqual(u[bat.index_E],bat.E_max)
self.assertEqual(l[bat.index_E],0.)
for vsc_conv in net.vsc_converters:
self.assertTrue(vsc_conv.has_flags('bounded','active power'))
self.assertTrue(vsc_conv.has_flags('bounded','reactive power'))
self.assertTrue(vsc_conv.has_flags('bounded','dc power'))
self.assertEqual(u[vsc_conv.index_P],vsc_conv.P_max)
self.assertEqual(l[vsc_conv.index_P],vsc_conv.P_min)
self.assertEqual(u[vsc_conv.index_Q],vsc_conv.Q_max)
self.assertEqual(l[vsc_conv.index_Q],vsc_conv.Q_min)
self.assertEqual(u[vsc_conv.index_P_dc],pf.CONVVSC_INF_PDC)
self.assertEqual(l[vsc_conv.index_P_dc],-pf.CONVVSC_INF_PDC)
self.assertEqual(u[vsc_conv.index_i_dc],pf.CONVVSC_INF_PDC)
self.assertEqual(l[vsc_conv.index_i_dc],-pf.CONVVSC_INF_PDC)
for facts in net.facts:
self.assertTrue(facts.has_flags('bounded','series voltage magnitude'))
self.assertTrue(facts.has_flags('bounded','series voltage angle'))
self.assertTrue(facts.has_flags('bounded','active power'))
self.assertTrue(facts.has_flags('bounded','reactive power'))
self.assertEqual(u[facts.index_v_mag_s],facts.v_max_s)
self.assertEqual(l[facts.index_v_mag_s],0.)
self.assertEqual(u[facts.index_v_ang_s],pf.FACTS_INF_VANG_S)
self.assertEqual(l[facts.index_v_ang_s],-pf.FACTS_INF_VANG_S)
self.assertEqual(u[facts.index_P_k],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_k],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_P_m],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_m],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_P_dc],facts.P_max_dc)
self.assertEqual(l[facts.index_P_dc],-facts.P_max_dc)
self.assertEqual(u[facts.index_Q_k],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_k],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_m],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_m],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_s],facts.Q_max_s)
self.assertEqual(l[facts.index_Q_s],facts.Q_min_s)
self.assertEqual(u[facts.index_Q_sh],facts.Q_max_sh)
self.assertEqual(l[facts.index_Q_sh],facts.Q_min_sh)
for bus in net.dc_buses:
self.assertEqual(u[bus.index_v], pf.BUSDC_INF_V)
self.assertEqual(l[bus.index_v], -pf.BUSDC_INF_V)
for csc in net.csc_converters:
self.assertEqual(u[csc.index_P], pf.CONVCSC_INF_P)
self.assertEqual(l[csc.index_P], -pf.CONVCSC_INF_P)
self.assertEqual(u[csc.index_Q], pf.CONVCSC_INF_Q)
self.assertEqual(l[csc.index_Q], -pf.CONVCSC_INF_Q)
self.assertEqual(u[csc.index_P_dc], pf.CONVCSC_INF_PDC)
self.assertEqual(l[csc.index_P_dc], -pf.CONVCSC_INF_PDC)
self.assertEqual(u[csc.index_i_dc], pf.CONVCSC_INF_PDC)
self.assertEqual(l[csc.index_i_dc], -pf.CONVCSC_INF_PDC)
self.assertEqual(u[csc.index_angle], pf.CONVCSC_INF_ANGLE)
self.assertEqual(l[csc.index_angle], -pf.CONVCSC_INF_ANGLE)
self.assertEqual(u[csc.index_ratio], pf.CONVCSC_INF_RATIO)
self.assertEqual(l[csc.index_ratio], -pf.CONVCSC_INF_RATIO)
# Sensitivities
net.clear_sensitivities()
for branch in net.branches:
self.assertEqual(branch.sens_ratio_u_bound, 0.)
self.assertEqual(branch.sens_ratio_l_bound, 0.)
self.assertEqual(branch.sens_phase_u_bound, 0.)
self.assertEqual(branch.sens_phase_l_bound, 0.)
for bus in net.buses:
self.assertEqual(bus.sens_P_balance,0.)
self.assertEqual(bus.sens_Q_balance,0.)
self.assertEqual(bus.sens_v_mag_u_bound,0.)
self.assertEqual(bus.sens_v_mag_l_bound,0.)
self.assertEqual(bus.sens_v_ang_u_bound,0.)
self.assertEqual(bus.sens_v_ang_l_bound,0.)
for gen in net.generators:
self.assertEqual(gen.sens_P_u_bound,0.)
self.assertEqual(gen.sens_P_l_bound,0.)
self.assertEqual(gen.sens_Q_u_bound,0.)
self.assertEqual(gen.sens_Q_l_bound,0.)
for load in net.loads:
self.assertEqual(load.sens_P_u_bound,0.)
self.assertEqual(load.sens_P_l_bound,0.)
for shunt in net.shunts:
self.assertEqual(shunt.sens_b_u_bound, 0.)
self.assertEqual(shunt.sens_b_l_bound, 0.)
mu = np.random.randn(net.num_vars)
pi = np.random.randn(net.num_vars)
constr.store_sensitivities(None,None,mu,pi)
# Branch sens
for branch in net.branches:
if branch.is_tap_changer():
self.assertEqual(branch.sens_ratio_u_bound, mu[branch.index_ratio])
self.assertEqual(branch.sens_ratio_l_bound, pi[branch.index_ratio])
else:
self.assertEqual(branch.sens_ratio_u_bound, 0.)
self.assertEqual(branch.sens_ratio_l_bound, 0.)
if branch.is_phase_shifter():
self.assertEqual(branch.sens_phase_u_bound, mu[branch.index_phase])
self.assertEqual(branch.sens_phase_l_bound, pi[branch.index_phase])
else:
self.assertEqual(branch.sens_phase_u_bound, 0.)
self.assertEqual(branch.sens_phase_l_bound, 0.)
# Bus sens
for bus in net.buses:
self.assertEqual(bus.sens_P_balance,0.)
self.assertEqual(bus.sens_Q_balance,0.)
if bus.is_regulated_by_gen():
self.assertTrue(bus.has_flags('variable','voltage angle'))
self.assertNotEqual(bus.sens_v_ang_u_bound,0.)
self.assertNotEqual(bus.sens_v_ang_l_bound,0.)
self.assertEqual(bus.sens_v_mag_u_bound,mu[bus.index_v_mag])
self.assertEqual(bus.sens_v_mag_l_bound,pi[bus.index_v_mag])
self.assertEqual(bus.sens_v_ang_u_bound,mu[bus.index_v_ang])
self.assertEqual(bus.sens_v_ang_l_bound,pi[bus.index_v_ang])
else:
self.assertEqual(bus.sens_v_mag_u_bound,0.)
self.assertEqual(bus.sens_v_mag_l_bound,0.)
self.assertEqual(bus.sens_v_ang_u_bound,0.)
self.assertEqual(bus.sens_v_ang_l_bound,0.)
# Gen sens
for gen in net.generators:
if gen.is_regulator():
self.assertTrue(gen.has_flags('variable','active power'))
self.assertNotEqual(gen.sens_P_u_bound,0.)
self.assertNotEqual(gen.sens_P_l_bound,0.)
self.assertEqual(gen.sens_P_u_bound, mu[gen.index_P])
self.assertEqual(gen.sens_P_l_bound, pi[gen.index_P])
self.assertEqual(gen.sens_Q_u_bound, mu[gen.index_Q])
self.assertEqual(gen.sens_Q_l_bound, pi[gen.index_Q])
else:
self.assertEqual(gen.sens_P_u_bound, 0.)
self.assertEqual(gen.sens_P_l_bound, 0.)
self.assertEqual(gen.sens_Q_u_bound, 0.)
self.assertEqual(gen.sens_Q_l_bound, 0.)
# Load sens
for load in net.loads:
self.assertTrue(load.has_flags('variable','active power'))
self.assertNotEqual(load.sens_P_u_bound,0.)
self.assertNotEqual(load.sens_P_l_bound,0.)
self.assertEqual(load.sens_P_u_bound,mu[load.index_P])
self.assertEqual(load.sens_P_l_bound,pi[load.index_P])
# Shunts
for shunt in net.shunts:
if shunt.is_switched_v():
self.assertEqual(shunt.sens_b_u_bound,mu[shunt.index_b])
self.assertEqual(shunt.sens_b_l_bound,pi[shunt.index_b])
else:
self.assertEqual(shunt.sens_b_u_bound, 0.)
self.assertEqual(shunt.sens_b_l_bound, 0.)
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Add var generators
net.add_var_generators_from_parameters(net.get_load_buses(),80.,50.,30.,5,0.05)
for vargen in net.var_generators:
vargen.P = np.random.rand(self.T)
vargen.Q = np.random.rand(self.T)
vargen.P_ava = vargen.P*3.4
vargen.P_max = 100.
vargen.P_min = 0.
vargen.Q_max = 50.
vargen.Q_min = -50.
self.assertEqual(vargen.num_periods,self.T)
for t in range(self.T):
self.assertEqual(vargen.P_ava[t],vargen.P[t]*3.4)
self.assertGreater(net.num_var_generators,0)
self.assertEqual(net.num_bounded,0)
self.assertEqual(net.num_vars,0)
self.assertEqual(net.num_fixed,0)
# Add batteries
gen_buses = net.get_generator_buses()
net.add_batteries_from_parameters(gen_buses,20.,40.,0.8,0.9)
# Loads
for load in net.loads:
load.P_min = -2.4*(load.index+1)*np.array(range(net.num_periods))
load.P_max = 3.3*(load.index+1)*np.array(range(net.num_periods))
load.Q = 3.5*load.index*np.array(range(net.num_periods))
load.Q_min = 1.2*(load.index+1)*np.array(range(net.num_periods))
load.Q_max = 7.5*(load.index+1)*np.array(range(net.num_periods))
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('load',
'variable',
'any',
['active power','reactive power'])
net.set_flags('branch',
'variable',
'tap changer',
['tap ratio'])
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
['susceptance'])
net.set_flags('variable generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
net.set_flags('vsc converter',
'variable',
'any',
['dc power', 'active power', 'reactive power'])
net.set_flags('facts',
'variable',
'any',
['series voltage magnitude','series voltage angle',
'active power', 'reactive power'])
net.set_flags('dc bus',
'variable',
'any',
'voltage')
net.set_flags('csc converter',
'variable',
'any',
'all')
self.assertGreater(net.num_vars,0)
self.assertEqual(net.num_fixed,0)
self.assertEqual(net.num_vars,
(net.num_buses*2 +
net.num_generators*2 +
2*net.num_loads +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
net.num_var_generators*2 +
3*net.num_batteries +
4*net.num_vsc_converters +
9*net.num_facts +
net.num_dc_buses +
6*net.num_csc_converters)*self.T)
x0 = net.get_var_values()
constr = pf.Constraint('variable bounds',net)
self.assertEqual(constr.name,'variable bounds')
constr.analyze()
constr.eval(x0)
G = constr.G
l = constr.l
u = constr.u
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(net.num_vars,net.num_vars))
self.assertEqual(G.nnz,net.num_vars)
self.assertTrue(np.all(G.row == np.array(range(net.num_vars))))
self.assertTrue(np.all(G.col == np.array(range(net.num_vars))))
self.assertTrue(np.all(G.data == np.ones(net.num_vars)))
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(net.num_vars,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(net.num_vars,))
for t in range(self.T):
for bus in net.buses:
self.assertEqual(u[bus.index_v_mag[t]],pf.BUS_INF_V_MAG)
self.assertEqual(u[bus.index_v_ang[t]],pf.BUS_INF_V_ANG)
self.assertEqual(l[bus.index_v_mag[t]],0.)
self.assertEqual(l[bus.index_v_ang[t]],-pf.BUS_INF_V_ANG)
for gen in net.generators:
self.assertEqual(u[gen.index_P[t]],pf.GEN_INF_P)
self.assertEqual(u[gen.index_Q[t]],pf.GEN_INF_Q)
self.assertEqual(l[gen.index_P[t]],-pf.GEN_INF_P)
self.assertEqual(l[gen.index_Q[t]],-pf.GEN_INF_Q)
for branch in net.branches:
if branch.is_tap_changer():
self.assertEqual(u[branch.index_ratio[t]],pf.BRANCH_INF_RATIO)
self.assertEqual(l[branch.index_ratio[t]],0.)
if branch.is_phase_shifter():
self.assertLess(np.abs(u[branch.index_phase[t]]-np.pi*2.),1e-10)
self.assertLess(np.abs(l[branch.index_phase[t]]+np.pi*2.),1e-10)
for vargen in net.var_generators:
self.assertEqual(u[vargen.index_P[t]],pf.VARGEN_INF_P)
self.assertEqual(u[vargen.index_Q[t]],pf.VARGEN_INF_Q)
self.assertEqual(l[vargen.index_P[t]],-pf.VARGEN_INF_P)
self.assertEqual(l[vargen.index_Q[t]],-pf.VARGEN_INF_Q)
for load in net.loads:
self.assertEqual(u[load.index_P[t]],pf.LOAD_INF_P)
self.assertEqual(l[load.index_P[t]],-pf.LOAD_INF_P)
self.assertEqual(u[load.index_Q[t]],pf.LOAD_INF_Q)
self.assertEqual(l[load.index_Q[t]],-pf.LOAD_INF_Q)
for shunt in net.shunts:
if shunt.is_switched_v():
self.assertEqual(u[shunt.index_b[t]],pf.SHUNT_INF_SUSC)
self.assertEqual(l[shunt.index_b[t]],-pf.SHUNT_INF_SUSC)
for vsc_conv in net.vsc_converters:
self.assertTrue(vsc_conv.has_flags('variable','active power'))
self.assertTrue(vsc_conv.has_flags('variable','reactive power'))
self.assertTrue(vsc_conv.has_flags('variable','dc power'))
self.assertEqual(u[vsc_conv.index_P[t]],pf.CONVVSC_INF_P)
self.assertEqual(l[vsc_conv.index_P[t]],-pf.CONVVSC_INF_P)
self.assertEqual(u[vsc_conv.index_Q[t]],pf.CONVVSC_INF_Q)
self.assertEqual(l[vsc_conv.index_Q[t]],-pf.CONVVSC_INF_Q)
self.assertEqual(u[vsc_conv.index_P_dc[t]],pf.CONVVSC_INF_PDC)
self.assertEqual(l[vsc_conv.index_P_dc[t]],-pf.CONVVSC_INF_PDC)
self.assertEqual(u[vsc_conv.index_i_dc[t]],pf.CONVVSC_INF_PDC)
self.assertEqual(l[vsc_conv.index_i_dc[t]],-pf.CONVVSC_INF_PDC)
for facts in net.facts:
self.assertTrue(facts.has_flags('variable','series voltage magnitude'))
self.assertTrue(facts.has_flags('variable','series voltage angle'))
self.assertTrue(facts.has_flags('variable','active power'))
self.assertTrue(facts.has_flags('variable','reactive power'))
self.assertEqual(u[facts.index_v_mag_s[t]],pf.FACTS_INF_VMAG_S)
self.assertEqual(l[facts.index_v_mag_s[t]],0.)
self.assertEqual(u[facts.index_v_ang_s[t]],pf.FACTS_INF_VANG_S)
self.assertEqual(l[facts.index_v_ang_s[t]],-pf.FACTS_INF_VANG_S)
self.assertEqual(u[facts.index_P_k[t]],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_k[t]],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_P_m[t]],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_m[t]],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_P_dc[t]],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_dc[t]],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_Q_k[t]],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_k[t]],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_m[t]],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_m[t]],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_s[t]],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_s[t]],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_sh[t]],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_sh[t]],-pf.FACTS_INF_Q)
for bus in net.dc_buses:
self.assertEqual(u[bus.index_v[t]], pf.BUSDC_INF_V)
self.assertEqual(l[bus.index_v[t]], -pf.BUSDC_INF_V)
for csc in net.csc_converters:
self.assertEqual(u[csc.index_P[t]], pf.CONVCSC_INF_P)
self.assertEqual(l[csc.index_P[t]], -pf.CONVCSC_INF_P)
self.assertEqual(u[csc.index_Q[t]], pf.CONVCSC_INF_Q)
self.assertEqual(l[csc.index_Q[t]], -pf.CONVCSC_INF_Q)
self.assertEqual(u[csc.index_P_dc[t]], pf.CONVCSC_INF_PDC)
self.assertEqual(l[csc.index_P_dc[t]], -pf.CONVCSC_INF_PDC)
self.assertEqual(u[csc.index_i_dc[t]], pf.CONVCSC_INF_PDC)
self.assertEqual(l[csc.index_i_dc[t]], -pf.CONVCSC_INF_PDC)
self.assertEqual(u[csc.index_angle[t]], pf.CONVCSC_INF_ANGLE)
self.assertEqual(l[csc.index_angle[t]], -pf.CONVCSC_INF_ANGLE)
self.assertEqual(u[csc.index_ratio[t]], pf.CONVCSC_INF_RATIO)
self.assertEqual(l[csc.index_ratio[t]], -pf.CONVCSC_INF_RATIO)
# Row info
for t in range(self.T):
for bus in net.buses:
i = bus.index_v_mag[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:bus:%d:voltage magnitude:%d' %(bus.index,t))
i = bus.index_v_ang[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:bus:%d:voltage angle:%d' %(bus.index,t))
for gen in net.generators:
i = gen.index_P[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:generator:%d:active power:%d' %(gen.index,t))
i = gen.index_Q[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:generator:%d:reactive power:%d' %(gen.index,t))
for load in net.loads:
i = load.index_P[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:load:%d:active power:%d' %(load.index,t))
i = load.index_Q[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:load:%d:reactive power:%d' %(load.index,t))
for vargen in net.var_generators:
i = vargen.index_P[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:variable generator:%d:active power:%d' %(vargen.index,t))
i = vargen.index_Q[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:variable generator:%d:reactive power:%d' %(vargen.index,t))
for branch in net.branches:
if branch.is_tap_changer():
i = branch.index_ratio[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:branch:%d:tap ratio:%d' %(branch.index,t))
if branch.is_phase_shifter():
i = branch.index_phase[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:branch:%d:phase shift:%d' %(branch.index,t))
for shunt in net.shunts:
if shunt.is_switched_v():
i = shunt.index_b[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:shunt:%d:susceptance:%d' %(shunt.index,t))
for bat in net.batteries:
i = bat.index_Pc[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:battery:%d:charging power:%d' %(bat.index,t))
i = bat.index_Pd[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:battery:%d:discharging power:%d' %(bat.index,t))
i = bat.index_E[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:battery:%d:energy level:%d' %(bat.index,t))
for vsc_conv in net.vsc_converters:
i = vsc_conv.index_P[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:vsc converter:%d:active power:%d' %(vsc_conv.index,t))
i = vsc_conv.index_Q[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:vsc converter:%d:reactive power:%d' %(vsc_conv.index,t))
i = vsc_conv.index_P_dc[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:vsc converter:%d:dc power:%d' %(vsc_conv.index,t))
i = vsc_conv.index_i_dc[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:vsc converter:%d:dc current:%d' %(vsc_conv.index,t))
for facts in net.facts:
i = facts.index_v_mag_s[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:series voltage magnitude:%d' %(facts.index,t))
i = facts.index_v_ang_s[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:series voltage angle:%d' %(facts.index,t))
i = facts.index_P_k[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:active power k:%d' %(facts.index,t))
i = facts.index_P_m[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:active power m:%d' %(facts.index,t))
i = facts.index_P_dc[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:dc power:%d' %(facts.index,t))
i = facts.index_Q_k[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:reactive power k:%d' %(facts.index,t))
i = facts.index_Q_m[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:reactive power m:%d' %(facts.index,t))
i = facts.index_Q_s[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:series reactive power:%d' %(facts.index,t))
i = facts.index_Q_sh[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:facts:%d:shunt reactive power:%d' %(facts.index,t))
for bus in net.dc_buses:
i = bus.index_v[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:dc bus:%d:voltage:%d' %(bus.index,t))
for csc in net.csc_converters:
i = csc.index_P[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:csc converter:%d:active power:%d' %(csc.index,t))
i = csc.index_Q[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:csc converter:%d:reactive power:%d' %(csc.index,t))
i = csc.index_P_dc[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:csc converter:%d:dc power:%d' %(csc.index,t))
i = csc.index_i_dc[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:csc converter:%d:dc current:%d' %(csc.index,t))
i = csc.index_angle[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:csc converter:%d:angle:%d' %(csc.index,t))
i = csc.index_ratio[t]
s = constr.get_G_row_info_string(i)
self.assertEqual(constr.get_A_row_info_string(i),"")
self.assertEqual(constr.get_J_row_info_string(i),"")
self.assertEqual(s,'variable bounds:csc converter:%d:tap ratio:%d' %(csc.index,t))
# Bounded
net.set_flags('bus',
'bounded',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'bounded',
'any',
['active power','reactive power'])
net.set_flags('load',
'bounded',
'any',
['active power','reactive power'])
net.set_flags('branch',
'bounded',
'tap changer',
['tap ratio'])
net.set_flags('branch',
'bounded',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'bounded',
'switching - v',
['susceptance'])
net.set_flags('variable generator',
'bounded',
'any',
['active power','reactive power'])
net.set_flags('battery',
'bounded',
'any',
['charging power','energy level'])
net.set_flags('vsc converter',
'bounded',
'any',
['dc power', 'active power', 'reactive power'])
net.set_flags('facts',
'bounded',
'any',
['series voltage magnitude','series voltage angle',
'active power', 'reactive power'])
net.set_flags('dc bus',
'bounded',
'any',
'voltage')
net.set_flags('csc converter',
'bounded',
'any',
'all')
self.assertGreater(net.num_vars,0)
self.assertEqual(net.num_bounded,net.num_vars)
x0 = net.get_var_values()
constr = pf.Constraint('variable bounds',net)
self.assertEqual(constr.name,'variable bounds')
constr.analyze()
constr.eval(x0)
G = constr.G
l = constr.l
u = constr.u
for t in range(self.T):
for bus in net.buses:
self.assertEqual(u[bus.index_v_mag[t]],bus.v_max)
self.assertEqual(u[bus.index_v_ang[t]],pf.BUS_INF_V_ANG)
self.assertEqual(l[bus.index_v_mag[t]],bus.v_min)
self.assertEqual(l[bus.index_v_ang[t]],-pf.BUS_INF_V_ANG)
for gen in net.generators:
self.assertEqual(u[gen.index_P[t]],gen.P_max)
self.assertEqual(u[gen.index_Q[t]],gen.Q_max)
self.assertEqual(l[gen.index_P[t]],gen.P_min)
self.assertEqual(l[gen.index_Q[t]],gen.Q_min)
for branch in net.branches:
if branch.is_tap_changer():
self.assertEqual(u[branch.index_ratio[t]],branch.ratio_max)
self.assertEqual(l[branch.index_ratio[t]],branch.ratio_min)
if branch.is_phase_shifter():
self.assertEqual(u[branch.index_phase[t]],branch.phase_max)
self.assertEqual(l[branch.index_phase[t]],branch.phase_min)
for vargen in net.var_generators:
self.assertEqual(u[vargen.index_P[t]],vargen.P_ava[t])
self.assertEqual(u[vargen.index_Q[t]],vargen.Q_max)
self.assertEqual(l[vargen.index_P[t]],vargen.P_min)
self.assertEqual(l[vargen.index_Q[t]],vargen.Q_min)
for load in net.loads:
self.assertEqual(u[load.index_P[t]],load.P_max[t])
self.assertEqual(l[load.index_P[t]],load.P_min[t])
self.assertEqual(u[load.index_P[t]],3.3*(load.index+1)*t)
self.assertEqual(l[load.index_P[t]],-2.4*(load.index+1)*t)
self.assertEqual(u[load.index_Q[t]],7.5*(load.index+1)*t)
self.assertEqual(l[load.index_Q[t]],1.2*(load.index+1)*t)
for shunt in net.shunts:
if shunt.is_switched_v():
self.assertEqual(u[shunt.index_b[t]],shunt.b_max)
self.assertEqual(l[shunt.index_b[t]],shunt.b_min)
for vsc_conv in net.vsc_converters:
self.assertTrue(vsc_conv.has_flags('bounded','active power'))
self.assertTrue(vsc_conv.has_flags('bounded','reactive power'))
self.assertTrue(vsc_conv.has_flags('bounded','dc power'))
self.assertEqual(u[vsc_conv.index_P[t]],vsc_conv.P_max)
self.assertEqual(l[vsc_conv.index_P[t]],vsc_conv.P_min)
self.assertEqual(u[vsc_conv.index_Q[t]],vsc_conv.Q_max)
self.assertEqual(l[vsc_conv.index_Q[t]],vsc_conv.Q_min)
self.assertEqual(u[vsc_conv.index_P_dc[t]],pf.CONVVSC_INF_PDC)
self.assertEqual(l[vsc_conv.index_P_dc[t]],-pf.CONVVSC_INF_PDC)
self.assertEqual(u[vsc_conv.index_i_dc[t]],pf.CONVVSC_INF_PDC)
self.assertEqual(l[vsc_conv.index_i_dc[t]],-pf.CONVVSC_INF_PDC)
for facts in net.facts:
self.assertTrue(facts.has_flags('bounded','series voltage magnitude'))
self.assertTrue(facts.has_flags('bounded','series voltage angle'))
self.assertTrue(facts.has_flags('bounded','active power'))
self.assertTrue(facts.has_flags('bounded','reactive power'))
self.assertEqual(u[facts.index_v_mag_s[t]],facts.v_max_s)
self.assertEqual(l[facts.index_v_mag_s[t]],0.)
self.assertEqual(u[facts.index_v_ang_s[t]],pf.FACTS_INF_VANG_S)
self.assertEqual(l[facts.index_v_ang_s[t]],-pf.FACTS_INF_VANG_S)
self.assertEqual(u[facts.index_P_k[t]],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_k[t]],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_P_m[t]],pf.FACTS_INF_P)
self.assertEqual(l[facts.index_P_m[t]],-pf.FACTS_INF_P)
self.assertEqual(u[facts.index_P_dc[t]],facts.P_max_dc)
self.assertEqual(l[facts.index_P_dc[t]],-facts.P_max_dc)
self.assertEqual(u[facts.index_Q_k[t]],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_k[t]],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_m[t]],pf.FACTS_INF_Q)
self.assertEqual(l[facts.index_Q_m[t]],-pf.FACTS_INF_Q)
self.assertEqual(u[facts.index_Q_s[t]],facts.Q_max_s)
self.assertEqual(l[facts.index_Q_s[t]],facts.Q_min_s)
self.assertEqual(u[facts.index_Q_sh[t]],facts.Q_max_sh)
self.assertEqual(l[facts.index_Q_sh[t]],facts.Q_min_sh)
for bus in net.dc_buses:
self.assertEqual(u[bus.index_v[t]], pf.BUSDC_INF_V)
self.assertEqual(l[bus.index_v[t]], -pf.BUSDC_INF_V)
for csc in net.csc_converters:
self.assertEqual(u[csc.index_P[t]], pf.CONVCSC_INF_P)
self.assertEqual(l[csc.index_P[t]], -pf.CONVCSC_INF_P)
self.assertEqual(u[csc.index_Q[t]], pf.CONVCSC_INF_Q)
self.assertEqual(l[csc.index_Q[t]], -pf.CONVCSC_INF_Q)
self.assertEqual(u[csc.index_P_dc[t]], pf.CONVCSC_INF_PDC)
self.assertEqual(l[csc.index_P_dc[t]], -pf.CONVCSC_INF_PDC)
self.assertEqual(u[csc.index_i_dc[t]], pf.CONVCSC_INF_PDC)
self.assertEqual(l[csc.index_i_dc[t]], -pf.CONVCSC_INF_PDC)
self.assertEqual(u[csc.index_angle[t]], pf.CONVCSC_INF_ANGLE)
self.assertEqual(l[csc.index_angle[t]], -pf.CONVCSC_INF_ANGLE)
self.assertEqual(u[csc.index_ratio[t]], pf.CONVCSC_INF_RATIO)
self.assertEqual(l[csc.index_ratio[t]], -pf.CONVCSC_INF_RATIO)
# Sensitivities
mu = np.random.randn(net.num_vars)
pi = np.random.randn(net.num_vars)
net.clear_sensitivities()
constr.store_sensitivities(None,None,mu,pi)
for t in range(self.T):
# Branch sens
for branch in net.branches:
if branch.is_tap_changer():
self.assertEqual(branch.sens_ratio_u_bound[t], mu[branch.index_ratio[t]])
self.assertEqual(branch.sens_ratio_l_bound[t], pi[branch.index_ratio[t]])
else:
self.assertEqual(branch.sens_ratio_u_bound[t], 0.)
self.assertEqual(branch.sens_ratio_l_bound[t], 0.)
if branch.is_phase_shifter():
self.assertEqual(branch.sens_phase_u_bound[t], mu[branch.index_phase[t]])
self.assertEqual(branch.sens_phase_l_bound[t], pi[branch.index_phase[t]])
else:
self.assertEqual(branch.sens_phase_u_bound[t], 0.)
self.assertEqual(branch.sens_phase_l_bound[t], 0.)
# Bus sens
for bus in net.buses:
self.assertEqual(bus.sens_P_balance[t],0.)
self.assertEqual(bus.sens_Q_balance[t],0.)
self.assertEqual(bus.sens_v_mag_u_bound[t], mu[bus.index_v_mag[t]])
self.assertEqual(bus.sens_v_mag_l_bound[t], pi[bus.index_v_mag[t]])
self.assertEqual(bus.sens_v_ang_u_bound[t], mu[bus.index_v_ang[t]])
self.assertEqual(bus.sens_v_ang_l_bound[t], pi[bus.index_v_ang[t]])
# Gen sens
for gen in net.generators:
self.assertEqual(gen.sens_P_u_bound[t], mu[gen.index_P[t]])
self.assertEqual(gen.sens_P_l_bound[t], pi[gen.index_P[t]])
self.assertEqual(gen.sens_Q_u_bound[t], mu[gen.index_Q[t]])
self.assertEqual(gen.sens_Q_l_bound[t], pi[gen.index_Q[t]])
# Load sens
for load in net.loads:
self.assertEqual(load.sens_P_u_bound[t], mu[load.index_P[t]])
self.assertEqual(load.sens_P_l_bound[t], pi[load.index_P[t]])
# Shunts
for shunt in net.shunts:
if shunt.is_switched_v():
self.assertEqual(shunt.sens_b_u_bound[t], mu[shunt.index_b[t]])
self.assertEqual(shunt.sens_b_l_bound[t], pi[shunt.index_b[t]])
def test_constr_LBOUND_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
net.clear_outages()
gen = net.get_generator(0)
branch = net.get_branch(0)
gen.outage = True
branch.outage = True
self.assertTrue(gen.is_on_outage())
self.assertTrue(branch.is_on_outage())
gen.P_min = np.random.rand()
gen.Q_min = np.random.rand()
branch.ratio_min = np.random.randn()
branch.phase_min = np.random.randn()
gen.P_max = gen.P_min + 3.
gen.Q_max = gen.Q_min + 4.
branch.ratio_max = branch.ratio_min + 5.
branch.phase_max = branch.phase_min + 2.
net.set_flags('generator',
['variable','bounded'],
'any',
['active power', 'reactive power'])
net.set_flags('branch',
['variable','bounded'],
'any',
['tap ratio', 'phase shift'])
self.assertEqual(net.num_vars,
self.T*(2*net.num_generators + 2*net.num_branches))
self.assertEqual(net.num_vars, net.num_bounded)
constr = pf.Constraint('variable bounds', net)
constr.analyze()
l = constr.l
u = constr.u
G = constr.G
for t in range(self.T):
# gen P
k = np.where(G.col == gen.index_P[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
i = G.row[k]
self.assertEqual(G.data[k], 1.)
self.assertEqual(l[i], gen.P_min)
self.assertEqual(u[i], gen.P_max)
self.assertEqual(u[i], l[i] + 3.)
# gen Q
k = np.where(G.col == gen.index_Q[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
i = G.row[k]
self.assertEqual(G.data[k], 1.)
self.assertEqual(l[i], gen.Q_min)
self.assertEqual(u[i], gen.Q_max)
self.assertEqual(u[i], l[i] + 4.)
# branch ratio
k = np.where(G.col == branch.index_ratio[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
i = G.row[k]
self.assertEqual(G.data[k], 1.)
self.assertEqual(l[i], branch.ratio_min)
self.assertEqual(u[i], branch.ratio_max)
self.assertEqual(u[i], l[i] + 5.)
# branch phase
k = np.where(G.col == branch.index_phase[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
i = G.row[k]
self.assertEqual(G.data[k], 1.)
self.assertEqual(l[i], branch.phase_min)
self.assertEqual(u[i], branch.phase_max)
self.assertEqual(u[i], l[i] + 2.)
# Disconnected degree-1 bus (its only branch on outage)
net.clear_outages()
net.clear_flags()
self.assertEqual(net.num_vars, 0)
for bus in net.buses:
if bus.degree == 1:
self.assertEqual(len(bus.branches), 1)
bus.branches[0].outage = True
self.assertTrue(bus.branches[0].is_on_outage())
net.set_flags_of_component(bus,
['variable', 'bounded'],
['voltage magnitude', 'voltage angle'])
self.assertEqual(net.num_vars, 2*self.T)
self.assertEqual(net.num_vars, net.num_bounded)
self.assertTrue(bus.has_flags('variable', ['voltage magnitude',
'voltage angle']))
self.assertTrue(bus.has_flags('bounded', ['voltage magnitude',
'voltage angle']))
constr = pf.Constraint('variable bounds', net)
constr.analyze()
G = constr.G
l = constr.l
u = constr.u
self.assertEqual(G.shape[0], 2*self.T)
for t in range(self.T):
k = np.where(G.col == bus.index_v_mag[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
self.assertEqual(l[G.row[k]], bus.v_min)
self.assertEqual(u[G.row[k]], bus.v_max)
k = np.where(G.col == bus.index_v_ang[t])[0]
self.assertEqual(k.size, 1)
k = k[0]
self.assertEqual(l[G.row[k]], -100.)
self.assertEqual(u[G.row[k]], 100.)
break
def test_constr_PAR_GEN_P(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
self.assertEqual(net.num_vars,0)
# Vars
net.set_flags('generator',
'variable',
'slack',
['active power','reactive power'])
net.set_flags('generator',
'variable',
'regulator',
'reactive power')
self.assertGreater(net.num_vars,0)
self.assertEqual(net.num_vars,(net.get_num_slack_gens()+net.get_num_reg_gens())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('generator active power participation',net)
self.assertEqual(constr.name,'generator active power participation')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
# Manual count
nnz = 0
num_constr = 0
for i in range(net.num_buses):
bus = net.get_bus(i)
if bus.is_slack():
num_constr += len(bus.generators)-1 # P participation
nnz += 2*(len(bus.generators)-1)
constr.analyze()
self.assertEqual(nnz*self.T,constr.A_nnz)
constr.eval(x0)
self.assertEqual(0,constr.A_nnz)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(num_constr*self.T,))
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(num_constr*self.T,net.num_vars))
self.assertEqual(A.nnz,nnz*self.T)
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
# Detailed check
Ai = A.row
Aj = A.col
Ad = A.data
self.assertEqual(Ai.size,nnz*self.T)
self.assertEqual(Aj.size,nnz*self.T)
self.assertEqual(Ad.size,nnz*self.T)
i = 0
row = 0
for t in range(self.T):
for bus in net.buses:
if bus.is_slack():
gens = bus.generators
self.assertGreater(len(gens),0)
g1 = gens[0]
for g2 in gens[1:]:
self.assertEqual(b[row],0.)
self.assertEqual(Ai[i],row)
self.assertEqual(Aj[i],g1.index_P[t])
self.assertEqual(Ad[i],1.)
i += 1
self.assertEqual(Ai[i],row)
self.assertEqual(Aj[i],g2.index_P[t])
self.assertEqual(Ad[i],-1.)
i += 1
row += 1
self.assertEqual(i,nnz*self.T)
# Last check
x = np.zeros(net.num_vars)
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
if bus.is_slack():
self.assertGreater(len(bus.generators),0)
for g in bus.generators:
self.assertTrue(g.has_flags('variable','active power'))
x[g.index_P[t]] = 10.
self.assertGreater(norm(x),0)
self.assertTrue(norm(A*x-b) < 1e-10)
def test_constr_PAR_GEN_P_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
net.clear_outages()
net.clear_flags()
for bus in net.buses:
if bus.is_slack():
for branch in net.branches:
branch.outage = True
for gen in net.generators:
gen.outage = True
net.set_flags('generator',
'variable',
'any',
'active power')
self.assertEqual(net.num_vars, self.T*net.num_generators)
constr = pf.Constraint('generator active power participation', net)
constr.analyze()
A = constr.A
b = constr.b
self.assertEqual(A.shape[0], 0)
self.assertEqual(b.shape[0], 0)
net.clear_outages()
constr.analyze()
A = constr.A
b = constr.b
check = False
for bus in net.buses:
if bus.is_slack() and len(bus.generators) > 1:
check = True
if check:
self.assertGreater(A.shape[0], 0)
self.assertGreater(b.shape[0], 0)
def test_constr_PVPQ_SWITCHING(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
self.assertEqual(net.num_vars,0)
# Vars
net.set_flags('bus',
'variable',
'regulated by generator',
'voltage magnitude')
net.set_flags('generator',
'variable',
'slack',
['active power','reactive power'])
net.set_flags('generator',
'variable',
'regulator',
'reactive power')
self.assertGreater(net.num_vars,0)
self.assertEqual(net.num_vars,
(net.get_num_buses_reg_by_gen()+net.get_num_slack_gens()+net.get_num_reg_gens())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Make it interesting
for gen in net.generators:
gen.Q_par = np.random.rand()
# Constraint
constr = pf.Constraint('PVPQ switching',net)
self.assertEqual(constr.name,'PVPQ switching')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
# Manual count
nnz = 0
num_constr = 0
for i in range(net.num_buses):
bus = net.get_bus(i)
if bus.is_regulated_by_gen():
num_constr += len(bus.reg_generators)
nnz += len(bus.reg_generators)*(len(bus.reg_generators)+1)
constr.analyze()
self.assertEqual(nnz*self.T,constr.A_nnz)
constr.eval(x0)
self.assertEqual(0,constr.A_nnz)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(num_constr*self.T,))
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(num_constr*self.T,net.num_vars))
self.assertEqual(A.nnz,nnz*self.T)
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
# Detailed check
Ai = A.row
Aj = A.col
Ad = A.data
self.assertEqual(Ai.size,nnz*self.T)
self.assertEqual(Aj.size,nnz*self.T)
self.assertEqual(Ad.size,nnz*self.T)
nnz = 0
row = 0
for t in range(self.T):
for bus in net.buses:
if bus.is_regulated_by_gen():
self.assertEqual(b[row], bus.v_set[t])
self.assertEqual(Ai[nnz], row)
self.assertEqual(Aj[nnz], bus.index_v_mag[t])
self.assertEqual(Ad[nnz], 1.)
nnz += 1
for gen in bus.reg_generators:
self.assertEqual(Ai[nnz], row)
self.assertEqual(Aj[nnz], gen.index_Q[t])
self.assertEqual(Ad[nnz], 0.)
nnz += 1
row += 1
for i in range(len(bus.reg_generators)-1):
gen1 = bus.reg_generators[i]
gen2 = bus.reg_generators[i+1]
self.assertEqual(b[row], 0.)
self.assertEqual(Ai[nnz], row)
self.assertEqual(Aj[nnz], bus.index_v_mag[t])
self.assertEqual(Ad[nnz], 0.)
nnz += 1
for gen3 in bus.reg_generators:
self.assertEqual(Ai[nnz], row)
self.assertEqual(Aj[nnz], gen3.index_Q[t])
if gen3.index == gen1.index:
self.assertEqual(Ad[nnz], np.maximum(gen2.Q_par,1e-4))
elif gen3.index == gen2.index:
self.assertEqual(Ad[nnz], -np.maximum(gen1.Q_par,1e-4))
else:
self.assertEqual(Ad[nnz], 0.)
nnz += 1
row += 1
self.assertEqual(nnz,A.nnz)
# Now with no Q vars
net.clear_flags()
# Vars
net.set_flags('bus',
'variable',
'any',
'voltage magnitude')
self.assertEqual(net.num_vars, net.get_num_buses()*self.T)
# Analyze
constr.analyze()
A = constr.A
b = constr.b
self.assertEqual(A.shape[0], 0)
self.assertEqual(A.nnz, 0)
self.assertEqual(b.size, 0)
# Now with no v vars
net.clear_flags()
# Vars
net.set_flags('generator',
'variable',
'any',
'reactive power')
self.assertEqual(net.num_vars, net.get_num_generators()*self.T)
# Analyze
constr.analyze()
A = constr.A
b = constr.b
nnz = 0
m = 0
for bus in net.buses:
if bus.is_regulated_by_gen():
n = len(bus.reg_generators)
m += n-1
nnz += n*(n-1)
self.assertEqual(A.shape[0], m*self.T)
self.assertEqual(A.nnz, nnz*self.T)
def test_constr_PVPQ_SWITCHING_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
self.assertEqual(net.num_vars,0)
# Vars
net.set_flags('bus',
'variable',
'regulated by generator',
'voltage magnitude')
net.set_flags('generator',
'variable',
'slack',
['active power','reactive power'])
net.set_flags('generator',
'variable',
'regulator',
'reactive power')
self.assertGreater(net.num_vars,0)
self.assertEqual(net.num_vars,
(net.get_num_buses_reg_by_gen()+net.get_num_slack_gens()+net.get_num_reg_gens())*self.T)
constr = pf.Constraint('PVPQ switching', net)
constr.analyze()
A0 = constr.A.copy()
b0 = constr.b.copy()
self.assertEqual(net.get_num_branches_on_outage(), 0)
self.assertEqual(net.get_num_generators_on_outage(), 0)
for bus in net.buses:
if bus.is_regulated_by_gen():
for branch in net.branches:
branch.outage = True
self.assertNotEqual(net.get_num_branches_on_outage(), 0)
self.assertEqual(net.get_num_generators_on_outage(), 0)
constr = pf.Constraint('PVPQ switching', net)
constr.analyze()
A1 = constr.A.copy()
b1 = constr.b.copy()
self.assertEqual((A1-A0).tocoo().nnz, 0)
self.assertLess(norm(b1-b0), 1e-8)
for bus in net.buses:
if bus.is_regulated_by_gen():
for gen in bus.reg_generators:
gen.outage = True
self.assertFalse(bus.is_regulated_by_gen())
self.assertNotEqual(net.get_num_generators_on_outage(), 0)
constr = pf.Constraint('PVPQ switching', net)
constr.analyze()
A2 = constr.A.copy()
b2 = constr.b.copy()
self.assertEqual(A2.shape[0], 0)
self.assertEqual(b2.size, 0)
def test_constr_ACPF(self):
# Constants
h = 1e-10
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Add vargens
load_buses = net.get_load_buses()
net.add_var_generators_from_parameters(load_buses,80.,50.,30.,5,0.05)
self.assertGreater(net.num_var_generators,0)
self.assertEqual(net.num_var_generators,len(load_buses))
for vargen in net.var_generators:
vargen.P = np.random.rand(net.num_periods)
vargen.Q = np.random.randn(net.num_periods)
# Add batteries
gen_buses = net.get_generator_buses()
net.add_batteries_from_parameters(gen_buses,20.,40.,0.8,0.9)
self.assertGreater(net.num_batteries,0)
self.assertEqual(net.num_batteries,len(gen_buses))
for bat in net.batteries:
bat.P = np.random.randn(net.num_periods)
# No vars
self.assertEqual(net.num_vars,0)
# Constraint
constr = pf.Constraint('AC power balance',net)
self.assertEqual(constr.name,'AC power balance')
x0 = net.get_var_values()
self.assertEqual(x0.size,0)
constr.analyze()
constr.eval(x0)
f = constr.f
self.assertEqual(f.size,2*net.num_buses*net.num_periods)
# Check mismatches (no vars)
for t in range(net.num_periods):
for bus in net.buses:
P_mis = 0.
Q_mis = 0.
for branch in bus.branches_k:
P_mis -= branch.get_P_km()[t]
Q_mis -= branch.get_Q_km()[t]
for branch in bus.branches_m:
P_mis -= branch.get_P_mk()[t]
Q_mis -= branch.get_Q_mk()[t]
for gen in bus.generators:
P_mis += gen.P[t]
Q_mis += gen.Q[t]
for vargen in bus.var_generators:
P_mis += vargen.P[t]
Q_mis += vargen.Q[t]
for load in bus.loads:
P_mis -= load.P[t]
Q_mis -= load.Q[t]
for bat in bus.batteries:
P_mis -= bat.P[t]
for shunt in bus.shunts:
P_mis -= shunt.g*(bus.v_mag[t]**2.)
Q_mis -= -shunt.b[t]*(bus.v_mag[t]**2.)
for conv in bus.csc_converters:
P_mis += conv.P[t]
Q_mis += conv.Q[t]
for conv in bus.vsc_converters:
P_mis += conv.P[t]
Q_mis += conv.Q[t]
for facts in bus.facts_k:
P_mis += facts.P_k[t]
Q_mis += facts.Q_k[t]
for facts in bus.facts_m:
P_mis += facts.P_m[t]
Q_mis += facts.Q_m[t]
self.assertAlmostEqual(P_mis,f[bus.index_P[t]])
self.assertAlmostEqual(Q_mis,f[bus.index_Q[t]])
# Cross check mismatches with net properties (no vars)
net.update_properties()
dP_list = dict([(t,list()) for t in range(self.T)])
dQ_list = dict([(t,list()) for t in range(self.T)])
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
dP = f[bus.index_P[t]]
dQ = f[bus.index_Q[t]]
dP_list[t].append(dP)
dQ_list[t].append(dQ)
self.assertAlmostEqual(dP,bus.P_mismatch[t])
self.assertAlmostEqual(dQ,bus.Q_mismatch[t])
self.assertAlmostEqual(net.bus_P_mis[t],np.max(np.abs(dP_list[t]))*net.base_power)
self.assertAlmostEqual(net.bus_Q_mis[t],np.max(np.abs(dQ_list[t]))*net.base_power)
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('load',
'variable',
'any',
['active power','reactive power'])
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
net.set_flags('variable generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
net.set_flags('vsc converter',
'variable',
'any',
['active power', 'reactive power', 'dc power'])
net.set_flags('facts',
'variable',
'any',
['active power', 'reactive power'])
self.assertEqual(net.num_vars,
(2*net.num_buses +
2*net.num_generators +
2*net.num_loads +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
3*net.num_batteries +
net.num_var_generators*2 +
net.num_vsc_converters*4 +
net.num_facts*7)*self.T)
# Check facts
for facts in net.facts:
self.assertTrue(facts.has_flags('variable', 'active power'))
self.assertTrue(facts.has_flags('variable', 'reactive power'))
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('AC power balance',net)
self.assertEqual(constr.name,'AC power balance')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
num_statcom = len([f for f in net.facts if f.is_STATCOM()])
num_Jnnz = (net.num_buses*4 +
net.num_branches*8 +
net.get_num_tap_changers()*4 +
net.get_num_phase_shifters()*4 +
net.get_num_switched_v_shunts() +
net.num_generators*2 +
net.num_loads*2 +
net.num_batteries*2 +
net.num_var_generators*2 +
net.num_vsc_converters*2 +
(net.num_facts-num_statcom)*4+num_statcom*2)*self.T
constr.analyze()
self.assertEqual(num_Jnnz,constr.J_nnz)
constr.eval(x0)
self.assertEqual(num_Jnnz,constr.J_nnz)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
constr.combine_H(np.ones(f.size),False)
Hcomb = constr.H_combined
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(2*net.num_buses*self.T,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(2*net.num_buses*self.T,net.num_vars))
self.assertEqual(J.nnz,num_Jnnz)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,net.num_vars))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,net.num_vars))
self.assertEqual(G.nnz,0)
self.assertTupleEqual(Hcomb.shape,(net.num_vars,net.num_vars))
self.assertEqual(Hcomb.nnz,2*(net.get_num_buses()*3 +
net.get_num_branches()*12 +
net.get_num_tap_changers()*9 +
net.get_num_phase_shifters()*10 +
net.get_num_switched_v_shunts())*self.T)
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
# Check mismatches
x1 = x0+np.random.randn(net.num_vars)
constr.eval(x1)
for t in range(net.num_periods):
for bus in net.buses:
P_mis = 0.
Q_mis = 0.
for branch in bus.branches_k:
P_mis -= branch.get_P_km(x1)[t]
Q_mis -= branch.get_Q_km(x1)[t]
for branch in bus.branches_m:
P_mis -= branch.get_P_mk(x1)[t]
Q_mis -= branch.get_Q_mk(x1)[t]
for gen in bus.generators:
P_mis += x1[gen.index_P[t]]
Q_mis += x1[gen.index_Q[t]]
for vargen in bus.var_generators:
P_mis += x1[vargen.index_P[t]]
Q_mis += x1[vargen.index_Q[t]]
for load in bus.loads:
P_mis -= x1[load.index_P[t]]
Q_mis -= x1[load.index_Q[t]]
for bat in bus.batteries:
P_mis -= x1[bat.index_Pc[t]]-x1[bat.index_Pd[t]]
for shunt in bus.shunts:
if shunt.has_flags('variable','susceptance'):
b = x1[shunt.index_b[t]]
else:
b = shunt.b[t]
if bus.has_flags('variable','voltage magnitude'):
v = x1[bus.index_v_mag[t]]
else:
v = bus.v_mag[t]
P_mis -= shunt.g*v*v
Q_mis -= -b*v*v
for conv in bus.vsc_converters:
if conv.has_flags('variable', 'active power'):
P_mis += x1[conv.index_P[t]]
else:
P_mis += conv.P[t]
if conv.has_flags('variable', 'reactive power'):
Q_mis += x1[conv.index_Q[t]]
else:
Q_mis += conv.Q[t]
for conv in bus.csc_converters:
P_mis += conv.P[t]
Q_mis += conv.Q[t]
for facts in bus.facts_k:
self.assertTrue(facts.has_flags('variable', 'active power'))
P_mis += x1[facts.index_P_k[t]]
self.assertTrue(facts.has_flags('variable', 'reactive power'))
Q_mis += x1[facts.index_Q_k[t]]
for facts in bus.facts_m:
self.assertTrue(facts.has_flags('variable', 'active power'))
P_mis += x1[facts.index_P_m[t]]
self.assertTrue(facts.has_flags('variable', 'reactive power'))
Q_mis += x1[facts.index_Q_m[t]]
self.assertAlmostEqual(P_mis,f[bus.index_P[t]])
self.assertAlmostEqual(Q_mis,f[bus.index_Q[t]])
# Cross check mismatches with net properties
constr.eval(x1)
net.update_properties(x1)
dP_list = dict([(t,list()) for t in range(self.T)])
dQ_list = dict([(t,list()) for t in range(self.T)])
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
dP = f[bus.index_P[t]]
dQ = f[bus.index_Q[t]]
dP_list[t].append(dP)
dQ_list[t].append(dQ)
self.assertAlmostEqual(dP,bus.P_mismatch[t])
self.assertAlmostEqual(dQ,bus.Q_mismatch[t])
self.assertAlmostEqual(net.bus_P_mis[t],np.max(np.abs(dP_list[t]))*net.base_power)
self.assertAlmostEqual(net.bus_Q_mis[t],np.max(np.abs(dQ_list[t]))*net.base_power)
# Check mismatches across time
for vargen in net.var_generators:
vargen.P = np.ones(net.num_periods)*0.2 # static
vargen.Q = np.ones(net.num_periods)*0.1 # static
for bat in net.batteries:
bat.P = np.ones(net.num_periods)*0.1 # static
x0 = net.get_var_values()
constr.eval(x0)
f = constr.f
J = constr.J
P_list = []
for t in range(self.T):
P_list.append(net.get_var_projection('all','any','all',t_start=t,t_end=t))
fp_list = [f[t*net.num_buses:(t+1)*net.num_buses] for t in range(self.T)]
fq_list = [f[(t+self.T)*net.num_buses:(t+1+self.T)*net.num_buses] for t in range(self.T)]
for t in range(self.T-1):
self.assertLess(norm(fp_list[t]-fp_list[t+1]),1e-12*norm(fp_list[t]))
self.assertLess(norm(fq_list[t]-fq_list[t+1]),1e-12*norm(fq_list[t]))
Jx = J*x0
Jxp_list = [Jx[t*net.num_buses:(t+1)*net.num_buses] for t in range(self.T)]
Jxq_list = [Jx[(t+self.T)*net.num_buses:(t+1+self.T)*net.num_buses] for t in range(self.T)]
for t in range(self.T-1):
self.assertLess(norm(Jxp_list[t]-Jxp_list[t+1]),1e-12*norm(Jxp_list[t]))
self.assertLess(norm(Jxq_list[t]-Jxq_list[t+1]),1e-12*norm(Jxq_list[t]))
for i in range(10):
Hp_list = []
Hq_list = []
j = np.random.randint(0,net.num_buses)
for t in range(self.T):
Hp_list.append(coo_matrix(P_list[t]*constr.get_H_single(t*net.num_buses+j)*P_list[t].T))
Hq_list.append(coo_matrix(P_list[t]*constr.get_H_single((t+self.T)*net.num_buses+j)*P_list[t].T))
for t in range(self.T-1):
self.assertTrue(np.all(Hp_list[t].row == Hp_list[t+1].row))
self.assertTrue(np.all(Hp_list[t].col == Hp_list[t+1].col))
self.assertLessEqual(norm(Hp_list[t].data-Hp_list[t+1].data),1e-12*norm(Hp_list[t].data))
self.assertTrue(np.all(Hq_list[t].row == Hq_list[t+1].row))
self.assertTrue(np.all(Hq_list[t].col == Hq_list[t+1].col))
self.assertLessEqual(norm(Hq_list[t].data-Hq_list[t+1].data),1e-12*norm(Hq_list[t].data))
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Sensitivities
net.clear_sensitivities()
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
self.assertEqual(bus.sens_P_balance[t],0.)
self.assertEqual(bus.sens_Q_balance[t],0.)
sens = np.zeros(2*net.num_buses*self.T)
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
sens[bus.index_P[t]] = 3.5*bus.index_P[t]+0.33+t*2*net.num_buses
sens[bus.index_Q[t]] = 3.4*bus.index_Q[t]+0.32+t*2*net.num_buses
constr.store_sensitivities(None,sens,None,None)
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
self.assertEqual(bus.sens_P_balance[t],3.5*bus.index_P[t]+0.33+t*2*net.num_buses)
self.assertEqual(bus.sens_Q_balance[t],3.4*bus.index_Q[t]+0.32+t*2*net.num_buses)
def test_constr_ACPF_with_outages(self):
# Constants
h = 1e-10
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('load',
'variable',
'any',
['active power','reactive power'])
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
net.set_flags('variable generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
self.assertEqual(net.num_vars,
(2*net.num_buses +
2*net.num_generators +
2*net.num_loads +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
3*net.num_batteries +
net.num_var_generators*2)*self.T)
x0 = net.get_var_values()
constr0 = pf.Constraint('AC power balance', net)
constr0.analyze()
constr0.eval(x0)
buses = net.buses[:10]
side = []
for bus in buses:
for gen in bus.generators:
gen.outage = True
for br in bus.branches_k:
self.assertTrue(bus.is_equal(br.bus_k))
br.outage = True
side.append(br.bus_m)
for br in bus.branches_m:
self.assertTrue(bus.is_equal(br.bus_m))
br.outage = True
side.append(br.bus_k)
constr1 = pf.Constraint('AC power balance', net)
constr1.analyze()
constr1.eval(x0)
f0 = constr0.f
f1 = constr1.f
for bus in net.buses:
if bus not in buses+side:
for t in range(self.T):
i = bus.index_P[t]
j = bus.index_Q[t]
self.assertLess(np.abs(f0[i]-f1[i]), 1e-8)
self.assertLess(np.abs(f0[j]-f1[j]), 1e-8)
for bus in buses:
for t in range(self.T):
i = bus.index_P[t]
j = bus.index_Q[t]
dp = 0.
dq = 0.
for gen in bus.generators:
self.assertTrue(gen.is_on_outage())
dp += gen.P[t]
dq += gen.Q[t]
for br in bus.branches_k:
dp -= br.P_km[t]
dq -= br.Q_km[t]
for br in bus.branches_m:
dp -= br.P_mk[t]
dq -= br.Q_mk[t]
self.assertLess(np.abs(f1[i]+dp-f0[i]), 1e-8)
self.assertLess(np.abs(f1[j]+dq-f0[j]), 1e-8)
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr1,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr1,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr1,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
def test_constr_REG_VSET(self):
# Constants
h = 1e-8
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'not slack',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'slack',
'active power')
net.set_flags('generator',
'variable',
'regulator',
'reactive power')
self.assertEqual(net.num_vars,
(2*(net.num_buses-net.get_num_slack_buses()) +
net.get_num_slack_gens() +
net.get_num_reg_gens())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('voltage set point regulation',net)
self.assertEqual(constr.name,'voltage set point regulation')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.num_extra_vars,0)
Jnnz = 0
for i in range(net.num_buses):
bus = net.get_bus(i)
if bus.is_regulated_by_gen() and not bus.is_slack():
for gen in bus.reg_generators:
Jnnz += 4
Annz = 3*(net.get_num_reg_gens()-net.get_num_slack_gens())
rowsJ = 2*(net.get_num_reg_gens()-net.get_num_slack_gens())
rowsA = net.get_num_reg_gens()-net.get_num_slack_gens()
constr.analyze()
self.assertEqual(constr.J_nnz,Jnnz*self.T)
self.assertEqual(constr.A_nnz,Annz*self.T)
self.assertEqual(constr.J_row,rowsJ*self.T)
self.assertEqual(constr.A_row,rowsA*self.T)
self.assertEqual(constr.num_extra_vars,rowsJ*self.T)
y_init = constr.init_extra_vars
self.assertEqual(y_init.size,constr.num_extra_vars)
self.assertTrue(np.all(y_init == 0.))
y0 = np.random.rand(constr.num_extra_vars)
constr.eval(x0,y0)
self.assertEqual(constr.J_nnz,Jnnz*self.T)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,rowsJ*self.T)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(rowsJ*self.T,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(rowsA*self.T,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(rowsJ*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(J.nnz,Jnnz*self.T)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(rowsA*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(A.nnz,Annz*self.T)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
# Ax=b check
self.assertEqual(norm(A.data,1),rowsA*3*self.T)
self.assertEqual(np.sum(A.data),(net.get_num_reg_gens()-net.get_num_slack_gens())*self.T)
for k in range(J.shape[0]//2):
index1 = np.where(A.col == net.num_vars+2*k)[0]
index2 = np.where(A.col == net.num_vars+2*k+1)[0]
self.assertEqual(index1.size,1)
self.assertEqual(index2.size,1)
self.assertEqual(A.row[index1[0]],A.row[index2[0]])
index3 = np.where(A.row == A.row[index1[0]])[0]
self.assertEqual(index3.size,3)
for i in index3:
if A.col[i] == net.num_vars+2*k: # y
self.assertEqual(A.data[i],-1.)
elif A.col[i] == net.num_vars+2*k+1:
self.assertEqual(A.data[i],1.) # z
else:
self.assertEqual(A.data[i],1.) # v
# f check
flags = {}
eps = 1e-8
J_row = 0
for t in range(self.T):
for bus in net.buses:
if bus.is_regulated_by_gen() and not bus.is_slack():
for gen in bus.reg_generators:
y = y0[J_row]
z = y0[J_row+1]
Q = gen.Q[t]
Qmax = gen.Q_max
Qmin = gen.Q_min
CompY = (Q-Qmin)+y-np.sqrt((Q-Qmin)**2.+y**2.+2*eps)
CompZ = (Qmax-Q)+z-np.sqrt((Qmax-Q)**2.+z**2.+2*eps)
self.assertAlmostEqual(CompY,f[J_row])
self.assertAlmostEqual(CompZ,f[J_row+1])
J_row += 2
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Sensitivities
net.clear_sensitivities()
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
self.assertEqual(bus.sens_v_set_reg[t],0.)
sensf = np.zeros(constr.f.size)
sensA = np.ones(constr.b.size)*10.5
self.assertEqual(sensf.size,rowsJ*self.T)
Ji = constr.J.row
Jj = constr.J.col
Ai = constr.A.row
Aj = constr.A.col
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
if bus.is_regulated_by_gen() and not bus.is_slack():
indices = Ji[np.where(Jj == bus.reg_generators[-1].index_Q[t])[0]]
self.assertEqual(indices.size,2)
sensf[indices[0]] = -bus.index-10
sensf[indices[1]] = bus.index+11*(bus.index % 2)
constr.store_sensitivities(sensA,sensf,None,None)
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
if bus.is_regulated_by_gen() and not bus.is_slack():
if bus.index % 2 == 1:
self.assertEqual(bus.sens_v_set_reg[t],bus.index+11)
else:
self.assertEqual(bus.sens_v_set_reg[t],-bus.index-10 if bus.index != 0 else 10.5)
def test_constr_REG_VSET_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'not slack',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'slack',
'active power')
net.set_flags('generator',
'variable',
'regulator',
'reactive power')
self.assertEqual(net.num_vars,
(2*(net.num_buses-net.get_num_slack_buses()) +
net.get_num_slack_gens() +
net.get_num_reg_gens())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
constr0 = pf.Constraint('voltage set point regulation', net)
constr0.analyze()
constr0.eval(x0)
for bus in net.buses:
if bus.is_regulated_by_gen():
for branch in bus.branches:
branch.outage = True
constr1 = pf.Constraint('voltage set point regulation', net)
constr1.analyze()
constr1.eval(x0)
self.assertEqual((constr0.A-constr1.A).tocoo().nnz, 0)
self.assertLess(norm(constr0.b-constr1.b), 1e-8)
self.assertEqual((constr0.J-constr1.J).tocoo().nnz, 0)
self.assertLess(norm(constr0.f-constr1.f), 1e-8)
for bus in net.buses:
if bus.is_regulated_by_gen():
for gen in bus.reg_generators:
gen.outage = True
self.assertFalse(bus.is_regulated_by_gen())
constr2 = pf.Constraint('voltage set point regulation', net)
constr2.analyze()
constr2.eval(x0)
self.assertEqual(constr2.A.shape[0], 0)
self.assertEqual(constr2.J.shape[0], 0)
def test_constr_REG_TRAN(self):
# Constants
h = 1e-8
normal = 1e0
eta = 1e-8
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'regulated by transformer',
'voltage magnitude')
net.set_flags('branch',
'variable',
'tap changer - v',
'tap ratio')
self.assertEqual(net.num_vars,
(net.get_num_buses_reg_by_tran() +
net.get_num_tap_changers_v())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('voltage regulation by transformers',net)
self.assertEqual(constr.name,'voltage regulation by transformers')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
Jnnz = 10*net.get_num_tap_changers_v()
Annz = 3*net.get_num_tap_changers_v()
self.assertGreaterEqual(Jnnz,0)
self.assertGreaterEqual(Annz,0)
rowsJ = 4*net.get_num_tap_changers_v()
rowsA = net.get_num_tap_changers_v()
self.assertGreaterEqual(rowsJ,0)
self.assertGreaterEqual(rowsA,0)
constr.analyze()
self.assertEqual(constr.J_nnz,Jnnz*self.T)
self.assertEqual(constr.A_nnz,Annz*self.T)
self.assertEqual(constr.J_row,rowsJ*self.T)
self.assertEqual(constr.A_row,rowsA*self.T)
y_init = constr.init_extra_vars
self.assertEqual(y_init.size,constr.num_extra_vars)
self.assertTrue(np.all(y_init == 0.))
constr.eval(x0)
self.assertEqual(constr.J_nnz,Jnnz*self.T)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,rowsJ*self.T)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(rowsJ*self.T,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(rowsA*self.T,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(rowsJ*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(J.nnz,Jnnz*self.T)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(rowsA*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(A.nnz,Annz*self.T)
self.assertEqual(constr.num_extra_vars,rowsJ*self.T)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
y0 = np.random.rand(constr.num_extra_vars)
# Ax=b check
self.assertEqual(norm(A.data,1),rowsA*3*self.T)
self.assertEqual(np.sum(A.data),net.get_num_tap_changers_v()*self.T)
# f check
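# Each tap changer contributes four smoothed bound residuals of the form
#   (a - sqrt(a^2 + 2*eta))*normal,
# which is ~0 when a >= 0 and ~2*a*normal when a < 0, i.e. a soft
# penalty for violating the regulated-voltage and tap-ratio limits.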
index = 0
for t in range(self.T):
for bus in net.buses:
for br in bus.branches_k:
if br.is_tap_changer_v():
self.assertTrue(br.has_flags('variable','tap ratio'))
bus = br.reg_bus
fvmin = ((bus.v_mag[t]-bus.v_min_reg) - np.sqrt((bus.v_mag[t]-bus.v_min_reg)**2. + 2*eta))*normal
fvmax = ((bus.v_max_reg-bus.v_mag[t]) - np.sqrt((bus.v_max_reg-bus.v_mag[t])**2. + 2*eta))*normal
ftmax = ((br.ratio_max-br.ratio[t]) - np.sqrt((br.ratio_max-br.ratio[t])**2. + 2*eta))*normal
ftmin = ((br.ratio[t]-br.ratio_min) - np.sqrt((br.ratio[t]-br.ratio_min)**2. + 2*eta))*normal
self.assertLess(np.abs(fvmin-f[index]),1e-10*(1+np.abs(fvmin)))
self.assertLess(np.abs(fvmax-f[index+1]),1e-10*(1+np.abs(fvmax)))
self.assertLess(np.abs(ftmax-f[index+2]),1e-10*(1+np.abs(ftmax)))
self.assertLess(np.abs(ftmin-f[index+3]),1e-10*(1+np.abs(ftmin)))
index += 4
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Sensitivities
net.clear_sensitivities()
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
self.assertEqual(bus.sens_v_reg_by_tran[t],0.)
sens = np.zeros(constr.f.size)
counter = 0
for t in range(self.T):
for bus in net.buses:
for branch in bus.branches_k:
if branch.is_tap_changer_v():
sens[counter:counter+4] = branch.reg_bus.index*t
counter += 4
self.assertEqual(counter,constr.f.size)
constr.store_sensitivities(np.zeros(constr.A.shape[0]),sens,None,None)
for t in range(self.T):
for bus in net.buses:
for branch in bus.branches_k:
if branch.is_tap_changer_v():
self.assertEqual(branch.reg_bus.sens_v_reg_by_tran[t],branch.reg_bus.index*t)
def test_constr_REG_TRAN_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'regulated by transformer',
'voltage magnitude')
net.set_flags('branch',
'variable',
'tap changer - v',
'tap ratio')
self.assertEqual(net.num_vars,
(net.get_num_buses_reg_by_tran() +
net.get_num_tap_changers_v())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
constr0 = pf.Constraint('voltage regulation by transformers', net)
constr0.analyze()
constr0.eval(x0)
for bus in net.buses:
if bus.is_regulated_by_tran():
for gen in bus.generators:
gen.outage = True
constr1 = pf.Constraint('voltage regulation by transformers', net)
constr1.analyze()
constr1.eval(x0)
self.assertEqual((constr0.A-constr1.A).tocoo().nnz, 0)
self.assertLess(norm(constr0.b-constr1.b), 1e-8)
self.assertEqual((constr0.J-constr1.J).tocoo().nnz, 0)
self.assertLess(norm(constr0.f-constr1.f), 1e-8)
for bus in net.buses:
if bus.is_regulated_by_tran():
for branch in bus.reg_trans:
branch.outage = True
self.assertFalse(bus.is_regulated_by_tran())
constr2 = pf.Constraint('voltage regulation by transformers', net)
constr2.analyze()
constr2.eval(x0)
self.assertEqual(constr2.A.shape[0], 0)
self.assertEqual(constr2.J.shape[0], 0)
def test_constr_REG_SHUNT(self):
# Constants
h = 1e-8
normal = 1e0
eta = 1e-8
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'regulated by shunt',
'voltage magnitude')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
self.assertEqual(net.num_vars,
(net.get_num_buses_reg_by_shunt() +
net.get_num_switched_v_shunts())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('voltage regulation by shunts',net)
self.assertEqual(constr.name,'voltage regulation by shunts')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
Jnnz = 10*net.get_num_switched_v_shunts()
Annz = 3*net.get_num_switched_v_shunts()
self.assertGreaterEqual(Jnnz,0)
self.assertGreaterEqual(Annz,0)
rowsJ = 4*net.get_num_switched_v_shunts()
rowsA = net.get_num_switched_v_shunts()
self.assertGreaterEqual(rowsJ,0)
self.assertGreaterEqual(rowsA,0)
constr.analyze()
self.assertEqual(constr.J_nnz,Jnnz*self.T)
self.assertEqual(constr.A_nnz,Annz*self.T)
self.assertEqual(constr.J_row,rowsJ*self.T)
self.assertEqual(constr.A_row,rowsA*self.T)
y_init = constr.init_extra_vars
self.assertEqual(y_init.size,constr.num_extra_vars)
self.assertTrue(np.all(y_init == 0.))
constr.eval(x0)
self.assertEqual(constr.J_nnz,Jnnz*self.T)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,rowsJ*self.T)
self.assertEqual(constr.A_row,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(rowsJ*self.T,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(rowsA*self.T,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(rowsJ*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(J.nnz,Jnnz*self.T)
self.assertTrue(np.all(J.row <= rowsJ*self.T-1))
self.assertTrue(np.all(J.col <= net.num_vars+constr.num_extra_vars-1))
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(rowsA*self.T,net.num_vars+constr.num_extra_vars))
self.assertEqual(A.nnz,Annz*self.T)
self.assertTrue(np.all(A.row <= rowsA*self.T-1))
self.assertTrue(np.all(A.col <= net.num_vars+constr.num_extra_vars-1))
self.assertEqual(constr.num_extra_vars,rowsJ*self.T)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
self.assertTrue(not np.any(np.isinf(J.data)))
self.assertTrue(not np.any(np.isnan(J.data)))
self.assertTrue(not np.any(np.isinf(A.data)))
self.assertTrue(not np.any(np.isnan(A.data)))
y0 = np.random.rand(constr.num_extra_vars)
# Ax=b check
self.assertEqual(norm(A.data,1),rowsA*3*self.T)
self.assertEqual(np.sum(A.data),net.get_num_switched_v_shunts()*self.T)
# f check
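# Same smoothed bound residuals (a - sqrt(a^2 + 2*eta))*normal as in the
# transformer test above, here applied to the regulated bus voltage
# magnitude and the switched shunt susceptance limits.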
index = 0
for t in range(self.T):
for bus in net.buses:
for s in bus.reg_shunts:
self.assertEqual(bus.number,s.reg_bus.number)
self.assertTrue(bus.has_flags('variable','voltage magnitude'))
self.assertTrue(s.has_flags('variable','susceptance'))
fvmin = ((bus.v_mag[t]-bus.v_min_reg) - np.sqrt((bus.v_mag[t]-bus.v_min_reg)**2. + 2.*eta))*normal
fvmax = ((bus.v_max_reg-bus.v_mag[t]) - np.sqrt((bus.v_max_reg-bus.v_mag[t])**2. + 2.*eta))*normal
fbmax = ((s.b_max-s.b[t]) - np.sqrt((s.b_max-s.b[t])**2. + 2*eta))*normal
fbmin = ((s.b[t]-s.b_min) - np.sqrt((s.b[t]-s.b_min)**2. + 2*eta))*normal
self.assertLess(np.abs(fvmin-f[index]),1e-10*(1+np.abs(fvmin)))
self.assertLess(np.abs(fvmax-f[index+1]),1e-10*(1+np.abs(fvmax)))
self.assertLess(np.abs(fbmax-f[index+2]),1e-10*(1+np.abs(fbmax)))
self.assertLess(np.abs(fbmin-f[index+3]),1e-10*(1+np.abs(fbmin)))
index += 4
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Sensitivities
net.clear_sensitivities()
for t in range(self.T):
for i in range(net.num_buses):
bus = net.get_bus(i)
self.assertEqual(bus.sens_v_reg_by_shunt[t],0.)
sens = np.zeros(constr.f.size)
counter = 0
for t in range(self.T):
for bus in net.buses:
for shunt in bus.reg_shunts:
sens[counter:counter+4] = bus.index*t
counter += 4
self.assertEqual(counter,constr.f.size)
constr.store_sensitivities(np.zeros(constr.A.shape[0]),sens,None,None)
for t in range(self.T):
for bus in net.buses:
for shunt in bus.reg_shunts:
self.assertEqual(bus.sens_v_reg_by_shunt[t],bus.index*t)
def test_constr_REG_SHUNT_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'regulated by shunt',
'voltage magnitude')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
self.assertEqual(net.num_vars,
(net.get_num_buses_reg_by_shunt() +
net.get_num_switched_v_shunts())*self.T)
x0 = net.get_var_values()
constr0 = pf.Constraint('voltage regulation by shunts', net)
constr0.analyze()
constr0.eval(x0)
for bus in net.buses:
if bus.is_regulated_by_shunt():
for gen in bus.generators:
gen.outage = True
for branch in bus.branches:
branch.outage = True
constr1 = pf.Constraint('voltage regulation by shunts', net)
constr1.analyze()
constr1.eval(x0)
self.assertEqual((constr0.A-constr1.A).tocoo().nnz, 0)
self.assertLess(norm(constr0.b-constr1.b), 1e-8)
self.assertEqual((constr0.J-constr1.J).tocoo().nnz, 0)
self.assertLess(norm(constr0.f-constr1.f), 1e-8)
def test_robustness(self):
for case in test_cases.CASES:
net = pf.Network(self.T)
constraints = [pf.Constraint('variable fixing',net),
pf.Constraint('generator active power participation',net),
pf.Constraint('PVPQ switching',net),
pf.Constraint('AC power balance',net),
pf.Constraint('DC power balance',net),
pf.Constraint('voltage set point regulation',net),
pf.Constraint('voltage regulation by transformers',net),
pf.Constraint('voltage regulation by shunts',net),
pf.Constraint('AC branch flow limits',net)]
x0 = net.get_var_values()
for c in constraints:
self.assertTrue(isinstance(c.b,np.ndarray))
self.assertTrue(isinstance(c.A,coo_matrix))
self.assertTrue(isinstance(c.f,np.ndarray))
self.assertTrue(isinstance(c.J,coo_matrix))
self.assertEqual(c.b.size,0)
self.assertEqual(c.A.nnz,0)
self.assertTupleEqual(c.A.shape,(0,0))
self.assertEqual(c.f.size,0)
self.assertEqual(c.J.nnz,0)
self.assertTupleEqual(c.J.shape,(0,0))
list(map(lambda c: c.eval(x0),constraints))
list(map(lambda c: c.analyze(),constraints))
list(map(lambda c: c.eval(x0),constraints))
for c in constraints:
self.assertTrue(isinstance(c.b,np.ndarray))
self.assertTrue(isinstance(c.A,coo_matrix))
self.assertTrue(isinstance(c.f,np.ndarray))
self.assertTrue(isinstance(c.J,coo_matrix))
self.assertEqual(c.b.size,0)
self.assertEqual(c.A.nnz,0)
self.assertTupleEqual(c.A.shape,(0,0))
self.assertEqual(c.f.size,0)
self.assertEqual(c.J.nnz,0)
self.assertTupleEqual(c.J.shape,(0,0))
# Network changes
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
constraints = [pf.Constraint('variable fixing',net),
pf.Constraint('generator active power participation',net),
pf.Constraint('PVPQ switching',net),
pf.Constraint('AC power balance',net),
pf.Constraint('DC power balance',net),
pf.Constraint('voltage set point regulation',net),
pf.Constraint('voltage regulation by transformers',net),
pf.Constraint('voltage regulation by shunts',net),
pf.Constraint('AC branch flow limits',net)]
# After updating network
list(map(lambda c: c.analyze(),constraints))
list(map(lambda c: c.eval(x0),constraints))
for c in constraints:
self.assertTrue(isinstance(c.b,np.ndarray))
self.assertTrue(isinstance(c.A,coo_matrix))
self.assertTrue(isinstance(c.f,np.ndarray))
self.assertTrue(isinstance(c.J,coo_matrix))
# Add variables
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
self.assertEqual(net.num_vars,
(2*net.num_buses +
2*net.num_generators +
net.get_num_tap_changers()+
net.get_num_phase_shifters()+
net.get_num_switched_v_shunts()+
3*net.num_batteries)*self.T)
x0 = net.get_var_values()
# Before analyzing
list(map(lambda c: c.clear_error(),constraints))
for c in constraints:
self.assertRaises(pf.ConstraintError,c.eval,x0)
list(map(lambda c: c.clear_error(),constraints))
# Do it right
list(map(lambda c: c.analyze(),constraints))
list(map(lambda c: c.eval(x0),constraints))
for c in constraints:
self.assertTrue(isinstance(c.b,np.ndarray))
self.assertTrue(isinstance(c.A,coo_matrix))
self.assertTrue(isinstance(c.f,np.ndarray))
self.assertTrue(isinstance(c.J,coo_matrix))
self.assertEqual(c.A.shape[1],net.num_vars+c.num_extra_vars)
self.assertEqual(c.J.shape[1],net.num_vars+c.num_extra_vars)
if c.f.size:
self.assertTupleEqual(c.get_H_single(0).shape,
(net.num_vars+c.num_extra_vars,net.num_vars+c.num_extra_vars))
else:
self.assertTupleEqual(c.get_H_single(0).shape,(0,0))
def test_constr_DCPF(self):
# Single period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case)
self.assertEqual(net.num_periods,1)
self.assertEqual(net.num_vars,0)
# Add vargens
load_buses = net.get_load_buses()
net.add_var_generators_from_parameters(load_buses,80.,50.,30.,5,0.05)
self.assertGreater(net.num_var_generators,0)
self.assertEqual(net.num_var_generators,len([b for b in net.buses if b.loads]))
for b in net.buses:
if b.loads:
self.assertGreater(len(b.var_generators),0)
for vargen in b.var_generators:
self.assertEqual(vargen.bus,b)
# batteries
for bat in net.batteries:
if bat.index % 2 == 0:
bat.P *= -1.
# Variables
net.set_flags('bus',
'variable',
'not slack',
'voltage angle')
net.set_flags('generator',
'variable',
'any',
'active power')
net.set_flags('load',
'variable',
'any',
'active power')
net.set_flags('variable generator',
'variable',
'any',
'active power')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('battery',
'variable',
'any',
'charging power')
self.assertEqual(net.num_vars,
(net.num_buses-net.get_num_slack_buses() +
net.num_generators +
net.num_loads +
net.num_var_generators +
net.get_num_phase_shifters()+
2*net.num_batteries))
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('DC power balance',net)
self.assertEqual(constr.name,'DC power balance')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
# Count branches attached to slack buses
r = 0
for b in net.buses:
if b.is_slack():
r += len(b.branches)
# Analyze
constr.analyze()
f = constr.f
J = constr.J
A = constr.A
b = constr.b
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,net.num_buses)
self.assertEqual(constr.A_nnz,
(net.num_generators +
net.num_loads +
net.num_var_generators +
4*net.num_branches -
2*r +
2*net.get_num_phase_shifters()+
2*net.num_batteries))
self.assertTupleEqual(b.shape,(net.num_buses,))
self.assertTupleEqual(f.shape,(0,))
self.assertTupleEqual(A.shape,(net.num_buses,net.num_vars))
self.assertEqual(A.nnz,constr.A_nnz)
self.assertTupleEqual(J.shape,(0,net.num_vars))
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(A.nnz,
(net.num_generators +
net.num_loads +
net.num_var_generators +
4*net.num_branches -
2*r +
2*net.get_num_phase_shifters()+
2*net.num_batteries))
# Extract pieces
P1 = net.get_var_projection('bus','any','voltage angle')
P2 = net.get_var_projection('generator','any','active power')
P3 = net.get_var_projection('variable generator','any','active power')
P4 = net.get_var_projection('branch','any','phase shift')
P5 = net.get_var_projection('load','any','active power')
P6 = net.get_var_projection('battery','any','charging power')
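# Split A into per-variable-type blocks via the projection matrices;
# since the projections partition the variables, the identity checked
# below should hold for any x:
#   A*x = G*p + R*r - Atheta*theta - Aphi*phi - L*l - B*Pb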
G = A*P2.T
R = A*P3.T
Atheta = -A*P1.T
Aphi = -A*P4.T
L = -A*P5.T
B = -A*P6.T
x = np.random.randn(net.num_vars)
p = P2*x
r = P3*x
theta = P1*x
phi = P4*x
l = P5*x
Pb = P6*x
self.assertLess(norm((G*p+R*r-Atheta*theta-Aphi*phi-L*l-B*Pb)-A*x),1e-10)
# Sensitivities
for bus in net.buses:
self.assertEqual(bus.sens_P_balance,0.)
self.assertEqual(bus.sens_Q_balance,0.)
new_sens = np.random.randn(net.num_buses)
constr.store_sensitivities(new_sens,None,None,None)
for bus in net.buses:
self.assertNotEqual(bus.sens_P_balance,0.)
self.assertEqual(bus.sens_Q_balance,0.)
self.assertEqual(bus.sens_P_balance,new_sens[bus.index])
# mismatches
mismatches = A*x0-b
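# Per-bus DC active power balance: injections (generators, vargens)
# minus withdrawals (loads, battery charging) minus net branch outflow
# should reproduce the A*x0-b residual entry for each bus.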
for bus in net.buses:
mis = 0
for gen in bus.generators:
mis += gen.P
for vargen in bus.var_generators:
mis += vargen.P
for load in bus.loads:
mis -= load.P
for bat in bus.batteries:
mis -= bat.P
for br in bus.branches_k:
mis -= br.P_km_DC
for br in bus.branches_m:
mis -= br.P_mk_DC
self.assertLess(np.abs(mismatches[bus.index]-mis),1e-8)
# No variables
net.clear_flags()
self.assertEqual(net.num_vars,0)
constr.del_matvec()
constr.analyze()
f1 = constr.f
J1 = constr.J
A1 = constr.A
b1 = constr.b
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,net.num_buses)
self.assertEqual(constr.A_nnz,0)
self.assertTupleEqual(b1.shape,(net.num_buses,))
self.assertTupleEqual(f1.shape,(0,))
self.assertTupleEqual(A1.shape,(net.num_buses,net.num_vars))
self.assertEqual(A1.nnz,constr.A_nnz)
self.assertTupleEqual(J1.shape,(0,net.num_vars))
x1 = net.get_var_values()
self.assertTrue(type(x1) is np.ndarray)
self.assertTupleEqual(x1.shape,(net.num_vars,))
mismatches1 = A1*x1-b1
for bus in net.buses:
mis = 0
for gen in bus.generators:
mis += gen.P
for vargen in bus.var_generators:
mis += vargen.P
for load in bus.loads:
mis -= load.P
for bat in bus.batteries:
mis -= bat.P
for br in bus.branches_k:
mis -= br.P_km_DC
for br in bus.branches_m:
mis -= br.P_mk_DC
self.assertLess(np.abs(mismatches1[bus.index_P]-mis),1e-8)
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
self.assertEqual(net.num_vars,0)
# Add vargens
load_buses = net.get_load_buses()
net.add_var_generators_from_parameters(load_buses,80.,50.,30.,5,0.05)
# batteries
for bat in net.batteries:
bat.P = np.random.randn(self.T)*10
# Variables
net.set_flags('bus',
'variable',
'not slack',
'voltage angle')
net.set_flags('generator',
'variable',
'any',
'active power')
net.set_flags('load',
'variable',
'any',
'active power')
net.set_flags('variable generator',
'variable',
'any',
'active power')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('battery',
'variable',
'any',
'charging power')
self.assertEqual(net.num_vars,
(net.num_buses-net.get_num_slack_buses() +
net.num_generators +
net.num_loads +
net.num_var_generators +
net.get_num_phase_shifters()+
2*net.num_batteries)*self.T)
x0 = net.get_var_values()
# Count branches attached to slack buses
r = 0
for b in net.buses:
if b.is_slack():
r += len(b.branches)
# Constraint
constr = pf.Constraint('DC power balance',net)
self.assertEqual(constr.name,'DC power balance')
# Analyze
constr.analyze()
A = constr.A
b = constr.b
self.assertEqual(constr.A_row, net.num_buses*self.T)
self.assertEqual(constr.A_nnz,
(net.num_generators +
net.num_loads +
net.num_var_generators +
4*net.num_branches -
2*r +
2*net.get_num_phase_shifters()+
2*net.num_batteries)*self.T)
self.assertTupleEqual(b.shape,(net.num_buses*self.T,))
self.assertTupleEqual(A.shape,(net.num_buses*self.T,net.num_vars))
self.assertEqual(A.nnz,constr.A_nnz)
# Eval
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(A.nnz,
(net.num_generators +
net.num_loads +
net.num_var_generators +
4*net.num_branches -
2*r +
2*net.get_num_phase_shifters()+
2*net.num_batteries)*self.T)
# Mismatches
mismatches = A*x0-b
for t in range(self.T):
for bus in net.buses:
mis = 0
for gen in bus.generators:
mis += gen.P[t]
for vargen in bus.var_generators:
mis += vargen.P[t]
for load in bus.loads:
mis -= load.P[t]
for bat in bus.batteries:
mis -= bat.P[t]
for br in bus.branches_k:
mis -= br.P_km_DC[t]
for br in bus.branches_m:
mis -= br.P_mk_DC[t]
self.assertLess(np.abs(mismatches[bus.index_t[t]]-mis),1e-8)
# No variables
net.clear_flags()
self.assertEqual(net.num_vars,0)
constr.del_matvec()
constr.analyze()
A1 = constr.A
b1 = constr.b
x1 = net.get_var_values()
self.assertTupleEqual(x1.shape,(0,))
mismatches1 = A1*x1-b1
for t in range(self.T):
for bus in net.buses:
mis = 0
for gen in bus.generators:
mis += gen.P[t]
for vargen in bus.var_generators:
mis += vargen.P[t]
for load in bus.loads:
mis -= load.P[t]
for bat in bus.batteries:
mis -= bat.P[t]
for br in bus.branches_k:
mis -= br.P_km_DC[t]
for br in bus.branches_m:
mis -= br.P_mk_DC[t]
self.assertLess(np.abs(mismatches1[bus.index_P[t]]-mis),1e-8)
# Sensitivities
net.clear_sensitivities()
lam = np.random.randn(net.num_buses*net.num_periods)
self.assertEqual(lam.size, constr.A.shape[0])
for t in range(net.num_periods):
for bus in net.buses:
self.assertEqual(bus.sens_P_balance[t], 0.)
self.assertEqual(bus.sens_Q_balance[t], 0.)
constr.store_sensitivities(lam, None, None, None)
for t in range(net.num_periods):
for bus in net.buses:
self.assertEqual(bus.sens_P_balance[t], lam[bus.index_P[t]])
self.assertNotEqual(bus.sens_P_balance[t], 0.)
self.assertEqual(bus.sens_Q_balance[t], 0.)
def test_constr_DCPF_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'any',
'voltage angle')
net.set_flags('generator',
'variable',
'any',
'active power')
net.set_flags('load',
'variable',
'any',
'active power')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
self.assertEqual(net.num_vars,
(net.num_buses +
net.num_generators +
net.num_loads +
net.get_num_phase_shifters())*self.T)
x0 = net.get_var_values()
constr0 = pf.Constraint('DC power balance', net)
constr0.analyze()
constr0.eval(x0)
buses = net.buses[:10]
side = []
for bus in buses:
for gen in bus.generators:
gen.outage = True
for br in bus.branches_k:
self.assertTrue(bus.is_equal(br.bus_k))
br.outage = True
side.append(br.bus_m)
for br in bus.branches_m:
self.assertTrue(bus.is_equal(br.bus_m))
br.outage = True
side.append(br.bus_k)
constr1 = pf.Constraint('DC power balance', net)
constr1.analyze()
constr1.eval(x0)
f0 = constr0.A*x0-constr0.b
f1 = constr1.A*x0-constr1.b
for bus in net.buses:
if bus not in buses+side:
for t in range(self.T):
i = bus.index_P[t]
self.assertLess(np.abs(f0[i]-f1[i]), 1e-8)
for bus in buses:
for t in range(self.T):
i = bus.index_P[t]
dp = 0.
for gen in bus.generators:
self.assertTrue(gen.is_on_outage())
dp += gen.P[t]
for br in bus.branches_k:
dp -= br.P_km_DC[t]
for br in bus.branches_m:
dp -= br.P_mk_DC[t]
self.assertLess(np.abs(f1[i]+dp-f0[i]), 1e-8)
def test_constr_DC_FLOW_LIM(self):
# Single period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case)
self.assertEqual(net.num_periods,1)
self.assertEqual(net.num_vars,0)
# Variables
net.set_flags('bus',
'variable',
'not slack',
'voltage angle')
self.assertEqual(net.num_vars,net.num_buses-net.get_num_slack_buses())
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('DC branch flow limits',net)
self.assertEqual(constr.name,'DC branch flow limits')
# Num constr
num_constr = len([br for br in net.branches if br.ratingA != 0.])
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.G_row,0)
# Analyze
constr.analyze()
f = constr.f
J = constr.J
A = constr.A
b = constr.b
l = constr.l
u = constr.u
G = constr.G
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.G_row,num_constr)
self.assertTupleEqual(b.shape,(0,))
self.assertTupleEqual(f.shape,(0,))
self.assertTupleEqual(l.shape,(num_constr,))
self.assertTupleEqual(u.shape,(num_constr,))
self.assertTupleEqual(A.shape,(0,net.num_vars))
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertTupleEqual(G.shape,(num_constr,net.num_vars))
self.assertEqual(G.nnz,constr.G_nnz)
self.assertTrue(np.all(l <= u))
num = 0
for br in net.branches:
if br.ratingA == 0.:
continue
if not br.bus_k.is_slack():
num += 1
if not br.bus_m.is_slack():
num += 1
self.assertEqual(num,constr.G_nnz)
counter = 0
index = 0
for bus in net.buses:
for br in bus.branches_k:
if br.ratingA == 0.:
continue
off = 0
if br.bus_k.is_slack():
off = br.b*br.bus_k.v_ang
else:
self.assertEqual(G.row[counter],index)
self.assertEqual(G.col[counter],br.bus_k.index_v_ang)
self.assertEqual(G.data[counter],-br.b)
counter += 1
if br.bus_m.is_slack():
off = -br.b*br.bus_m.v_ang
else:
self.assertEqual(G.row[counter],index)
self.assertEqual(G.col[counter],br.bus_m.index_v_ang)
self.assertEqual(G.data[counter],br.b)
counter += 1
rating = br.ratingA
self.assertEqual(l[index],-rating+off-br.b*br.phase)
self.assertEqual(u[index],rating+off-br.b*br.phase)
index += 1
self.assertEqual(counter,G.nnz)
self.assertEqual(index,G.shape[0])
# Flow
Gx0 = constr.G*x0
self.assertTupleEqual(Gx0.shape,(num_constr,))
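# The checks below imply the DC flow convention
#   P_km_DC = -b*(theta_k - theta_m - phase),
# where slack-bus angles (not variables) are folded into the l/u bounds
# and the phase shift into the constant offset, rather than into G.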
index = 0
for bus in net.buses:
for branch in bus.branches_k:
if branch.ratingA == 0.:
continue
bus1 = branch.bus_k
bus2 = branch.bus_m
if bus1.is_slack():
flow = Gx0[index]-branch.b*(bus1.v_ang-branch.phase)
elif bus2.is_slack():
flow = Gx0[index]-branch.b*(-bus2.v_ang-branch.phase)
else:
flow = Gx0[index]-branch.b*(-branch.phase)
self.assertLess(np.abs(branch.P_km_DC-flow),1e-10)
index += 1
# Sensitivities
index = 0
for branch in net.branches:
self.assertEqual(branch.sens_P_u_bound,0.)
self.assertEqual(branch.sens_P_l_bound,0.)
mu = np.random.randn(num_constr)
pi = np.random.randn(num_constr)
self.assertEqual(constr.G.shape[0],num_constr)
constr.store_sensitivities(None,None,mu,pi)
for bus in net.buses:
for branch in bus.branches_k:
if branch.ratingA == 0.:
continue
self.assertEqual(branch.sens_P_u_bound,mu[index])
self.assertEqual(branch.sens_P_l_bound,pi[index])
index += 1
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.G_row,num_constr)
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
self.assertEqual(net.num_vars,0)
# Nonzero angles
for bus in net.buses:
bus.v_ang = np.random.randn()*np.ones(self.T)
# Variables
net.set_flags('bus',
'variable',
'not slack',
'voltage angle')
self.assertEqual(net.num_vars,(net.num_buses-net.get_num_slack_buses())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Num constr
num_constr = len([br for br in net.branches if br.ratingA != 0.])
# Constraint
constr = pf.Constraint('DC branch flow limits',net)
self.assertEqual(constr.name,'DC branch flow limits')
constr.analyze()
G = constr.G
l = constr.l
u = constr.u
self.assertTupleEqual(l.shape,(num_constr*self.T,))
self.assertTupleEqual(u.shape,(num_constr*self.T,))
self.assertTupleEqual(G.shape,(num_constr*self.T,net.num_vars))
Projs = []
for t in range(self.T):
Projs.append(net.get_var_projection('all','any','all',t,t))
Gs = [G*P.T for P in Projs]
x0s = [P*x0 for P in Projs]
Gx0s = [(Gs[t]*x0s[t])[t*num_constr:(t+1)*num_constr] for t in range(self.T)]
ls = [l[t*num_constr:(t+1)*num_constr] for t in range(self.T)]
us = [u[t*num_constr:(t+1)*num_constr] for t in range(self.T)]
for t in range(self.T):
self.assertLessEqual(norm(Gx0s[t]-Gx0s[0]),1e-10*norm(Gx0s[0]))
self.assertLessEqual(norm(ls[t]-ls[0]),1e-10*norm(ls[0]))
self.assertLessEqual(norm(us[t]-us[0]),1e-10*norm(us[0]))
def test_constr_DC_FLOW_LIM_with_outages(self):
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case, self.T)
self.assertEqual(net.num_periods, self.T)
self.assertEqual(net.num_vars,0)
# Variables
net.set_flags('bus',
'variable',
'not slack',
'voltage angle')
self.assertEqual(net.num_vars,(net.num_buses-net.get_num_slack_buses())*self.T)
x0 = net.get_var_values()
constr = pf.Constraint('DC branch flow limits', net)
constr.analyze()
constr.eval(x0)
num_constr = len([br for br in net.branches if br.ratingA != 0.])*self.T
self.assertEqual(constr.G.shape[0], num_constr)
for branch in net.branches:
branch.outage = True
constr.analyze()
self.assertEqual(constr.G.shape[0], 0)
def test_constr_LINPF(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# load
if sum([l.P[0] for l in net.loads]) < 0:
lmin = np.min([l.P for l in net.loads])
for l in net.loads:
l.P = l.P + np.abs(lmin)
# add vargens
load_buses = net.get_load_buses()
net.add_var_generators_from_parameters(load_buses,80.,50.,30.,5,0.05)
self.assertGreater(net.num_var_generators,0)
self.assertEqual(net.num_var_generators,len(load_buses))
for vargen in net.var_generators:
vargen.Q = np.abs(vargen.P)
for t in range(self.T):
self.assertGreater(vargen.Q[t],0.)
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'slack',
'active power')
net.set_flags('generator',
'variable',
'regulator',
'reactive power')
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
net.set_flags('variable generator',
'variable',
'any',
['active power','reactive power'])
self.assertEqual(net.num_vars,
(2*net.get_num_buses() +
net.get_num_slack_gens() +
net.get_num_reg_gens() +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
net.num_var_generators*2)*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('linearized AC power balance',net)
self.assertEqual(constr.name,'linearized AC power balance')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
num_Annz = (net.num_buses*4 +
net.get_num_branches()*8 +
net.get_num_tap_changers()*4 +
net.get_num_phase_shifters()*4 +
net.get_num_switched_v_shunts() +
net.get_num_slack_gens() +
net.get_num_reg_gens()+
net.num_var_generators*2)
constr.analyze()
self.assertEqual(constr.A_nnz,0)
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
# After
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(2*net.num_buses*self.T,))
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(2*net.num_buses*self.T,net.num_vars))
self.assertEqual(A.nnz,num_Annz*self.T)
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,net.num_vars))
self.assertEqual(G.nnz,0)
self.assertTrue(not np.any(np.isinf(b)))
self.assertTrue(not np.any(np.isnan(b)))
# Check with ACPF
constrPF = pf.Constraint('AC power balance',net)
self.assertEqual(constrPF.name,'AC power balance')
constrPF.analyze()
constrPF.eval(x0)
self.assertEqual(A.nnz,constrPF.J.nnz)
self.assertTrue(np.all(A.row == constrPF.J.row))
self.assertTrue(np.all(A.col == constrPF.J.col))
self.assertTrue(np.all(A.data == constrPF.J.data))
self.assertGreater(norm(A.row),0)
self.assertGreater(norm(A.col),0)
self.assertGreater(norm(A.data),0)
self.assertGreater(norm(b),0)
self.assertLess(norm(b-(constrPF.J*x0-constrPF.f)),1e-10*(norm(b)+1))
# After eval
constr.eval(np.zeros(x0.size))
self.assertEqual(constr.A.nnz,constrPF.J.nnz)
self.assertTrue(np.all(constr.A.row == constrPF.J.row))
self.assertTrue(np.all(constr.A.col == constrPF.J.col))
self.assertTrue(np.all(constr.A.data == constrPF.J.data))
self.assertGreater(norm(constr.A.row),0)
self.assertGreater(norm(constr.A.col),0)
self.assertGreater(norm(constr.A.data),0)
self.assertGreater(norm(constr.b),0)
self.assertLess(norm(constr.b-(constrPF.J*x0-constrPF.f)),1e-10*(norm(b)+1))
def test_constr_LINPF_with_outages(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
for gen in net.generators:
gen.outage = True
for branch in net.branches:
branch.outage = True
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'slack',
'active power')
net.set_flags('generator',
'variable',
'regulator',
'reactive power')
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
net.set_flags('variable generator',
'variable',
'any',
['active power','reactive power'])
self.assertEqual(net.num_vars,
(2*net.get_num_buses() +
net.get_num_slack_gens() +
net.get_num_reg_gens() +
net.get_num_tap_changers() +
net.get_num_phase_shifters() +
net.get_num_switched_v_shunts() +
net.num_var_generators*2)*self.T)
constr = pf.Constraint('linearized AC power balance',net)
constr.analyze()
x0 = net.get_var_values()
constrPF = pf.Constraint('AC power balance',net)
constrPF.analyze()
constrPF.eval(x0)
self.assertEqual(constr.A.nnz,constrPF.J.nnz)
self.assertTrue(np.all(constr.A.row == constrPF.J.row))
self.assertTrue(np.all(constr.A.col == constrPF.J.col))
self.assertTrue(np.all(constr.A.data == constrPF.J.data))
if net.num_shunts:
self.assertGreater(norm(constr.A.row),0)
self.assertGreater(norm(constr.A.col),0)
self.assertGreater(norm(constr.A.data),0)
self.assertGreater(norm(constr.b),0)
self.assertLess(norm(constr.b-(constrPF.J*x0-constrPF.f)),1e-10*(norm(constr.b)+1))
def test_constr_GEN_RAMP(self):
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
self.assertEqual(net.num_vars,0)
# Gens
for gen in net.generators:
gen.dP_max = np.random.rand()*100.
gen.P_prev = np.random.rand()*10.
gen.P = np.random.rand()*20
# Vars
net.set_flags('generator',
'variable',
'not slack',
'active power')
num = net.num_generators-net.get_num_slack_gens()
self.assertEqual(net.num_vars,num*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('generator ramp limits',net)
self.assertEqual(constr.name,'generator ramp limits')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
l = constr.l
G = constr.G
u = constr.u
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
constr.analyze()
self.assertEqual(constr.A_nnz,0)
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
l = constr.l
G = constr.G
u = constr.u
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,net.num_vars))
self.assertEqual(A.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(num*self.T,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(num*self.T,))
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(num*self.T,net.num_vars))
self.assertEqual(G.nnz,num*(1 + (self.T-1)*2))
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
for t in range(self.T):
for gen in net.generators:
if not gen.is_slack():
ac = np.where(G.col == gen.index_P[t])[0]
# Last time
if t == self.T-1:
self.assertEqual(ac.size,1)
i = G.row[ac[0]]
self.assertEqual(G.data[ac[0]],1.)
self.assertEqual(l[i],-gen.dP_max)
self.assertEqual(u[i],gen.dP_max)
ar = np.where(G.row == i)[0]
self.assertEqual(ar.size,2)
for j in ar:
if G.col[j] == gen.index_P[t]:
pass
else:
self.assertEqual(G.col[j],gen.index_P[t-1])
self.assertEqual(G.data[j],-1.)
# Not last time
else:
self.assertEqual(ac.size,2)
for i in ac:
self.assertEqual(G.col[i],gen.index_P[t])
# added
if G.data[i] == -1.:
self.assertEqual(l[G.row[i]],-gen.dP_max)
self.assertEqual(u[G.row[i]],gen.dP_max)
ar = np.where(G.row == G.row[i])[0]
self.assertEqual(ar.size,2)
for j in ar:
if G.col[j] == gen.index_P[t]:
pass
else:
self.assertEqual(G.col[j],gen.index_P[t+1])
self.assertEqual(G.data[j],1.)
# subtracted
else:
if t == 0:
self.assertEqual(l[G.row[i]],-gen.dP_max+gen.P_prev)
self.assertEqual(u[G.row[i]],gen.dP_max+gen.P_prev)
else:
self.assertEqual(l[G.row[i]],-gen.dP_max)
self.assertEqual(u[G.row[i]],gen.dP_max)
ar = np.where(G.row == G.row[i])[0]
self.assertEqual(ar.size,2)
for j in ar:
if G.col[j] == gen.index_P[t]:
pass
else:
self.assertEqual(G.col[j],gen.index_P[t-1])
self.assertEqual(G.data[j],-1.)
def test_constr_GEN_RAMP_with_outages(self):
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
# Vars
net.set_flags('generator',
'variable',
'not slack',
'active power')
num = net.num_generators-net.get_num_slack_gens()
self.assertEqual(net.num_vars,num*self.T)
x0 = net.get_var_values()
# Constraint
constr = pf.Constraint('generator ramp limits',net)
constr.analyze()
self.assertEqual(constr.A.shape[0], 0)
self.assertGreater(constr.G.shape[0], 0)
for gen in net.generators:
gen.outage = True
constr.analyze()
self.assertEqual(constr.A.shape[0], 0)
self.assertEqual(constr.G.shape[0], 0)
def test_constr_AC_FLOW_LIM(self):
# Constants
h = 1e-11
tol = 1e-2
eps = 1.1 # %
param = 1e-6
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
self.assertEqual(net.num_vars,
(2*net.get_num_buses() +
net.get_num_tap_changers() +
net.get_num_phase_shifters())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constr
constr = pf.Constraint('AC branch flow limits',net)
self.assertEqual(constr.name,'AC branch flow limits')
constr.analyze()
num_constr = len([br for br in net.branches if br.ratingA != 0.])*2*net.num_periods
self.assertTupleEqual(constr.f.shape,(num_constr,))
self.assertEqual(constr.J_row,num_constr)
# zero ratings
for br in net.branches:
if br.ratingA == 0.:
br.ratingA = 100.
# Constraint
constr = pf.Constraint('AC branch flow limits',net)
self.assertEqual(constr.name,'AC branch flow limits')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# Before
self.assertEqual(constr.num_extra_vars,0)
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.G_row,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
self.assertEqual(constr.num_extra_vars,0)
num_constr = net.get_num_branches()*2*self.T
num_Jnnz = (net.get_num_branches()*8 +
net.get_num_tap_changers()*2 +
net.get_num_phase_shifters()*2)*self.T+num_constr
constr.analyze()
self.assertEqual(num_Jnnz,constr.J_nnz)
self.assertEqual(num_constr,constr.G_nnz)
self.assertEqual(num_constr,constr.J_row)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# After analyze
self.assertEqual(constr.num_extra_vars,num_constr)
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(num_constr,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(num_constr,net.num_vars+num_constr))
self.assertEqual(J.nnz,num_Jnnz)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,net.num_vars+num_constr))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(num_constr,net.num_vars+num_constr))
self.assertEqual(G.nnz,num_constr)
self.assertTrue(np.all(G.row == np.array(range(num_constr))))
self.assertTrue(np.all(G.col == np.array(range(net.num_vars,net.num_vars+num_constr))))
self.assertTrue(np.all(G.row == G.col-net.num_vars))
self.assertTrue(np.all(G.data == 1.))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(num_constr,))
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(num_constr,))
J_row = 0
for t in range(net.num_periods):
for bus in net.buses:
for branch in bus.branches_k:
#i = t*net.num_branches*2+2*branch.index
self.assertEqual(u[J_row],branch.ratingA)
self.assertEqual(u[J_row+1],branch.ratingA)
self.assertEqual(l[J_row],-branch.ratingA)
self.assertEqual(l[J_row+1],-branch.ratingA)
J_row += 2
# Row info
index = 0
for t in range(net.num_periods):
for bus in net.buses:
for branch in bus.branches_k:
if branch.ratingA != 0:
skmJ = constr.get_J_row_info_string(index)
smkJ = constr.get_J_row_info_string(index+1)
self.assertEqual(skmJ,"AC branch flow limits:branch:%d:%s:%d" %(branch.index,"km",t))
self.assertEqual(smkJ,"AC branch flow limits:branch:%d:%s:%d" %(branch.index,"mk",t))
skmG = constr.get_G_row_info_string(index)
smkG = constr.get_G_row_info_string(index+1)
self.assertEqual(skmG,"AC branch flow limits:branch:%d:%s:%d" %(branch.index,"km",t))
self.assertEqual(smkG,"AC branch flow limits:branch:%d:%s:%d" %(branch.index,"mk",t))
index += 2
# Hessian structure
for i in range(constr.J.shape[0]):
H = constr.get_H_single(i)
self.assertTupleEqual(H.shape,(net.num_vars+num_constr,net.num_vars+num_constr))
self.assertTrue(np.all(H.row >= H.col))
Hcomb = constr.H_combined
H_comb_nnz = 2*(net.num_branches*10 +
net.get_num_tap_changers()*5+
net.get_num_phase_shifters()*5)*self.T
self.assertTupleEqual(Hcomb.shape,(net.num_vars+num_constr,net.num_vars+num_constr))
self.assertTrue(np.all(Hcomb.row >= Hcomb.col))
self.assertEqual(Hcomb.nnz,H_comb_nnz)
y_init = constr.init_extra_vars
self.assertEqual(y_init.size,constr.num_extra_vars)
self.assertEqual(y_init.size,constr.f.size)
self.assertTrue(np.all(y_init == 0.))
constr.eval(x0)
y0 = np.random.randn(num_constr)
constr.eval(x0,y0)
self.assertEqual(num_constr,constr.J_row)
self.assertEqual(0,constr.G_nnz)
self.assertEqual(num_Jnnz,constr.J_nnz)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
constr.combine_H(np.ones(f.size),False)
Hcomb = constr.H_combined
# After eval
self.assertTrue(not np.any(np.isinf(f)))
self.assertTrue(not np.any(np.isnan(f)))
# Projections
P1 = constr.get_var_projection()
P2 = constr.get_extra_var_projection()
self.assertTrue(isinstance(P1,coo_matrix))
self.assertTrue(isinstance(P2,coo_matrix))
self.assertEqual(P1.shape[0],net.num_vars)
self.assertEqual(P2.shape[0],constr.num_extra_vars)
self.assertEqual(P1.shape[1],net.num_vars+constr.num_extra_vars)
self.assertEqual(P2.shape[1],net.num_vars+constr.num_extra_vars)
self.assertEqual(P1.nnz,net.num_vars)
self.assertEqual(P2.nnz,constr.num_extra_vars)
self.assertLess(np.linalg.norm(x0-P1*np.hstack((x0,y0))),1e-12)
self.assertLess(np.linalg.norm(y0-P2*np.hstack((x0,y0))),1e-12)
# Cross check current magnitudes
J_row = 0
for t in range(net.num_periods):
for bus in net.buses:
for branch in bus.branches_k:
Pkm = branch.get_P_km()[t]
Qkm = branch.get_Q_km()[t]
Pmk = branch.get_P_mk()[t]
Qmk = branch.get_Q_mk()[t]
vk = branch.bus_k.v_mag[t]
vm = branch.bus_m.v_mag[t]
ikmmag = branch.get_i_km_mag(eps=param)[t]
imkmag = branch.get_i_mk_mag(eps=param)[t]
error_km = 100.*np.abs(ikmmag-f[J_row]-y0[J_row])/max([ikmmag,tol])
error_mk = 100.*np.abs(imkmag-f[J_row+1]-y0[J_row+1])/max([imkmag,tol])
self.assertLess(error_km,eps)
self.assertLess(error_mk,eps)
J_row += 2
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check 1
h = 1e-12
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
y0,
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check 2
coeff = np.random.randn(constr.f.shape[0])
constr.eval(x0,y0)
constr.combine_H(coeff,False)
H = constr.H_combined.copy()
H_manual = 0
for i in range(constr.f.size):
Hi = constr.get_H_single(i)
H_manual = H_manual + coeff[i]*Hi
diff = coo_matrix(H_manual-H)
self.assertLess(norm(diff.data)/norm(H.data),1e-12)
# Sensitivities
net.clear_sensitivities()
for t in range(net.num_periods):
for branch in net.branches:
self.assertEqual(branch.sens_i_mag_u_bound[t], 0.)
mu = np.random.randn(constr.J.shape[0])
self.assertEqual(mu.size, constr.G.shape[0])
constr.store_sensitivities(None, np.zeros(mu.size), mu, np.zeros(mu.size))
G_row = 0
for t in range(net.num_periods):
for bus in net.buses:
for branch in bus.branches_k:
if np.abs(mu[G_row]) > np.abs(mu[G_row+1]):
self.assertEqual(branch.sens_i_mag_u_bound[t], mu[G_row])
else:
self.assertEqual(branch.sens_i_mag_u_bound[t], mu[G_row+1])
G_row += 2
# Single period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,1)
self.assertEqual(net.num_periods,1)
net.set_flags('bus',['variable','bounded'],'any','voltage magnitude')
net.set_flags('bus','variable','not slack','voltage angle')
self.assertEqual(net.num_vars,2*net.num_buses-net.get_num_slack_buses())
if len([b for b in net.branches if b.ratingA != 0.]) == 0:
continue
constr = pf.Constraint('AC branch flow limits',net)
constr.analyze()
self.assertGreater(constr.num_extra_vars,0)
# Single Hessian check
x0 = net.get_var_values()
y0 = np.zeros(constr.num_extra_vars)
constr.eval(x0,y0)
for i in range(10):
j = np.random.randint(0,constr.f.size)
constr.eval(x0,y0)
g0 = constr.J.tocsr()[j,:].toarray().flatten()
H0lt = constr.get_H_single(j).copy()
self.assertTrue(np.all(H0lt.row >= H0lt.col)) # lower triangular
H0 = (H0lt + H0lt.T) - triu(H0lt)
d = np.random.randn(net.num_vars+constr.num_extra_vars)
x = x0 + h*d[:net.num_vars]
y = y0 + h*d[net.num_vars:]
constr.eval(x,y)
g1 = constr.J.tocsr()[j,:].toarray().flatten()
Hd_exact = H0*d
Hd_approx = (g1-g0)/h
error = 100.*norm(Hd_exact-Hd_approx)/np.maximum(norm(Hd_exact),tol)
self.assertLessEqual(error,EPS)
# Combined Hessian check
x0 = net.get_var_values()
y0 = np.zeros(constr.num_extra_vars)
lam = np.random.randn(constr.f.size)
constr.eval(x0,y0)
constr.combine_H(lam)
h = 1e-11
F0 = np.dot(constr.f,lam)
GradF0 = constr.J.T*lam
HessF0lt = constr.H_combined.copy()
self.assertTrue(np.all(HessF0lt.row >= HessF0lt.col)) # lower triangular
HessF0 = (HessF0lt + HessF0lt.T - triu(HessF0lt))
for i in range(10):
d = np.random.randn(x0.size+y0.size)
x = x0 + h*d[:x0.size]
y = y0 + h*d[x0.size:]
constr.eval(x,y)
F1 = np.dot(constr.f,lam)
GradF1 = constr.J.T*lam
Jd_exact = np.dot(GradF0,d)
Jd_approx = (F1-F0)/h
Hd_exact = HessF0*d
Hd_approx = (GradF1-GradF0)/h
errorJ = 100.*norm(Jd_exact-Jd_approx)/norm(Jd_exact)
errorH = 100.*norm(Hd_exact-Hd_approx)/norm(Hd_exact)
self.assertLess(errorJ,EPS)
self.assertLess(errorH,EPS)
def test_constr_AC_FLOW_LIM_with_outages(self):
# Constants
h = 1e-11
tol = 1e-2
eps = 1.1 # %
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
self.assertEqual(net.num_vars,
(2*net.get_num_buses() +
net.get_num_tap_changers() +
net.get_num_phase_shifters())*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
for branch in net.branches:
branch.outage = True
# Constr
constr = pf.Constraint('AC branch flow limits',net)
constr.analyze()
constr.eval(x0)
self.assertEqual(constr.f.size, 0)
self.assertTupleEqual(constr.J.shape, (0, net.num_vars))
self.assertEqual(constr.l.size, 0)
self.assertEqual(constr.u.size, 0)
self.assertTupleEqual(constr.G.shape, (0, net.num_vars))
# Jacobian check
pf.tests.utils.check_constraint_Jacobian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Single Hessian check
pf.tests.utils.check_constraint_single_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
# Combined Hessian check 1
h = 1e-12
pf.tests.utils.check_constraint_combined_Hessian(self,
constr,
x0,
np.zeros(0),
NUM_TRIALS,
TOL,
EPS,
h)
def test_constr_DUMMY(self):
# Multiperiod
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Too big
if net.num_buses > 1000:
continue
# Add vargens
load_buses = net.get_load_buses()
net.add_var_generators_from_parameters(load_buses,80.,50.,30.,5,0.05)
self.assertGreater(net.num_var_generators,0)
self.assertEqual(net.num_var_generators,len([b for b in net.buses if b.loads]))
for b in net.buses:
if b.loads:
self.assertGreater(len(b.var_generators),0)
for vargen in b.var_generators:
self.assertEqual(vargen.bus,b)
# batteries
for bat in net.batteries:
if bat.index % 2 == 0:
bat.P *= -1.
# Variables
net.set_flags('bus',
'variable',
'not slack',
'voltage angle')
net.set_flags('generator',
'variable',
'any',
'active power')
net.set_flags('load',
'variable',
'any',
'active power')
net.set_flags('variable generator',
'variable',
'any',
'active power')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('battery',
'variable',
'any',
'charging power')
self.assertEqual(net.num_vars,
(net.num_buses-net.get_num_slack_buses() +
net.num_generators +
net.num_loads +
net.num_var_generators +
net.get_num_phase_shifters()+
2*net.num_batteries)*net.num_periods)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Ref constraint
constrREF = pf.Constraint('DC power balance',net)
self.assertEqual(constrREF.name,'DC power balance')
# Dummy constraint
constr = pf.constraints.DummyDCPF(net)
self.assertEqual(constr.name,'dummy DC power balance')
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.A_row,constrREF.A_row)
self.assertEqual(constr.A_nnz,constrREF.A_nnz)
self.assertEqual(constr.b.size,0)
self.assertEqual(constr.A.shape[0],0)
self.assertEqual(constr.A.shape[1],0)
self.assertEqual(constr.A.nnz,0)
constrREF.analyze()
constr.analyze()
self.assertEqual(constr.A_row,net.num_buses*self.T)
self.assertGreater(constr.A_nnz,0)
self.assertEqual(constr.A_row,constrREF.A_row)
self.assertEqual(constr.A_nnz,constrREF.A_nnz)
self.assertTrue(np.all(constr.b == constrREF.b))
self.assertTrue(np.all(constr.A.row == constrREF.A.row))
self.assertTrue(np.all(constr.A.col == constrREF.A.col))
self.assertTrue(np.all(constr.A.data == constrREF.A.data))
self.assertTupleEqual(constr.l.shape,(0,))
self.assertTupleEqual(constr.u.shape,(0,))
self.assertTupleEqual(constr.f.shape,(0,))
self.assertTupleEqual(constr.G.shape,(0,net.num_vars))
self.assertTupleEqual(constr.J.shape,(0,net.num_vars))
constrREF.eval(net.get_var_values())
constr.eval(net.get_var_values())
self.assertTrue(np.all(constr.b == constrREF.b))
self.assertTrue(np.all(constr.A.row == constrREF.A.row))
self.assertTrue(np.all(constr.A.col == constrREF.A.col))
self.assertTrue(np.all(constr.A.data == constrREF.A.data))
def test_constr_BAT_DYN(self):
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,5)
self.assertEqual(net.num_periods,5)
self.assertEqual(net.num_vars,0)
# Add batteries
gen_buses = net.get_generator_buses()
net.add_batteries_from_parameters(gen_buses,20.,40.,0.8,0.7)
self.assertEqual(net.num_batteries,len(gen_buses))
self.assertGreater(net.num_batteries,0)
# Vars
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
self.assertEqual(net.num_vars,5*3*net.num_batteries)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('battery dynamics',net)
self.assertEqual(constr.name,'battery dynamics')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
l = constr.l
G = constr.G
u = constr.u
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
constr.analyze()
self.assertEqual(constr.A_row,(5+1)*net.num_batteries)
self.assertEqual(constr.A_nnz,5*4*net.num_batteries)
self.assertEqual(constr.G_nnz,0)
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
l = constr.l
G = constr.G
u = constr.u
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(6*net.num_batteries,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(6*net.num_batteries,net.num_vars))
self.assertEqual(A.nnz,5*4*net.num_batteries)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,net.num_vars))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
for t in range(5):
for bat in net.batteries:
self.assertTrue(bat.has_flags('variable',['charging power','energy level']))
aPc = np.where(A.col == bat.index_Pc[t])[0]
aPd = np.where(A.col == bat.index_Pd[t])[0]
aE = np.where(A.col == bat.index_E[t])[0]
if t < 5-1:
aEE = np.where(A.col == bat.index_E[t+1])[0]
self.assertEqual(aPc.size,1)
self.assertEqual(aPd.size,1)
eq_row = A.row[aPc[0]]
self.assertEqual(eq_row,A.row[aPd[0]])
self.assertEqual(A.data[aPc[0]],-bat.eta_c)
self.assertEqual(A.data[aPd[0]],1./bat.eta_d)
if t == 0:
self.assertEqual(aE.size,2)
# init eq
j = aE[0]
self.assertEqual(A.data[j],1.)
self.assertEqual(b[A.row[j]],bat.E_init)
self.assertEqual(np.where(A.row == A.row[j])[0].size,1)
# update eq E_{t+1} - E_t - eta_c Pc_t + (1/eta_d) Pd_t = 0
j = aE[1]
self.assertEqual(A.data[j],-1.)
self.assertEqual(b[A.row[j]],0.)
self.assertEqual(np.where(A.row == A.row[j])[0].size,4)
self.assertEqual(A.row[j],eq_row)
self.assertEqual(A.row[j],A.row[aEE[0]])
elif t < 5-1:
self.assertEqual(aE.size,2)
# update eq E_t - E_{t-1} - eta_c Pc_{t-1} + (1/eta_d) Pd_{t-1} = 0
j = aE[0]
self.assertEqual(A.data[j],1.)
self.assertEqual(b[A.row[j]],0.)
self.assertEqual(np.where(A.row == A.row[j])[0].size,4)
self.assertNotEqual(A.row[j],eq_row)
self.assertNotEqual(A.row[j],A.row[aEE[0]])
# update eq E_{t+1} - E_t - eta_c Pc_t + (1/eta_d) Pd_t = 0
j = aE[1]
self.assertEqual(A.data[j],-1.)
self.assertEqual(b[A.row[j]],0.)
self.assertEqual(np.where(A.row == A.row[j])[0].size,4)
self.assertEqual(A.row[j],eq_row)
self.assertEqual(A.row[j],A.row[aEE[0]])
else:
self.assertEqual(aE.size,2)
# update eq E_t - E_{t-1} - eta_c Pc_{t-1} + (1/eta_d) Pd_{t-1} = 0
j = aE[0]
self.assertEqual(A.data[j],1.)
self.assertEqual(b[A.row[j]],0.)
self.assertEqual(np.where(A.row == A.row[j])[0].size,4)
self.assertNotEqual(A.row[j],eq_row)
# update eq - E_t - eta_c Pc_t + (1/eta_d) Pd_t = -E_final
j = aE[1]
self.assertEqual(A.data[j],-1.)
self.assertEqual(b[A.row[j]],-bat.E_final)
self.assertEqual(np.where(A.row == A.row[j])[0].size,3)
self.assertEqual(A.row[j],eq_row)
def test_constr_BAT_DYN_with_outages(self):
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,5)
# Add batteries
gen_buses = net.get_generator_buses()
net.add_batteries_from_parameters(gen_buses,20.,40.,0.8,0.7)
self.assertEqual(net.num_batteries,len(gen_buses))
self.assertGreater(net.num_batteries,0)
# Vars
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
self.assertEqual(net.num_vars,5*3*net.num_batteries)
x0 = net.get_var_values()
# Constraint
constr0 = pf.Constraint('battery dynamics',net)
constr0.analyze()
for branch in net.branches:
branch.outage = True
for gen in net.generators:
gen.outage = True
constr1 = pf.Constraint('battery dynamics',net)
constr1.analyze()
self.assertEqual((constr1.A-constr0.A).tocoo().nnz, 0)
self.assertEqual((constr1.G-constr0.G).tocoo().nnz, 0)
self.assertLess(norm(constr1.b-constr0.b), 1e-8)
self.assertLess(norm(constr1.l-constr0.l), 1e-8)
self.assertLess(norm(constr1.u-constr0.u), 1e-8)
def test_constr_LOAD_PF(self):
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
self.assertEqual(net.num_vars,0)
# Powers
for load in net.loads:
load.P = np.random.rand(net.num_periods)
self.assertTrue(np.all(load.P > 0))
# Target power factors
for load in net.loads:
load.target_power_factor = np.random.rand()
self.assertTrue(0 < load.target_power_factor < 1.)
# Vars
net.set_flags('load',
'variable',
'any',
['active power','reactive power'])
self.assertEqual(net.num_vars,2*net.num_loads*self.T)
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('load constant power factor',net)
self.assertEqual(constr.name,'load constant power factor')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
l = constr.l
G = constr.G
u = constr.u
# Before
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
constr.analyze()
self.assertEqual(constr.A_nnz,2*net.num_loads*self.T)
self.assertEqual(constr.A_row,net.num_loads*self.T)
constr.eval(x0)
self.assertEqual(constr.A_nnz,0)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
l = constr.l
G = constr.G
u = constr.u
# After
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(net.num_loads*self.T,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(net.num_loads*self.T,net.num_vars))
self.assertEqual(A.nnz,2*net.num_loads*self.T)
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,net.num_vars))
self.assertEqual(G.nnz,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
for load in net.loads:
for t in range(net.num_periods):
indices = np.where(A.col == load.index_P[t])[0]
self.assertEqual(indices.size,1)
row = A.row[indices[0]]
indices = np.where(A.row == row)[0]
self.assertEqual(indices.size,2)
for i in indices:
if A.col[i] == load.index_P[t]:
gamma = load.target_power_factor
factor = np.sqrt((1.-gamma**2.)/(gamma**2.))
load.Q[t] = np.abs(load.P[t])*factor*(1. if load.Q[t] >= 0 else -1.)
self.assertLess(np.abs(gamma-load.power_factor[t]),1e-12)
if load.P[t]*load.Q[t] >= 0:
self.assertAlmostEqual(A.data[i],-factor)
self.assertLess(np.abs(-factor*load.P[t]+load.Q[t]),1e-12)
else:
self.assertAlmostEqual(A.data[i],factor)
self.assertLess(np.abs(factor*load.P[t]+load.Q[t]),1e-12)
else:
self.assertEqual(A.col[i],load.index_Q[t])
self.assertEqual(A.data[i],1.)
x = net.get_var_values()
self.assertLess(np.linalg.norm(constr.A*x-constr.b),1e-10)
for load in net.loads:
for t in range(net.num_periods):
self.assertAlmostEqual(load.power_factor[t],load.target_power_factor)
def test_constr_LOAD_PF_with_outages(self):
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
# Vars
net.set_flags('load',
'variable',
'any',
['active power','reactive power'])
self.assertEqual(net.num_vars,2*net.num_loads*self.T)
# Constraint
constr0 = pf.Constraint('load constant power factor',net)
constr0.analyze()
for branch in net.branches:
branch.outage = True
for gen in net.generators:
gen.outage = True
constr1 = pf.Constraint('load constant power factor',net)
constr1.analyze()
self.assertEqual((constr1.A-constr0.A).tocoo().nnz, 0)
self.assertEqual((constr1.G-constr0.G).tocoo().nnz, 0)
self.assertLess(norm(constr1.b-constr0.b), 1e-8)
self.assertLess(norm(constr1.l-constr0.l), 1e-8)
self.assertLess(norm(constr1.u-constr0.u), 1e-8)
def test_constr_AC_LIN_FLOW_LIM(self):
# Multi period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case,self.T)
self.assertEqual(net.num_periods,self.T)
# Vars
net.set_flags('bus',
'variable',
'any',
'voltage magnitude')
net.set_flags('bus',
'variable',
'not slack',
'voltage angle')
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
self.assertEqual(net.num_vars,
(2*net.get_num_buses()-net.get_num_slack_buses() +
net.get_num_tap_changers() +
net.get_num_phase_shifters())*self.T)
# Zero ratings
for br in net.branches:
if br.ratingA == 0.:
br.ratingA = 100.
x0 = net.get_var_values()
self.assertTrue(type(x0) is np.ndarray)
self.assertTupleEqual(x0.shape,(net.num_vars,))
# Constraint
constr = pf.Constraint('linearized AC branch flow limits',net)
self.assertEqual(constr.name,'linearized AC branch flow limits')
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# Before
self.assertEqual(constr.num_extra_vars,0)
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,0))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(0,0))
self.assertEqual(G.nnz,0)
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(0,))
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(0,))
self.assertEqual(constr.J_row,0)
self.assertEqual(constr.A_row,0)
self.assertEqual(constr.G_row,0)
self.assertEqual(constr.J_nnz,0)
self.assertEqual(constr.A_nnz,0)
self.assertEqual(constr.G_nnz,0)
self.assertEqual(constr.num_extra_vars,0)
# Tap ratios and phase shifts
if net.get_num_tap_changers()+net.get_num_phase_shifters() > 0:
self.assertRaises(pf.ConstraintError,constr.analyze)
constr.clear_error()
continue
# No voltage magnitude bounds
self.assertRaises(pf.ConstraintError,constr.analyze)
self.assertRaisesRegex(pf.ConstraintError,
"AC_LIN_FLOW_LIM constraint requires variable voltage magnitudes to be bounded",
constr.analyze)
constr.clear_error()
net.set_flags('bus',
'bounded',
'any',
'voltage magnitude')
self.assertEqual(net.num_bounded,net.num_buses*self.T)
constr.analyze()
self.assertGreaterEqual(constr.G_nnz,constr.G_row)
f = constr.f
J = constr.J
A = constr.A
b = constr.b
G = constr.G
l = constr.l
u = constr.u
# After analyze
self.assertEqual(constr.num_extra_vars,0)
self.assertTrue(type(f) is np.ndarray)
self.assertTupleEqual(f.shape,(0,))
self.assertTrue(type(b) is np.ndarray)
self.assertTupleEqual(b.shape,(0,))
self.assertTrue(type(J) is coo_matrix)
self.assertTupleEqual(J.shape,(0,net.num_vars))
self.assertEqual(J.nnz,0)
self.assertTrue(type(A) is coo_matrix)
self.assertTupleEqual(A.shape,(0,net.num_vars))
self.assertEqual(A.nnz,0)
self.assertTrue(type(G) is coo_matrix)
self.assertTupleEqual(G.shape,(constr.G_row,net.num_vars))
self.assertFalse(np.any(np.isnan(G.data)))
self.assertTrue(type(u) is np.ndarray)
self.assertTupleEqual(u.shape,(constr.G_row,))
self.assertFalse(np.any(np.isnan(u)))
self.assertTrue(type(l) is np.ndarray)
self.assertTupleEqual(l.shape,(constr.G_row,))
self.assertTrue(np.all(l == -1e8))
def test_constr_AC_LIN_FLOW_LIM_with_outages(self):
pass
def test_nonlinear_constr_creation(self):
# Single period
for case in test_cases.CASES:
net = pf.Parser(case).parse(case)
self.assertEqual(net.num_periods,1)
constr = pf.Constraint("variable fixing",net)
# J row
self.assertEqual(constr.J_row,0)
constr.J_row = 19
self.assertEqual(constr.J_row,19)
# J_nnz
self.assertEqual(constr.J_nnz,0)
constr.J_nnz = 17
self.assertEqual(constr.J_nnz,17)
# f
f = constr.f
self.assertEqual(f.size,0)
a = np.random.randn(15)
constr.set_f(a)
self.assertEqual(constr.f.size,15)
self.assertTrue(np.all(constr.f == a))
# J
J = constr.J
self.assertTupleEqual(J.shape,(0,0))
self.assertEqual(J.nnz,0)
Jm = coo_matrix(np.random.randn(4,3))
constr.set_J(Jm)
self.assertTrue(isinstance(constr.J,coo_matrix))
self.assertTupleEqual(constr.J.shape,Jm.shape)
self.assertTrue(np.all(constr.J.row == Jm.row))
self.assertEqual(constr.J.nnz,Jm.nnz)
self.assertTrue(np.all(constr.J.col == Jm.col))
self.assertTrue(np.all(constr.J.data == Jm.data))
# H array
self.assertEqual(constr.H_array_size,0)
constr.allocate_H_array(100)
self.assertEqual(constr.H_array_size,100)
# H single
H = constr.get_H_single(5)
self.assertTrue(isinstance(H,coo_matrix))
self.assertEqual(H.nnz,0)
self.assertTupleEqual(H.shape,(0,0))
A = coo_matrix(np.random.randn(5,4))
constr.set_H_single(5,A)
H = constr.get_H_single(5)
self.assertTrue(isinstance(H,coo_matrix))
self.assertTupleEqual(A.shape,H.shape)
self.assertTrue(np.all(A.row == H.row))
self.assertEqual(A.nnz,H.nnz)
self.assertTrue(np.all(A.col == H.col))
self.assertTrue(np.all(A.data == H.data))
# H_nnz
constr.set_H_nnz(np.zeros(50,dtype='int32'))
H_nnz = constr.H_nnz
self.assertTrue(isinstance(H_nnz,np.ndarray))
self.assertEqual(H_nnz.dtype,np.dtype('int32'))
self.assertEqual(H_nnz.size,50)
for i in range(50):
self.assertEqual(H_nnz[i],0)
constr.H_nnz[10] = 2
self.assertEqual(H_nnz[10],2)
def test_robustness_with_outages(self):
for case in test_cases.CASES:
net = pf.Parser(case).parse(case, self.T)
constraints = [pf.Constraint('variable bounds', net),
pf.Constraint('variable fixing', net),
pf.Constraint('battery dynamics', net),
pf.Constraint('generator active power participation', net),
pf.Constraint('PVPQ switching', net),
pf.Constraint('AC power balance', net), # nonlinear
pf.Constraint('DC power balance', net),
pf.Constraint('linearized AC power balance', net),
pf.Constraint('voltage set point regulation', net), # nonlinear
pf.Constraint('voltage regulation by transformers', net), # nonlinear
pf.Constraint('voltage regulation by shunts', net), # nonlinear
pf.Constraint('AC branch flow limits', net), # nonlinear
pf.Constraint('DC branch flow limits', net),
pf.Constraint('generator ramp limits', net),
pf.Constraint('load constant power factor', net)]
# Add variables
net.set_flags('bus',
'variable',
'any',
['voltage magnitude','voltage angle'])
net.set_flags('generator',
'variable',
'any',
['active power','reactive power'])
net.set_flags('branch',
'variable',
'tap changer',
'tap ratio')
net.set_flags('branch',
'variable',
'phase shifter',
'phase shift')
net.set_flags('shunt',
'variable',
'switching - v',
'susceptance')
net.set_flags('battery',
'variable',
'any',
['charging power','energy level'])
self.assertEqual(net.num_vars,
(2*net.num_buses +
2*net.num_generators +
net.get_num_tap_changers()+
net.get_num_phase_shifters()+
net.get_num_switched_v_shunts()+
3*net.num_batteries)*self.T)
x0 = net.get_var_values()
net.clear_outages()
# Analyze without outages
for c in constraints:
c.analyze()
# Eval without outages
for c in constraints:
self.assertEqual(c.state_tag, net.state_tag)
c.eval(x0)
for gen in net.generators:
gen.outage = True
for branch in net.branches:
branch.outage = True
# Eval with outages
for c in constraints:
self.assertNotEqual(c.state_tag, net.state_tag)
self.assertRaises(pf.ConstraintError,
c.eval,
x0)
# Analyze with outages
for c in constraints:
c.analyze()
# Eval with outages
for c in constraints:
self.assertEqual(c.state_tag, net.state_tag)
c.eval(x0)
net.clear_outages()
# Eval without outages
for c in constraints:
self.assertNotEqual(c.state_tag, net.state_tag)
self.assertRaises(pf.ConstraintError,
c.eval,
x0)
def tearDown(self):
pass
# --- File: waterpump/pumps/migrations/0013_auto_20190708_1126.py (repo: hwan27/WATERPUMP_django, license: MIT) ---
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('pumps', '0012_auto_20190601_0036'),
]
operations = [
migrations.AlterModelOptions(
name='pump',
options={'ordering': ['id']},
),
migrations.AlterModelOptions(
name='sector',
options={'ordering': ['id']},
),
migrations.AlterModelOptions(
name='town',
options={'ordering': ['id']},
),
migrations.AddField(
model_name='sector',
name='pump_1_auto',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='sector',
name='pump_1_current',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_1_freq',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_1_operating_rate',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_1_power',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_2_auto',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='sector',
name='pump_2_current',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_2_freq',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_2_operating_rate',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_2_power',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_3_auto',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='sector',
name='pump_3_current',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_3_freq',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_3_operating_rate',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_3_power',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_4_auto',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='sector',
name='pump_4_current',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_4_freq',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_4_operating_rate',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
migrations.AddField(
model_name='sector',
name='pump_4_power',
field=models.CharField(default=1, max_length=120),
preserve_default=False,
),
]
# --- File: jamf/api/patches_api.py (repo: jensenbox/python-jamf, license: RSA-MD) ---
"""
Jamf Pro API
## Overview This is a sample Jamf Pro server which allows for usage without any authentication. The Jamf Pro environment which supports the Try it Out functionality does not run the current beta version of Jamf Pro, thus any newly added endpoints will result in an error and should be used solely for documentation purposes. # noqa: E501
The version of the OpenAPI document: 10.25.0
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from jamf.api_client import ApiClient
from jamf.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class PatchesApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def patch_id_get(self, id, **kwargs): # noqa: E501
"""Return Active Patch Summary # noqa: E501
Returns active patch summary. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_id_get(id, async_req=True)
>>> result = thread.get()
:param id: patch id (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ActivePatchSummary
"""
kwargs['_return_http_data_only'] = True
return self.patch_id_get_with_http_info(id, **kwargs) # noqa: E501
def patch_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Return Active Patch Summary # noqa: E501
Returns active patch summary. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: patch id (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ActivePatchSummary, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_id_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `patch_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "ActivePatchSummary",
}
return self.api_client.call_api(
'/patch/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def patch_id_put(self, id, active_patch_summary, **kwargs): # noqa: E501
"""Update patch report # noqa: E501
Updates patch report. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_id_put(id, active_patch_summary, async_req=True)
>>> result = thread.get()
:param id: patch id (required)
:type id: int
:param active_patch_summary: Active patch summary. (required)
:type active_patch_summary: ActivePatchSummary
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ActivePatchSummary
"""
kwargs['_return_http_data_only'] = True
return self.patch_id_put_with_http_info(id, active_patch_summary, **kwargs) # noqa: E501
def patch_id_put_with_http_info(self, id, active_patch_summary, **kwargs): # noqa: E501
"""Update patch report # noqa: E501
Updates patch report. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_id_put_with_http_info(id, active_patch_summary, async_req=True)
>>> result = thread.get()
:param id: patch id (required)
:type id: int
:param active_patch_summary: Active patch summary. (required)
:type active_patch_summary: ActivePatchSummary
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ActivePatchSummary, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'active_patch_summary'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_id_put" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `patch_id_put`") # noqa: E501
# verify the required parameter 'active_patch_summary' is set
if self.api_client.client_side_validation and ('active_patch_summary' not in local_var_params or # noqa: E501
local_var_params['active_patch_summary'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `active_patch_summary` when calling `patch_id_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'active_patch_summary' in local_var_params:
body_params = local_var_params['active_patch_summary']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "ActivePatchSummary",
}
return self.api_client.call_api(
'/patch/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def patch_id_versions_get(self, id, **kwargs): # noqa: E501
"""Return patch versions # noqa: E501
Returns patch versions. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_id_versions_get(id, async_req=True)
>>> result = thread.get()
:param id: patch id (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[PatchVersion]
"""
kwargs['_return_http_data_only'] = True
return self.patch_id_versions_get_with_http_info(id, **kwargs) # noqa: E501
def patch_id_versions_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Return patch versions # noqa: E501
Returns patch versions. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_id_versions_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: patch id (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without head status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[PatchVersion], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_id_versions_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `patch_id_versions_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[PatchVersion]",
}
return self.api_client.call_api(
'/patch/{id}/versions', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
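# The keyword-argument filtering used above follows a common generated-client
# pattern; a stand-alone sketch (the helper name below is hypothetical, and the
# real code raises ApiTypeError rather than TypeError):
#
#     def _reject_unknown_kwargs(method, kwargs, allowed):
#         for key in kwargs:
#             if key not in allowed:
#                 raise TypeError(
#                     "Got an unexpected keyword argument '%s'"
#                     " to method %s" % (key, method))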
def patch_obj_policy_id_get(self, id, **kwargs): # noqa: E501
"""Return Patch Policy Summary # noqa: E501
Returns patch policy summary. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_obj_policy_id_get(id, async_req=True)
>>> result = thread.get()
:param id: patch policy id (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PatchPolicySummary
"""
kwargs['_return_http_data_only'] = True
return self.patch_obj_policy_id_get_with_http_info(id, **kwargs) # noqa: E501
def patch_obj_policy_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Return Patch Policy Summary # noqa: E501
Returns patch policy summary. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_obj_policy_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: patch policy id (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PatchPolicySummary, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_obj_policy_id_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `patch_obj_policy_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PatchPolicySummary",
}
return self.api_client.call_api(
'/patch/obj/policy/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def patch_svc_disclaimer_agree_post(self, **kwargs): # noqa: E501
"""Accept Patch reporting disclaimer # noqa: E501
Accept Patch reporting disclaimer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_svc_disclaimer_agree_post(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.patch_svc_disclaimer_agree_post_with_http_info(**kwargs) # noqa: E501
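# Usage sketch (the `api` instance below is a hypothetical, pre-configured
# client, an assumption not confirmed by this file):
#
#     api.patch_svc_disclaimer_agree_post()                # returns None
#     data, status, headers = \
#         api.patch_svc_disclaimer_agree_post_with_http_info()
#     # `status` is the HTTP status code; `data` is None for this endpoint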
def patch_svc_disclaimer_agree_post_with_http_info(self, **kwargs): # noqa: E501
"""Accept Patch reporting disclaimer # noqa: E501
Accept Patch reporting disclaimer # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_svc_disclaimer_agree_post_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_svc_disclaimer_agree_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/patch/svc/disclaimerAgree', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
x47\xcb\xc5\xa9\x42\xeb\x4c\x37\x14\xad\x4c\xa9\xae\xfd\xa3\x35\x7c\xff\x8f\x8d\xac\xe5\xdc\x29\x3d\xb1\xa6\x78\xe5\x01\x1a\xb7\xa2\xd6\xa8\x39\x59\x81\x6e\xba\x0b\x45\x0d\x01\xd5\x73\xf1\x13\x58\x88\xd4\x3e\xd4\xdc\x4a\x51\xab\x29\x70\x6f\x78\x79\x0a\x6f\xa1\xcd\x36\xd4\x5b\x2e\x71\x53\x8f\x07\x1a\x37\x86\xd6\x31\xc3\x08\x0b\xd6\x8b\x0a\xb1\xcc\xb7\xec\x59\x3a\xa9\xae\xc6\x71\x77\x1a\xee\x15\x85\x40\xa5\xc5\xfb\x3a\x83\xab\xa0\xc1\x95\x23\xea\x2c\xbc\xcc\x35\x1d\xe4\x13\x45\x80\x63\xf5\x5e\xa8\x45\x5f\x44\xcc\x13\x6d\x85\xd4\xaa\x4f\x15\x77\x3a\x5f\xdc\xbf\xec\x95\x31\x1e\xe2\x37\x3c\xbb\x90\x5e\x19\x9a\x02\xa7\x97\xe8\xa2\x1d\x40\xe7\x23\x80\x1d\xad\xca\x99\x9a\x06\x81\x60\xb4\x3a\xc6\x70\xc4\xe5\x40\x58\x84\x85\xf7\x59\xd7\x3a\x11\xc0\xda\x44\x24\x8e\x39\x05\x9c\x0d\x69\x0a\xf0\xf9\x3c\x80\xb2\xe7\x8e\x83\xf4\x28\xc7\xb0\xec\xa1\x1a\xd5\x92\xe9\x55\xc4\x5a\x0f\xf2\x6c\x2b\xb4\x87\xb9\xe8\x1f\x1d\xaa\x6e\x77\xe1\xb7\x59\xab\x3e\x75\x24\xf6\x4a\x39\xe8\x87\x05\x4a\x5e\xf8\x3a\x5d\x8f\xd0\x6a\xee\x08\xac\x41\xe5\x3b\x64\x73\xae\x25\x14\x77\xe3\x0d\x55\x9b\xd3\x25\xc8\xdf\x80\x08\xb0\xb1\x2e\xca\xc3\xa1\xd7\x7a\x72\xf4\x0c\x39\xef\x76\x80\xcf\xbe\x16\x75\x7f\x10\xf6\x66\xc1\x19\xb0\xa9\x95\xe5\x5f\x74\xd0\xb5\xf8\x26\x86\x86\xfe\xae\x7a\xa9\xc5\xf2\x7b\x59\xd7\xd7\xae\x1a\xc4\xfb\xd9\xc5\x2e\x29\xe4\xf7\x75\xef\xc7\x08\xa7\xa9\x03\xfc\x48\xb4\xca\xc8\xd9\x7e\x53\x61\x7a\x4d\xd6\x0f\xcd\x0c\x6a\x89\x3c\x10\x9f\x8b\x3d\x76\xf8\xad\x15\x82\xbb\x3b\xfd\x6b\x64\xab\xc6\xbc\x48\x0a\x05\xae\xd8\xdc\x77\xb1\x79\xee\x6d\xc1\xc2\x9d\x4a\xbe\xca\x82\xe1\xc9\x48\x79\xbf\xeb\xc6\x57\x4d\xde\xb5\x78\x14\xba\x24\x0d\x3c\xe7\x30\xdc\xaa\x45\x4c\x24\xa2\xe2\x63\x87\x54\x77\xbf\xb4\x34\x62\x5d\x41\x57\xf8\xbb\xcc\x52\x53\x34\x1c\x05\x87\x26\x10\x59\xde\x98\x8e\x50\x3b\x05\xcd\x51\xc2\x8b\xd6\xc5\x89\x45\xea\x93\x5d\x55\x5d\x6c\x00\x84\xeb\x46\x4a\xa7\x07\x12\x58\xed\x4a\xc6\x67\xbb\x83\x27\x73\x22\xc0\x0c\xc3\xd5\x70\x59\x5a\xd1\x2b\x02\x17\xb4\x94\x52\x7d\x4d\xe5\x43\xcd\xbf\x38\x94\
xdd\xc7\xfd\x32\x97\x84\x3e\x16\x89\x80\xb9\x4b\x81\x46\xe7\xb4\xce\x71\x26\xb1\xe8\x49\x0c\xc7\x8b\x8f\xca\x41\xc9\x00\x8b\x06\x3f\x58\x86\xaa\xa9\xbb\x81\xb0\xc7\xb1\x07\x2f\xd6\x32\xbb\x63\x57\x7a\x87\xf2\x1b\x26\xe7\xf7\xe6\xdd\xa4\xb1\x13\xc2\xc9\xdb\x6b\x38\x52\x0f\x25\xbb\x8d\x94\x96\xed\xd6\x9f\x81\x3d\x3f\xdb\x10\x44\x4b\x9f\x97\xa7\x61\x9f\xa5\x9f\xc0\x81\x66\x8f\xc1\x51\x92\x00\x7f\x1e\xc1\xfd\x80\x1d\x51\x81\xe2\x43\xc8\xdb\x02\x49\x04\x73\x90\xf1\x52\x2e\xb6\xf9\xb0\xec\xe0\x58\xbb\x48\xb1\x78\x21\x66\xbb\x95\xf6\xe6\x0e\xff\xb5\x26\x4c\xc7\x00\xed\x72\x3b\xf2\x35\x5c\x92\xda\x21\x86\x5c\x67\xb4\x48\x76\x8b\xd8\x37\xeb\x9a\xfa\x4c\xbc\x50\x6d\xdb\x7a\x5f\xfd\x0e\xbc\xfa\x47\x02\xf3\x1c\xf2\x05\x55\x1d\x93\xdb\x9b\x93\xc3\xcd\x3d\x38\x1a\x51\x3d\x12\x00\xef\x02\x7e\x48\x44\x69\xb9\x18\x7c\x99\x4e\x4f\xc2\x8b\xf7\x3f\xad\x52\x76\xaf\x6c\x16\x71\xa4\x34\x99\xc5\x99\x2e\xc6\x70\x10\x6e\x51\x55\xe7\xbf\xc7\xb4\x68\xab\x61\xba\x48\xa8\xe3\x18\x6e\x5f\x2f\xa9\x67\x55\x1d\x04\x17\x7c\x78\xd5\x61\x8c\x45\xfd\xec\x7f\x33\xd3\xa0\x57\x04\x48\x53\x8d\x36\xd1\x6e\xe6\x18\xff\x3d\x3d\x63\x7a\xea\x9d\xfe\x6b\x2b\xfc\x72\x67\xb7\x8c\x5c\xcd\xbb\xbf\x75\x9f\x4d\xb3\xaf\x97\xec\xf9\x18\x62\x63\x76\xf7\xf8\xab\x2c\xa5\x8d\xc7\xd6\x88\xb5\x0d\xfe\xdd\x59\x93\x27\x84\xbd\x6a\x40\x5d\x5f\x24\x84\xbd\x4f\xd9\xc8\x19\xd1\xcc\xbf\xd1\xbc\xbd\x1b\xe5\x37\xdf\x54\x32\x59\xad\xbf\x3c\x77\x37\x52\xde\x79\xa1\x02\x38\x82\x46\x2b\x1c\xc4\xf0\xd0\xdb\x59\xda\xb3\x06\xf3\x22\xff\x98\x1c\xdf\xee\x76\x36\x10\xd2\xa8\x5b\x70\x47\x1b\xfa\xc9\xd5\xf0\x85\x42\x1e\xdc\x25\x41\x1b\x43\x43\x9f\x44\xfb\x1a\x99\xeb\x1d\x51\x3f\xcf\xb7\xf4\x55\x9f\x33\xfb\x60\x76\xe7\xe0\x3b\x9c\x7b\x51\xc4\xef\x8a\xd6\x10\xc6\x7f\x34\xde\xd7\xc3\xda\x74\x10\xb3\xda\x10\x72\x12\x38\xc4\x95\xd7\xe4\x83\x78\x19\x98\x55\x1d\x70\xf0\x27\xdf\xc3\x34\x3d\xe8\xbf\xb1\xba\xc9\x76\xaf\xab\xb1\x90\x88\x90\x1d\x93\xa6\x23\xe1\xec\xda\x91\x16\x99\x63\x65\x30\xf5\x65\xe3\xef\x1c\x64\xbc\xed\xb4\x74\xcb\xdb\x6f\x50\x23\x6a\xcd\
x6f\xcc\x1d\xa9\xce\xf3\xc0\x0f\x45\x70\x69\xf6\x5f\xd5\x5e\x21\x44\x38\xec\xc1\x2f\x7f\xe9\xff\x16\xe7\x3d\x32\xd1\x48\x57\x32\x2f\x13\x31\x55\x51\x7e\xf7\xa8\x52\xe7\x99\xc8\x3b\x83\x21\x29\x76\xd7\x82\x8e\x3d\x24\x2a\x7e\x08\x6b\x13\xd8\x8c\xe4\xfd\x72\xe5\xf0\xc0\xd8\xa0\xde\x10\x08\x8a\xd7\xa3\x29\x42\x3e\xb8\x2d\x24\xaa\x51\x16\xc8\x9b\x0e\xfe\xd7\xec\x70\x1d\x6e\x22\x18\xf7\x6f\xbb\x34\x9f\x43\x01\x00\xb1\xc8\x81\x12\xbb\xe7\x57\xf0\xeb\x87\x62\xa1\x02\xa7\xb8\x1d\x8e\x30\xb4\x88\x5d\x71\xee\x8d\x66\x46\xf1\xcc\x05\x56\x45\x38\xab\xed\x7d\xbc\x48\x0e\x78\x39\x0b\x27\xa4\x99\xc2\x49\x0e\x48\xcb\x7b\xda\x16\xbc\xde\x3f\xcf\xdd\x1d\x06\xb8\x07\x58\x1a\x35\xaf\x44\x27\x8d\x60\xb0\x18\xb6\x44\x30\x23\xa3\x9f\x20\x47\x0d\xd3\xae\x57\x92\x4e\x87\xc1\xf2\xc5\xd1\xac\xd0\x5b\x53\x83\x54\xc3\x61\xee\x0f\x1b\x20\x96\x3b\x2b\xe5\x17\x81\x1c\xf2\xce\xdf\x90\x13\xb5\xc5\xf4\xbc\x1b\x97\xee\x4b\xa1\x15\x0d\x44\xfa\x0f\xe0\x9c\x38\x99\xa6\xc5\xa5\x08\x16\x08\xbd\x41\x8e\x6a\x9d\xed\x38\xfa\xbe\xd6\xec\xf1\x75\x73\xf6\xeb\x5a\x46\x3e\xf5\x58\x82\x47\xc7\xff\x65\xd3\x3c\x3c\x1b\x7f\xef\x8f\x14\xc5\x96\xfa\x9d\xc0\xa3\xbd\xa8\x4d\xeb\xd7\x49\x34\x46\xc6\x8b\xd6\xd3\x20\x8c\xa7\x8f\x4f\xd8\xb4\xd0\xba\xb5\xb1\x54\x79\x55\x7e\xc9\x22\x16\x9e\xb2\x21\xeb\x42\x06\x5b\x57\xd1\x3c\xea\xe6\x32\x00\x91\x93\x8e\x7f\x1e\x7b\x46\x53\x7a\xcb\x9f\xbf\x95\x7b\x23\x60\xc5\x3b\xda\x77\xd8\xf4\x9f\x3c\x79\xd6\x2f\xe2\x12\xd8\x7c\x59\xb4\x7c\x5b\x24\x38\x4e\x51\xf0\x45\x65\xb0\xb0\xa6\x7d\x64\x48\xcb\xe0\x80\x74\xf3\x82\xc3\x08\x28\xae\xf3\xd6\xe6\x41\x4e\x6e\x64\xe8\x67\xf6\x3d\xb9\x1d\xea\x94\x10\x1f\xcf\x1e\xb4\x2c\xe0\xa4\x1a\x69\x72\x11\xbb\xbd\x7b\x5c\x8a\xff\x0c\x42\x65\x88\xde\x77\xad\xe9\xed\x98\xe5\x07\x3d\x6e\x91\x27\x05\x74\xa5\xd0\xad\x77\xd9\x2a\x13\x3d\x0c\xa0\x9c\x0f\x92\xae\xdc\x30\x02\x90\x1f\xf5\x65\xe6\x5f\xc5\x30\x5d\x67\xc3\xf1\xc8\x9c\x14\x1e\x08\xf7\x8b\xd9\x24\x74\x32\x87\x18\x80\x88\x48\xb7\xee\x21\xb8\xc5\x54\xe6\x91\x8e\x10\x53\x22\x7a\x20\x6d\xc8\xe4\xcc\xaf\xcf\xfb\
xbf\x5c\x06\x4b\x63\x1a\xc4\x2e\xb6\x6d\x53\x5d\xca\x4c\xd9\xbc\x1a\x4c\x35\x80\x3b\x1c\x07\x04\x14\xfd\x17\xcc\xd3\x0f\x17\x39\xfd\xa8\x51\x2c\x05\xac\x9c\xd9\x54\x9d\x51\xd9\xdc\x6b\xb8\x66\x09\xa1\x25\x0b\xe7\xe3\x5b\xc1\x6b\xdf\x24\xbd\x10\x33\xe2\x56\x38\x20\xa2\x32\x4d\x28\x8f\x5c\x89\x1a\xcb\x87\xaa\x4b\xd6\x19\x4f\x01\xa1\xc3\x09\x44\x89\x2b\x11\xe8\x2b\xcc\xc3\xba\x7b\x45\x1f\xfe\x87\x7e\xe6\x6b\xf5\x2b\x7a\x69\x01\x3e\xe5\xa7\x35\x39\x5a\x54\xf4\x45\x6d\x8e\x45\x4d\xdf\xc5\x8a\x13\x20\x38\xf7\x83\x5a\x08\x86\xb3\xb4\x45\x87\x62\x0b\xd1\x91\x55\xe6\xb1\x46\xcd\xea\x59\xf1\x55\x99\xe1\x0a\x88\x0a\xb8\x4f\x32\x20\xc9\xd7\xa0\x75\xf0\x71\xe2\xef\xfd\x25\x1e\x6a\x98\x46\x9e\xd9\x9a\xa8\x45\xcb\x50\xf6\xad\xea\xa8\xf5\x38\xe1\x44\xdb\x51\x07\xfe\x96\x35\xda\x98\x45\xaf\x61\x6a\x02\xd8\x97\xc8\x28\x5f\xfe\x80\x08\x31\xfa\xe8\x19\x1a\x82\xd0\xba\x49\x96\x1d\x13\x3a\xdd\xb2\x16\x6a\xc9\xb8\xbf\x51\x5c\x59\xe7\x49\xb5\xd7\x14\x8a\x69\x01\x88\x45\x3d\xea\x40\x14\xf7\x86\x61\x07\xba\x89\xe6\x47\x15\xf6\xaf\x11\xc8\x4a\x79\xd0\xba\x34\x72\x8e\x6a\x2b\x0e\xbf\x0a\xc2\xb3\xd5\xc3\x03\xb2\xa8\xa2\xca\xdf\x56\x5a\xb2\x7c\xc8\x94\x36\x63\x39\x10\xdd\xad\x8b\xad\x52\xb4\x47\x64\xf3\x0f\xfc\xf7\x06\x15\x0f\x0f\x12\xa7\x86\x03\xfc\x94\xcc\x19\x0a\x74\x80\x74\xe4\x48\xc2\x1c\xfb\xba\xbd\x4b\x25\x7d\x5f\x9b\x69\x51\x33\xa1\x12\x88\xa7\x84\xf0\x24\x4d\x74\x9a\x6c\x2c\x34\x3e\x8b\x80\xb2\xb9\x10\x38\x65\x63\x78\x42\xb6\x7e\xa3\xdb\x3b\x1a\x40\xc7\x77\xca\x4f\xad\x65\xde\x2b\x91\x95\xbd\xfe\xe2\x84\x7e\x1b\x52\x7a\x3c\x52\xb7\x5e\xd7\x44\xcd\xb6\xee\x2c\xb3\x86\xdb\x1c\x45\x33\xc3\x41\xbb\xb4\x90\x6d\x9c\x4f\x3d\x94\x50\xf9\x81\x98\x5f\xeb\xf6\x87\x57\xfd\x03\x57\x73\xe8\x65\x0c\x9b\xcd\xba\x52\xa9\x40\x9d\xb0\xa3\xd3\x42\x7e\x1f\x7a\xc1\xff\x97\x3a\x2c\xb1\x3c\x99\xd0\x73\x3c\x9c\x8c\x16\x5c\x6b\xc0\x51\xf9\x82\x91\x6d\x63\xcf\xcc\x0f\x12\x11\x38\x44\x9a\xa9\xfa\x67\x48\xf7\xf0\xfe\x25\x06\x11\x0d\x98\x99\x11\x9e\xa5\xe4\x0c\x15\x4d\x5c\x07\x72\x6a\xde\xb7\xf9\xd5\x7a\x65\x9d\x76\
x18\xdc\x90\x93\x26\x93\xfb\x7c\x6b\x16\x64\xe9\x31\x1d\xd7\x39\xc9\x87\x3e\xf6\xf0\xf1\x1c\x61\xf9\xc3\xa0\x7f\x3e\x60\xa2\x68\x97\x53\xf6\x79\x3b\x78\x82\xea\xb7\xf8\x6c\x81\xc9\x82\x7f\xbb\x08\x55\x10\x57\x84\xe0\x8f\x79\xc6\xb5\xb2\x14\x31\xe6\x20\x6a\x8b\x5b\xb2\x3d\xcf\x17\xa7\xae\xf5\x62\x27\x94\x88\x77\x97\x4d\xae\x36\x46\x13\xce\x6c\x84\xd3\x13\x9d\x8c\xd8\xcb\xe1\xba\x3f\x37\x6c\xbc\x34\xcf\xaf\xf8\x01\xb3\x47\xda\x7d\x82\xd8\xf4\x65\x65\x1a\xc9\x73\x19\xab\xc9\x89\xe4\xe7\x0d\x09\x5a\x8e\xe2\x20\x4f\x22\xcb\xc0\x02\x7f\x45\x77\xca\xbe\x8f\xfb\x5f\xbf\x57\x5d\x0d\x53\x3a\x05\x0e\x7d\xcc\x28\xcd\xae\x3f\x52\x4e\xad\xe6\x6b\x67\x25\x18\x7f\xed\xf1\xde\x53\x3b\x83\xad\xe9\x63\xd1\xa1\x38\xc0\x8d\x7d\x63\x3b\xd2\x54\xd2\x2c\x62\x37\xff\x27\xbe\x74\xdd\x33\x9c\x59\xd5\xea\x8c\x52\x21\xa2\x67\x4c\x95\x47\x7c\x90\x37\xb7\x2a\x23\x05\x6b\x99\x2c\x87\x0a\x9a\xf2\x08\x6c\xcc\xac\x92\xa6\xcd\x6e\xcb\xc6\xd3\xac\xea\x09\xdb\x9f\xc6\xaf\x21\x65\xf1\x9c\x2e\xa1\xed\x64\x3a\x23\x5c\x62\x6f\xe1\xea\xfa\x1f\x69\x89\xf5\x4e\x4a\x07\x80\xb4\x22\x02\xbb\x34\x5d\x71\xcb\x6f\x6e\xcd\x93\x52\xd2\xb2\x12\x02\x69\xa2\xce\xcc\x5c\x12\x16\xad\xd1\xb1\x04\x15\x6b\x63\x2d\x52\x05\x51\xbd\x7e\x40\xd6\x29\x8c\x86\x2e\x10\x4f\x85\xde\xab\x4b\x01\x51\x3f\xdc\xab\x6e\x79\xe7\xff\x9b\x85\x87\xa9\x9e\xd1\x20\x2d\xa1\x28\x67\x86\x3d\x31\x98\x57\x29\xf0\x5a\x17\xf5\x3f\xb2\xca\x3c\xb1\xbb\x24\x34\xc0\x31\x77\x42\x12\x4c\xb8\x74\x46\x41\xf7\xd3\xea\xbd\x68\xb7\xdf\x9c\xac\x2f\x5d\x31\xa2\xee\xdd\xe6\xde\x27\xd8\xc9\x5c\xc5\xb7\xb8\xda\x69\x35\xde\x7c\x37\x81\x1b\x02\xd7\x54\x87\xdf\xad\x14\x18\x89\xfe\xcd\x95\x52\x0e\x5b\xcd\x5c\xad\x21\x2a\x89\x13\x0d\xc5\x91\xe7\x42\x65\xec\x19\xd7\xfa\x03\x86\xee\xeb\xa9\x51\xe7\xdf\x66\xf7\x69\x2e\x4c\xb1\x51\xec\xdf\x24\xdc\xa2\xf9\x3f\x6f\x74\x6e\xce\x0f\xfe\x64\xff\x5d\xf7\x3c\x6e\x49\xc4\x30\x12\xd9\x4a\xfe\x65\xce\x13\xba\x67\x69\x33\x2b\xc6\xc6\x42\xfa\xec\xfe\x3f\x1e\x6f\x05\x3e\x7a\xfa\xfc\xb5\x7c\xf9\xd8\x28\x68\x40\x8c\xb5\x6d\xee\xf6\x7b\x25\xe8\
xd6\xd8\x1d\x2d\x0e\xb7\x53\x8d\xe1\xb9\x53\xe3\xd3\x77\x32\x57\x7c\x8c\x57\x63\x2f\x06\x4f\x96\x72\x1e\x5a\x1b\x0c\xd1\xd8\xa7\xdc\xf0\xa5\x71\x86\x8a\x77\x06\x97\x0d\x23\x88\x7e\x2c\x01\x3f\x5e\xbf\xb0\x59\x74\x56\x02\x61\x22\x2f\xa1\x6d\x2b\xd4\x3f\xe6\x5d\x59\x37\x99\x3a\xb0\x98\x33\x0f\x20\x51\xf0\xdc\x52\x2f\x95\x67\x02\x81\xbd\xe9\xd9\x91\x12\x6c\x94\x09\x36\x56\xec\x20\x09\xd6\x1e\x50\x5c\x02\xac\x9b\xc2\x09\xee\x33\x47\x5a\xeb\x36\x97\x01\xe7\xa6\xf0\x88\x1c\xf4\x29\x2a\xb3\xd1\xcc\x4e\x81\xde\xdc\x9c\x82\x34\x73\x19\x35\x31\xca\x40\x6d\xd2\x02\x94\xd7\x34\x0d\xc9\xe6\x8b\xbf\xfd\xa4\x95\x0a\x4c\x9f\x5b\xf4\x8d\x27\xa2\x50\xa6\x6f\x6e\x39\x74\xdd\x7a\x08\xd7\x69\x21\x63\xc8\x47\x49\x04\xdd\x56\x47\x6a\x64\x68\x73\xad\xfa\x1d\x9e\xa1\x22\xe4\x83\x19\xa0\xfc\xd3\x1f\xea\xf7\x16\xe3\x8b\x20\xce\xeb\xb0\xbe\x31\x9d\x32\x62\x3f\x91\xfc\xfd\xde\x86\x96\xbb\x44\x11\xf4\x4e\x9a\xa5\x2c\x00\x30\x9d\x0b\xa5\x03\x12\x7c\x40\x71\xa3\x6f\xf7\x43\x7c\xdb\x12\xf2\x96\x44\x58\x2e\x28\x2f\x5e\x8f\x7e\x1a\xe1\x9c\x5e\xe3\xe8\x91\x7d\x64\xbf\xb4\xf8\xa4\x10\xd1\x81\x5e\x27\x43\xfa\x74\xe2\xbe\xf5\x90\x0a\x92\x26\x3e\x46\x7a\x1b\x8b\x1e\x73\x1b\x7a\x0b\x10\x20\xac\xce\xce\xf6\xbb\xb3\x3a\xc7\x9c\x26\x23\x74\xf1\x93\x8f\xef\x97\x72\x41\x59\xfc\xe6\xd8\x1b\x67\xd5\x07\x68\xcd\x8a\x81\xb9\xec\xcb\xfd\xfc\x7b\xa7\x53\x5a\xf1\x09\x04\x8b\x07\xfa\x05\xe8\x6f\xaa\x70\xec\x5a\x61\x7a\x0a\x8c\x0a\x6c\xd2\xf5\x15\xd9\x65\xb6\x24\xa7\x7c\x87\x4c\x54\x94\xa4\xb8\x5b\x8f\x67\xc6\x3c\xc4\xb1\x34\x07\x56\x3e\x76\x65\x2f\x5e\x97\xb9\x8b\xd3\x31\xd0\xfa\x16\x39\x95\x2b\x2e\x9b\x84\xc4\x9a\x89\x60\xea\xda\x80\xa1\x8f\x2d\x05\xa2\x62\x61\xb6\xb0\x09\xac\x1f\xdc\x53\x2e\x15\xcc\xd3\x52\x6f\xb8\x3b\xa2\xfc\x64\x14\xce\x3a\x02\xab\x64\x21\x2d\xa3\xc8\x44\x0d\x4f\xcc\x37\xc0\xdc\xaa\x45\x7e\x8b\x06\xe9\x28\x8a\x05\x73\x6a\x80\xdd\x8e\x4d\xf4\x5a\xd2\x3d\x7c\x6b\xeb\xf9\xfd\x47\xe0\xdb\xb3\xbb\x60\xbe\x1f\x38\x13\x13\x0b\xbf\x8d\x2c\xda\xa2\xc1\x77\x35\xfd\xbb\x6a\xc7\x05\x8e\x8a\xa2\x6b\x3f\x08\x41\
x1c\x7c\xa3\xa4\xff\x11\x09\x8d\x1e\x5e\xfc\x60\x2b\xd6\xc3\xc1\x4e\x73\x6c\x34\xfd\x36\x97\x51\x49\xc9\xfa\xa6\xd7\x1d\x35\x13\x5b\xfe\xa1\x12\x92\x95\x6e\x57\x5b\x46\x9c\xda\xa7\xce\x0d\x09\xe8\xb0\xa1\x42\xb6\xdb\x55\x01\x42\x47\x24\x38\xda\x39\x81\xae\x85\xbd\x91\xcf\x5c\x96\x23\x54\x1b\xe6\xe0\xf5\x26\xdd\xf5\x97\x6b\xa7\xce\x09\xeb\xe4\xb1\xe1\x02\xff\x19\xf2\xe3\x59\x1a\x17\xa2\xb3\x3a\x9e\x54\x62\x67\xad\x41\xa0\x91\x32\xc4\xce\xf0\x4c\x91\xe6\xd1\x89\x8e\x34\x2e\xc6\x9a\xa9\x96\xb4\x7a\xbf\xcc\x1e\xf3\x02\xad\xed\x23\x1b\x4c\xe1\x8d\x52\x04\x57\xde\x63\xdf\x9b\xb5\x07\x67\xce\x7b\xb7\x13\x21\xc8\xf0\x31\xeb\xf5\x9a\x5e\xc8\xe2\x2c\x2f\x44\x19\x19\x64\x99\x5a\x64\x39\xee\x5e\xe9\x99\x9e\x12\x47\x4f\xb9\xcc\x4f\xdc\x1e\xd2\xa9\x46\x6b\x68\xdb\x10\x64\x3e\x14\xf4\x2c\xb0\x36\x6d\xae\x43\xdd\x0d\x37\x04\x05\xe2\x70\x19\xca\x2c\x83\x99\xfc\xe5\x40\xe5\x94\xdf\x5a\xe6\x3b\x7e\xab\x57\xd3\x2f\x23\x1f\x4c\x5a\x6b\x6e\xd4\x6b\x5c\xe0\x77\xd0\x9b\x49\x28\x17\xec\xc0\x4c\x9f\x37\xa5\x0b\x08\x18\xbd\x25\xae\xab\x79\x01\x84\xbd\x2a\x37\xd0\x98\x6b\xc6\x3f\xa3\x54\x94\xe4\x5a\x6b\x51\x9e\xf1\x6f\x7c\x9e\x63\xe1\xc0\xcf\xf9\xfd\x3e\x13\x62\x99\x17\xa0\x06\x23\x60\xd7\x25\x64\xf8\x23\xcb\xde\xd7\x9e\x84\x92\x35\x9a\x6e\x9a\x69\x35\x65\x6e\xe3\x87\xef\x58\x47\xed\xc2\x60\xce\x6b\xd5\xe8\xbd\xec\xa2\xde\x93\x58\xe8\x11\x05\xfe\x79\x6c\x02\x66\xe5\x4b\xa3\x28\xd8\x73\xc1\xf6\x83\x1a\x48\x85\x54\x33\xb9\xff\x4f\x0b\x8a\xee\x02\xa9\x4a\x95\x85\x4c\x9f\x1f\x7f\xb4\xa0\xec\xd5\x2a\x51\x7b\xf8\x57\x22\xa5\x3b\x51\x75\xbc\xbb\xf7\x47\x0c\x96\x51\xed\x21\x0d\x1c\x48\x9e\xa4\x86\x1f\xb4\xf6\x8a\xdf\xe6\xbe\x3c\xec\x58\x1f\x49\x95\xe5\x1c\xc2\x1b\x7a\x90\xd0\x19\x84\x51\x26\xb5\x44\xde\x83\x83\x5d\xe0\x1b\xff\xaf\xe5\x62\xe5\x03\xde\x3e\x7f\x16\x6f\x69\x76\xa2\xdc\xfd\x5e\xea\xf0\xc6\x7d\xf1\xa3\xd9\x92\x6b\x99\x00\xcb\xe5\x85\x3a\x93\xd4\x1c\x46\x8a\x60\x82\x5e\x75\xdd\xf1\xf5\x73\x5a\x29\x23\x86\xa7\x41\x5e\x21\xd9\x6a\x88\x9d\x26\x13\xec\x44\x79\xc5\x7a\x76\x13\x94\xe4\x88\x02\x62\
xa5\x08\x1d\x25\xae\xf0\x3a\x0b\x01\x16\x8f\xbe\x7f\xcb\x71\x22\x26\x46\x03\x07\x67\x51\x5e\x49\x88\xb6\x2a\xa4\x5f\x6a\x88\x4f\x79\x19\x2d\xc2\x1b\x99\xe1\xbe\xf6\xa7\x36\xf4\xcc\xca\x8c\x09\x6d\x6a\x7a\x3e\x3a\x16\xfe\x3a\x6f\x91\x9a\xe9\x99\xd7\xef\xae\x49\x0e\x5f\x68\x4e\x52\x92\xea\x9d\xe9\xf0\xf8\xd5\x93\x0d\x5a\xa1\x01\x6d\x07\x97\x2d\xe1\x29\x07\x93\x7a\xeb\x13\x34\x06\xbf\xc3\xe4\xee\xd6\xb1\xd7\x05\xd0\x9e\x9a\x94\x0f\xf5\x0f\xa0\xa0\x19\x1a\xe6\xe7\x50\x29\xfe\x0d\xfc\x76\xd4\x61\xb7\x30\x24\x47\x7d\x7d\x8a\x78\xff\x90\x3c\x64\x60\xef\x8f\xa2\xa2\x09\xd2\x85\xdb\x04\x97\xe1\x95\xbd\xcf\xa9\xb0\x9b\x3a\x8f\x29\x8b\x8d\x93\x75\x15\x65\xeb\x6b\xae\x0e\x47\x6a\xff\xbe\x44\x20\xa4\x8f\x8f\xa1\x7c\xa7\x20\x02\x66\xee\x8e\x5d\xb5\xdb\x82\xbe\xbb\xe4\xbe\xe5\x5d\x10\xe3\x81\x0b\x62\x4c\xa0\xdb\xab\x7c\xab\x92\x67\x3f\x5c\x30\x6a\xe7\xee\x5b\x5e\x36\x0d\xa7\x82\xd2\x0f\xa7\x6e\x31\x19\x81\x53\xe8\xff\xe5\x7e\x0a\x55\x90\xbe\x2e\x77\xaf\x10\xa7\xc2\x2a\x12\x86\xd1\x9f\xcc\x03\x5c\xc7\x9b\x9b\x12\xef\x98\xc3\xfe\xa8\xbe\xc5\xeb\x8c\x2d\x36\xeb\xf9\x51\x9b\xda\xe5\xa3\x1a\x41\xd7\x6b\xd8\xb4\x0b\xa5\xa7\x16\xf3\x6d\x87\x10\xa6\x56\x9e\xcb\x89\x7d\x6b\xae\xfc\x29\x51\x77\x12\x9a\x64\x99\x13\x1d\xee\xfa\x76\x6f\xf3\x67\xf8\x9b\x38\x82\x57\xa6\xa2\x34\xa7\x16\xee\x80\x42\x5b\xe6\x81\xf6\x3f\x88\xf8\x5a\xe1\xad\xd1\x47\xe4\xf5\x0b\xf2\x24\x96\x1d\x77\x00\xc4\x87\xb7\x51\xfb\xd4\xc7\x1c\xa4\x99\x88\x72\x6c\x80\x6f\xd5\x59\x4f\x84\x98\xdb\x38\xc6\xaa\x6c\xb6\xcd\x2e\x7d\xb2\x4f\x25\x04\xab\x6c\x30\x5e\x5f\x2d\x9e\x60\x96\xca\x5b\xe4\xc7\x76\xc6\x91\x82\xc0\x67\x99\xc5\xeb\x53\x94\x6b\x2b\x06\xe8\x9e\x0a\x5b\xbb\x2f\xb5\xbb\x95\xb9\xae\x78\xbd\xd2\xc7\xf6\x4d\x18\xac\xc6\x6a\x96\x94\xde\x13\x9f\x71\xc4\xa4\xc7\x36\xdb\x67\xec\x61\xf0\xb2\x36\xc4\x26\xc4\xe2\xc5\xc0\x3c\x5f\x01\x01\xdd\x8f\x0b\xa3\x94\x62\x88\x5e\xe0\x11\xe1\x3a\x00\xaf\xcc\x35\xd6\xcc\x8d\xa4\x73\xa2\xa2\x62\xd8\x1f\x4f\xc2\xb7\xb2\x45\x59\xac\x0d\x9d\x53\x5e\xa4\x03\xb5\x29\xda\x3a\xaf\xe6\x04\xab\x5f\x39\
xb9\xf2\xc6\x26\x38\xf2\x54\x8e\x1b\xb3\x79\xcb\x6a\x8d\xff\x7f\x48\x57\x6a\x6d\x70\x20\xea\x80\x45\xf8\x4d\x97\xe5\xfd\x6b\x57\xc1\x5f\xd7\x25\x6f\x4d\xb9\x13\x57\x13\x2d\x91\xe3\x3c\xe7\xfb\x47\x75\x29\x67\xd3\x13\x7d\xe5\xb7\x99\xe1\x69\x9d\xbd\x04\xee\xf9\xed\x05\xa3\xf5\x2f\xd3\x88\x92\x7c\x50\xf3\x43\x5d\xbe\x66\xb6\x30\xb7\x4d\x52\xf0\x8e\xa8\x31\xa5\x31\x21\x0e\x05\x26\x22\xa3\x40\x04\xe9\x59\xdb\xa1\x6a\x57\x71\x25\x01\xa2\xbe\x55\xc2\xaa\x33\xc7\x96\xea\x97\x43\x72\x9d\x6e\xff\x09\x9d\x35\x29\x69\xf4\x0f\xaa\xff\x71\x3f\xa0\x50\x1e\xb3\x76\x53\x25\x62\x24\xc8\x6a\xea\xd2\x8d\xf2\x53\xbd\x40\xdc\xad\xc0\xcb\x57\x91\x89\x73\x4d\x56\x2b\xaa\x71\x7b\x69\xd0\xf4\x9b\xdb\x93\x7b\x9e\x48\xf1\x75\x1c\x36\x7c\xee\xeb\x2d\x1c\xa6\xd4\x35\x7b\x9d\xe9\x73\x9a\x41\x07\x3a\x97\x41\x2b\xb6\x66\x1d\x0c\xc1\x96\x71\xaa\xd5\x75\x45\x91\x8b\x16\x6e\x85\xc6\xc3\x6b\xac\x7e\x0b\x61\x90\xfe\x05\x56\x40\x43\x31\xf8\x09\x84\xe5\x89\xc9\x52\x20\xfc\x2f\x7d\x32\x8a\x22\x18\xca\xcc\x5c\x72\x5f\x93\x1a\x61\x3c\xf1\x44\x78\x77\xb7\xb1\xdb\x34\x87\x18\x2a\x6d\x26\x44\x94\x23\xd4\x5b\x36\xfc\x46\x9a\x45\x51\xa4\xc4\x4d\x72\x99\x08\x33\x09\x55\x4f\xde\x19\xb8\xcd\x08\x54\x77\x05\x41\x44\x4c\x63\xb6\xd3\xc8\x6b\xd9\xbc\x13\x20\xb8\x76\x1b\x2b\xbb\x88\xeb\xf8\x60\xb6\xf2\x72\x0b\x88\x01\xc6\x40\x07\x72\xc0\xd7\xd9\xb1\x64\x4e\xc1\xdd\xdb\x8e\xca\x36\x33\xef\x40\xb1\xb2\x70\xb3\x1f\x48\xa3\x8c\x48\xec\x0c\xc3\x5d\xdc\x27\xe7\xd9\x94\x75\x5f\x24\x16\x16\x07\xae\x63\x00\x6a\x59\x00\x9d\xf1\x02\xfc\x79\x78\x3e\xfb\x6c\xd3\x9c\x88\x0d\x28\x73\xc0\xc7\x2d\x92\xd0\xf1\xf4\xa6\x60\x69\x9f\xf3\x8e\x2c\x38\xd9\x3c\x9e\x8a\x11\xdb\x97\xbc\x3d\xb3\xa4\xe6\x06\xf4\x43\xf9\xba\x9f\x0f\xa1\x4f\x1c\x4d\xfe\x76\x25\xc2\xc3\x96\x65\xb4\x84\x55\x4f\xac\x92\x57\x13\x14\xac\xcf\x43\x3e\x7e\x76\xe0\xfa\xa6\x5c\xe4\x34\x02\x0e\x0b\xe5\x93\x7a\xac\xac\xec\x9b\xd5\x5d\x0d\x07\x76\x6d\x5f\x00\x26\x7e\xd7\x2f\x2d\xb6\x3e\x59\xed\xa4\xcb\x3e\x84\xb4\x2b\x28\x71\xdb\xe5\xa5\xdb\xae\x74\x40\xa1\xb4\xe4\xac\xd4\xd5\x59\
x2a\xf1\xf8\x72\x99\x34\x74\xff\xdf\x6e\x8e\xdc\x28\xb8\xc1\x43\x8b\x21\xc7\xfe\xea\xd5\x48\x15\x60\xf7\xcf\x29\x7a\xac\x98\xf1\xfd\x31\x35\xa7\xa6\xf8\x7a\xee\x35\x08\x45\xe6\xe7\x53\xf4\xb1\xc5\xa6\x9d\x36\xee\x68\x2e\xa0\x49\xc1\x9d\x98\x40\xe8\x4f\x3b\xdf\x72\xbb\x21\x28\xf4\x7a\xc1\x3d\x38\x58\x79\x55\x09\x60\xbd\x21\x19\xd5\x66\xb7\xbf\xd1\x0a\x23\xed\xdf\x5f\xb7\x36\x12\x1f\xd4\x63\x3e\xf2\xf1\x5f\x7b\x9e\xb9\x44\x4f\xa4\x66\xfd\x9c\x57\x06\x3f\x3f\xe4\x5f\x5e\x6f\xe3\x45\x94\x77\x82\x9c\xec\xc9\x7c\xd9\x12\x75\xcc\x4f\xec\x44\x85\x5c\x72\x05\xa2\x34\x30\x8d\x42\x33\xcb\x74\x08\xc3\xf6\xb8\xc2\x8e\xb3\x11\x7f\x5b\x3d\x86\x43\xd2\xc1\x00\x07\x19\x0c\x52\x53\x1a\x7d\xca\x54\x48\x18\x1d\x87\x55\x0a\x11\xcf\xae\x10\xd2\xb6\xef\xd1\x74\x2e\xcc\xc7\x9e\x85\xf4\x68\x16\xc0\x4a\x8d\x6e\x51\x4d\x9c\x7e\x2d\xe3\x00\x0e\x35\x02\x09\x7b\xa1\x66\xf1\x08\x07\x29\x73\xd1\x0e\xf0\xc7\x29\xa0\xaa\x5f\x17\xb0\x3c\xfc\x69\x06\x8c\x0a\x28\xbb\x16\xc0\xe5\x5d\xc2\x27\xad\x67\x3b\xef\xb2\x47\xfa\xe6\xc6\xe8\x49\x33\x5b\x37\xb5\x29\x62\x6a\x30\xe1\xd7\xc2\x65\x77\x86\xf9\xeb\xab\x20\x67\x47\xb7\xb3\x35\x2d\x1b\x2b\x4c\x80\x63\x11\x17\x14\xba\x97\x24\xa7\x1e\x15\xa0\x73\xe3\x8a\x30\x94\x16\x4b\xd0\x1c\x90\xad\xa4\xc3\x0d\xd3\x28\xfb\x62\x0a\x61\xc1\x98\xe0\x21\xde\xb8\x92\x72\x57\x1e\xc6\x4d\x02\x37\x7b\xef\x9e\x95\xf1\x10\x1e\x67\xe5\xc5\xa0\x4a\x88\x13\x5c\x51\x94\x0b\xcd\xaf\xcf\x15\x59\x3c\x5e\x49\x03\xfd\x8b\x14\x7a\xbb\x8e\x0d\xa3\xa7\xbf\x4b\xe0\xee\x4e\x6a\x37\x8b\x1c\x7b\x7b\x47\x78\xe1\xb3\x95\xa4\xf2\xcd\xc6\xcd\xd5\x5c\xff\xe5\xec\x43\x32\x9b\x91\xfe\xb7\xf1\x34\x3a\x3e\x47\x5a\xd6\x5f\xa9\xca\xa7\x2a\xd2\xc3\x02\x29\x16\x3f\x68\x0a\x70\x20\xdf\x71\x7c\x9f\x23\x2d\x0b\xac\x9f\xbf\x27\x89\x9c\x62\x3c\x0a\xe0\x65\x07\x6d\xba\x35\xae\x7f\xef\x2c\xcb\xfb\x7e\x5d\x14\x2a\xbe\x92\x74\x25\xa1\xfe\x5b\xff\x31\x7d\xf5\xcd\x62\x06\xc0\xc8\xaf\xf9\x43\xeb\x3e\xdd\xc7\x9b\x10\xb3\x24\x9a\x5d\x2a\x39\x03\xf7\x89\x40\xf6\x26\xe6\x97\xe5\x58\x14\x9e\xd3\x7c\x97\x4b\x24\x12\x9f\xdf\x75\
x3d\x51\xdf\x95\x0b\x25\x79\x2c\xc6\x1b\x48\xb1\x29\xf9\x17\x99\x0a\x0c\x50\xb8\x4b\xab\xc8\xf2\xb4\x62\xa6\x8a\x3e\x49\x89\x7d\xf9\x1e\xf7\x9e\x80\xaf\x8d\xc2\x5e\x2c\x17\xae\x06\x0d\xc8\xf8\xdf\xf7\xbc\xea\x58\xc7\xcb\xd3\xc5\x08\xe9\x13\xcf\xf5\xb2\xf4\x59\x03\x78\x2d\x2c\x5c\x8d\x72\xed\x32\x4f\x53\xdd\x18\x9a\xc5\xa0\xfa\x94\x58\x0e\xcd\xae\xdb\x9d\x02\x21\x1d\x33\xcd\x74\x0f\x22\x3a\xf3\x1f\x96\x4f\xcc\x5a\x19\x94\xf3\x48\x28\x6c\xd1\xbf\x05\xc9\x36\x9e\x95\x9f\x61\xe4\xac\xa6\x36\x66\x0d\xc6\xaf\x1c\xd5\x05\xbb\x72\xc6\xce\xff\xb8\xc4\x2a\x7d\xbe\xfa\x96\xae\x10\xda\x6d\xea\x61\x53\xf3\x01\x45\x48\x80\xee\xf7\x5f\x0d\xa8\x92\xe7\xac\xee\xd6\x0f\x3a\x03\xb9\xef\xcc\x94\x9c\x47\xb2\x2f\xa9\x76\xd0\x6e\x8c\xf0\xd2\xf7\x77\x0e\x7d\x57\xf2\x15\xc2\x8c\xdc\xc0\x88\x67\x54\x0f\x0c\x2d\x2b\x79\xa1\x71\xcc\xb5\x60\xcd\x28\x63\xba\xdc\x3a\x46\x33\xac\xac\x96\x6f\xa2\x1b\x3a\x6a\xd6\xed\x10\xac\x81\x8c\x72\xd2\x5d\x73\xf0\xfb\xbc\x70\x4c\x8d\x8a\x48\xd7\xc8\x71\x1c\xe3\xcd\x63\xdb\x93\xf1\xe8\x31\xeb\xb0\x86\xf4\x76\x15\xe8\xce\x91\xb5\xfd\xc2\x65\xe4\xa5\xe0\xe5\x8d\xb2\xb0\x28\xb2\xae\x2a\x1a\x58\xca\x34\xdd\x58\xe0\x5a\x88\x63\x87\xfc\xe3\xf3\x2e\x59\x28\x45\x0e\x77\xac\xb5\xe8\xb2\x6a\x90\xf7\x97\x8e\x71\x6e\xe9\x22\x59\x29\xf0\xd2\xf3\x1c\x7b\x14\xa8\x72\xab\x59\xf8\x78\xf3\xf9\x42\xb2\x3d\x50\xd0\x3a\xf6\x24\x7e\xeb\xc9\x4e\xb7\x61\x82\x3b\x2a\x49\x8f\x8f\xcc\x05\x51\xa4\xb4\x9e\x53\x4f\x62\xef\x88\xe0\x20\x76\xbc\x1d\x82\xc1\xe3\x8d\x0d\xb9\x05\x18\x0d\x90\x8b\x7a\x85\x3d\xb9\xe4\x92\xfa\xc6\x3e\xa8\x41\x35\x84\x4c\xa6\x1f\x35\x71\xaf\x54\xd1\x67\x37\x60\x7f\xb2\xb8\x34\xde\xab\x9d\xcf\x4f\x02\x51\x9a\x84\xc8\xf0\xc9\x84\x1f\x62\x19\xc0\xd1\x2c\xb3\xa1\x05\xc6\xfa\x30\xf0\x53\xee\x3e\xa7\x34\x59\x38\xec\x6a\xbc\xf3\x97\x22\x98\xa4\x9a\x39\x64\x91\xa6\xb7\x67\xe8\xbd\xd6\xf7\x35\xd0\x42\x1a\x64\x7b\xae\x66\x6f\x61\x99\xc8\x5b\xf1\xe8\x89\xa3\x31\x01\xfd\x53\x6d\x78\xed\x29\x97\x43\x77\x25\x7d\x56\x19\xa8\x4e\xfe\xd9\x46\x61\xfc\xbb\x9b\x25\x8a\xb9\x52\x06\x9c\x78\
x03\xdf\x0b\xe7\xb4\xc7\xed\x76\x35\x56\xbd\x10\xe2\xf6\x1a\xe3\x7c\x41\x05\xb4\xbb\x0a\x9a\x43\x75\xdd\xc8\x9a\xb3\x25\xa9\xc5\x85\xdb\x3c\xc7\x47\xa5\xeb\x07\x15\x32\x0e\x1d\x0c\x68\x06\xca\x28\x12\x78\xb3\x34\xaa\x9d\x48\x1b\x9c\x30\xed\x52\xc3\x4d\x0d\x4d\xf1\xde\x1d\x43\x2b\xcd\x43\x8c\x37\xbb\xa4\x88\x57\x6f\x11\xcb\x71\xea\xbf\x68\x1d\x1d\xa0\x7f\x4d\x6b\xa1\x4d\x02\xf1\xb6\x6d\x18\x9d\xe1\x68\x9c\xe1\xac\x8d\x25\x64\x1e\x7c\xd9\x5d\x01\x44\xa1\xf4\x24\x31\x25\x74\x4e\xa3\x86\x26\xdd\xcd\x36\x27\xe7\xb5\x97\xd5\xbd\xbb\x8f\x52\x3e\x29\xed\xa3\x3f\x88\xb4\x48\xc9\x5b\xd3\x1b\xdf\x05\x41\x0a\xe2\x97\xa9\x87\x9e\xb8\x85\xdf\xef\x63\x1e\x75\xff\xf3\xf7\xe4\xbe\xc6\x7b\x6d\x68\x22\xa0\xef\x57\x6e\xb0\x83\x77\xad\x58\x82\xca\x87\xa5\x22\xc0\xe9\x65\x12\x78\x42\xa2\x44\xc3\x8a\x3c\x59\xbd\x9b\x35\x83\x52\x94\x0d\x45\xe2\x3c\x9f\x50\x07\xcf\x5c\x3b\x36\x11\xac\x22\xcc\x13\x2e\x3d\x5a\x21\x6c\xc5\x2e\xea\x99\x70\x5e\x4d\x07\x58\xf4\xdb\x5a\xc5\xbe\x06\x0f\x5d\x30\x8b\xd0\x15\xb0\x3b\x84\xb2\x85\x4f\xe5\xe9\x64\xf5\xa1\x33\x12\xb5\xed\x1b\xc8\xf8\x89\xf8\x34\xf8\x10\x43\x9d\xf5\x76\x0b\x52\xf0\x92\x05\xdc\xa5\xee\x57\x70\x86\x54\x2f\x7a\x42\xaa\xaf\xba\xb9\x49\x0b\x65\xc4\x56\x6c\x55\xfd\x7d\x09\x5a\x59\x1c\xe8\xc7\xea\x5c\x47\x5f\xa9\x49\xb1\xa0\x55\x39\xa6\xe3\x66\xd3\xc7\xa3\xef\x0e\xd4\xb5\x3e\x81\x99\x36\x79\x77\xcc\x28\xaa\xd9\xdd\xd1\xbf\x68\xca\x3b\x73\xea\xb7\x4b\x76\xcb\x5d\xe9\x25\xe7\xde\x4b\x6e\x11\x2f\x49\x05\x6a\x11\x4f\x0e\xe7\x1a\x98\xc0\x4c\x24\x9f\xb8\x7b\xe9\x84\x82\x77\x6c\x9f\x19\x05\xc1\x28\x21\x8c\x67\x9d\x2f\x61\xbb\xc5\x93\x55\x1c\xaa\x6f\x5d\x89\xba\xd7\x63\x19\x10\xe6\x56\xe1\xa6\x55\xf9\xbb\xff\x9c\x36\xac\x55\x51\x95\xb3\x30\x65\xb6\x50\xf8\x1a\x92\x5d\xe0\x67\xa7\xcf\xb0\x5b\xb3\xdd\x52\x1a\xad\xa6\xdc\xcd\x8b\xb4\x8d\xfa\x78\x02\x5e\x0e\x97\x12\x25\xf3\x11\xd7\x76\x3b\xb0\x60\xd9\x04\x79\x5f\xaf\x1f\x46\x85\x6f\x6a\x76\xfa\xcf\xeb\xb0\x52\xed\x68\xcd\x14\xb0\x4f\x82\x19\x6d\x5b\x95\xc8\xf9\xd1\xb9\xf5\xf9\x69\x93\x44\xd5\xa8\x1f\xb9\x19\
x16\x5b\x0d\xa0\xf3\x30\x6e\xa7\x54\x1a\x2d\xee\x97\xe7\x2e\xcb\x4c\x7c\x02\x22\xb0\xf5\x91\xcf\xb4\xe4\x6b\x55\x24\x4b\x59\x61\x8f\xae\x69\x33\x5a\x98\x95\x01\xdd\x8c\xe0\xbe\x30\x9c\x35\x1a\x10\x61\x69\x0d\xbf\x39\xcc\x78\xdc\x94\xd7\xa3\xed\x04\x52\x24\x88\x0b\x65\xed\x6b\x44\xba\x03\x6e\xc8\x38\x1f\x25\x9f\xbe\xc4\xad\x73\x2a\x47\x6d\x19\x15\x59\x47\x13\x5a\x69\xf5\x44\xaa\x6e\xc9\x27\x51\xb7\x5d\xee\xc5\x5b\xe5\xba\xb5\x8c\xc5\xd4\x72\xa3\x63\x61\xd7\xf2\xbe\x0f\x95\x58\x73\xf3\xfc\x09\xad\x7d\xad\x7e\xcf\x6c\xe4\xef\x65\x4e\xd2\x87\xb4\x71\x36\xe0\x29\xb2\x2a\x0a\x85\xfc\x08\xd5\x63\xad\xe7\x5b\xc1\x7f\xec\xf2\x4f\xaf\x58\x74\x24\x24\xf5\x52\x84\x4e\xc4\xa0\x95\xa1\x69\xb3\x36\xe2\x75\x86\x6f\x4e\xc1\xce\x89\x03\x2f\xd3\x10\xa7\xef\x3f\xa9\x2e\xcf\x3d\xc1\x72\x6d\x4a\x9e\x62\x9a\x80\x74\x17\x12\x69\x5c\x0f\x65\x9b\xb7\x3f\xe9\x8e\xc1\x28\x66\xc4\xeb\x78\x1f\x41\xd5\x42\x72\xa8\x43\x89\x19\x56\x06\xb1\xea\x85\x78\xc2\x46\xbd\x89\x35\xf7\x3c\xd4\x17\x2c\x54\x46\x84\xd1\x4e\xbe\x73\xcc\x62\xfa\x2e\x62\x77\x96\x75\x60\x28\xcb\x88\x67\xfe\xa8\x9b\xd2\xee\xd9\xcf\x8f\xce\xfb\x2d\x9d\x33\xef\x07\x57\xe4\xb7\xdb\x36\x95\xe5\x18\x85\x03\xcd\x84\x26\xda\x8b\x3d\x1c\x5a\x1e\x60\x3c\xce\x32\x06\x1f\x6d\x5b\x76\x27\xfa\xf4\x8b\xcb\xb6\xba\xf5\x56\x9c\x5b\x62\xce\x0e\xa6\x25\xb9\xfd\xe0\xf4\x2f\x27\xbc\xdc\xc4\x77\x79\x71\x6e\x0e\x1e\x94\xe3\x37\xdc\x56\x15\x42\x54\xbc\xf5\x8d\x90\xc4\x14\x5a\xac\x28\x5e\x2f\x21\x52\x69\x2d\x71\x92\x0b\x71\x9d\x3e\xce\x47\x05\xfd\xf1\x44\x8e\x1f\x3f\xdc\x25\xd4\xac\xa5\x46\x17\x3f\xfe\xcf\x83\x9a\x9f\xad\x54\x78\x49\x0c\xd4\xdc\x6a\x4f\x3e\x9b\xd5\x44\xca\xd0\x31\x61\xf5\x3e\x18\x9f\xca\x7e\xb1\x3a\x25\x99\x57\x75\xeb\x0f\xc2\x1c\xab\x01\x11\xe9\x16\x82\xd7\x9d\x54\x0a\xaa\xe0\x3c\x8c\xf9\xd8\xdf\xfa\x29\x8c\x59\x3b\x04\xbb\x60\x15\x93\x63\x59\x37\x8c\xbd\x04\xde\x6c\xcc\x85\x61\xec\x5e\x85\x21\x10\x50\xb5\x89\xb6\x4e\xbb\x56\xbc\xfe\x9b\x97\xd5\xaf\xa6\x71\xde\x02\x12\x40\xf4\x97\x53\x54\xf8\xa0\xbc\x8a\xa2\x11\xe0\x0f\xa2\x01\x85\x6f\x2f\
xdb\xd5\x4c\x5a\x81\x1e\xb6\x68\x27\xec\x54\xd0\x6c\xad\xd1\x62\x1c\x38\x73\x66\xde\xcb\xe6\xb1\x0d\x64\xac\x26\xab\xd0\x74\x31\xf5\xe8\xa2\xf3\x8e\x0f\xf5\x45\x9a\xcf\x8d\x99\x97\xa3\xd8\x42\x5b\x23\xd1\xb2\xf8\x63\x7f\xed\xab\x05\x9e\x07\x95\x1d\x0e\xe0\xcf\x17\x92\xf4\xf5\x80\x8e\x5a\x16\x45\x8e\x28\xa0\x60\x09\x20\x2c\xc3\x32\x11\x7e\xab\x57\x8b\x12\x32\x73\xeb\x38\x9f\x8b\x76\x3c\x1c\x3c\x36\xd2\x07\x2d\xf6\xb3\x3c\x75\x72\xb3\x33\x07\xf2\x19\xbb\xa1\x5e\x82\x64\x10\x3b\x62\x47\x28\x30\xea\xb5\xf9\x01\x49\x1f\xae\x7c\x49\xa4\x77\xb4\x5b\xe2\x38\x2e\x7f\x60\xa5\x6c\x6c\x9d\xaf\xa0\xb7\xab\xe1\x93\x21\xac\x5b\x3f\x73\x86\x8b\x12\x14\x45\x46\x5c\x4f\x26\xab\xc0\x18\xf9\x76\xf8\xdf\xa1\x4e\xfc\xd6\x09\x27\x45\x85\xd5\xdf\xa4\x87\xe1\x9a\xab\x46\x93\x07\x04\xe2\x70\x7d\x50\x68\xfd\x69\x54\x31\x1b\xcc\x91\xbf\x08\xbd\x90\xc8\xee\xd1\xdb\x83\xe7\xf4\x52\x95\xd3\xa6\xb3\x06\x10\xdb\xc6\xba\x24\x97\x95\x56\x4f\x8c\x81\x71\x8d\x3d\x70\x14\xb6\xbb\x9b\x08\xfc\xea\xf5\xb9\x26\xe0\xad\x6d\xc0\x42\x44\xd7\x69\x92\x80\x63\xd5\x8c\x8c\x4f\x11\x27\x66\x27\x8e\xe8\xc9\xe7\x0f\xb2\x70\xa5\x10\xa9\x8a\xbc\x59\x07\xad\x87\x48\x00\x23\x17\xc2\x3f\xd4\x7a\x7b\xca\x8f\xa3\x6e\x14\x83\x19\xe4\xd4\xaa\xad\xb8\x57\xfb\x8f\x09\x43\x80\xe4\xa5\x8e\x34\x26\xbb\x80\x0b\xc4\xca\x99\xfc\x72\x29\x68\xe9\x33\x4b\x1b\xee\x29\x85\x23\x61\x5d\x9a\x1a\x9b\x81\xd1\x03\x74\xd9\x01\x05\x21\xab\xd0\x68\x03\xef\x2c\x04\xe9\x06\x0d\x68\xb8\x8d\x73\x4f\xfe\xa9\xa7\x61\x44\xb1\x3e\xd0\x1a\x8a\x87\xfe\x99\x6b\x78\xe6\x16\x2b\x5d\x71\x24\x63\x02\xa1\x6c\x3e\x31\x02\xc1\x54\xa2\xf0\xe2\x52\x37\xe3\x73\x38\x9a\xac\x04\x3c\xcf\x1d\x7f\x05\x1e\x84\x88\x2d\x63\x5a\x1f\x21\xa8\x43\x93\xd1\x84\x38\x37\x6a\x2d\x68\x32\x42\xea\x31\xe9\xaf\x62\x86\xd4\x6f\x15\x55\x60\xc1\xf3\xb1\x45\x71\xf3\x89\x16\x65\x80\x3b\xa8\x4c\x55\xe2\x04\xe5\xbe\xca\x23\x73\x71\xc1\x43\xa2\x82\x18\x1f\x90\x9d\xe1\x04\xfe\xa2\x78\xb6\x50\xac\x10\xd2\xab\x03\x3d\xee\xbb\x66\x8d\xf5\xbc\x1a\xca\x84\xbe\x21\x6a\xaf\xb6\x0e\x71\x29\xf9\xed\x5f\x8e\
# napalm_yang/models/openconfig/routing_policy/defined_sets/prefix_sets/prefix_set/prefixes/prefix/state/__init__.py
# -*- coding: utf-8 -*-
from operator import attrgetter
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType
from pyangbind.lib.yangtypes import RestrictedClassType
from pyangbind.lib.yangtypes import TypedListType
from pyangbind.lib.yangtypes import YANGBool
from pyangbind.lib.yangtypes import YANGListType
from pyangbind.lib.yangtypes import YANGDynClass
from pyangbind.lib.yangtypes import ReferenceType
from pyangbind.lib.base import PybindBase
from collections import OrderedDict
from decimal import Decimal
from bitarray import bitarray
import six
# PY3 support of some PY2 keywords (needs improvement)
if six.PY3:
    import builtins as __builtin__

    long = int
elif six.PY2:
    import __builtin__

class state(PybindBase):
    """
    This class was auto-generated by the PythonClass plugin for PYANG
    from YANG module openconfig-routing-policy - based on the path /routing-policy/defined-sets/prefix-sets/prefix-set/prefixes/prefix/state. Each member element of
    the container is represented as a class variable - with a specific
    YANG type.

    YANG Description: Operational state data for prefix definition
    """
    __slots__ = ('_path_helper', '_extmethods', '__ip_prefix', '__masklength_range',)

    _yang_name = 'state'
    _pybind_generated_by = 'container'

    def __init__(self, *args, **kwargs):
        self._path_helper = False
        self._extmethods = False
        self.__ip_prefix = YANGDynClass(base=[RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])/(([0-9])|([1-2][0-9])|(3[0-2]))'}),RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9a-fA-F]{1,4}:){7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:))/(12[0-8]|1[0-1][0-9]|[1-9][0-9]|[0-9])'}),], is_leaf=True, yang_name="ip-prefix", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/routing-policy', defining_module='openconfig-routing-policy', yang_type='oc-inet:ip-prefix', is_config=False)
        self.__masklength_range = YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '([0-9]+\\.\\.[0-9]+)|exact'}), is_leaf=True, yang_name="masklength-range", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/routing-policy', defining_module='openconfig-routing-policy', yang_type='string', is_config=False)

        load = kwargs.pop("load", None)
        if args:
            if len(args) > 1:
                raise TypeError("cannot create a YANG container with >1 argument")
            all_attr = True
            for e in self._pyangbind_elements:
                if not hasattr(args[0], e):
                    all_attr = False
                    break
            if not all_attr:
                raise ValueError("Supplied object did not have the correct attributes")
            for e in self._pyangbind_elements:
                nobj = getattr(args[0], e)
                if nobj._changed() is False:
                    continue
                setmethod = getattr(self, "_set_%s" % e)
                if load is None:
                    setmethod(getattr(args[0], e))
                else:
                    setmethod(getattr(args[0], e), load=load)

    def _path(self):
        if hasattr(self, "_parent"):
            return self._parent._path() + [self._yang_name]
        else:
            return ['routing-policy', 'defined-sets', 'prefix-sets', 'prefix-set', 'prefixes', 'prefix', 'state']

    def _get_ip_prefix(self):
        """
        Getter method for ip_prefix, mapped from YANG variable /routing_policy/defined_sets/prefix_sets/prefix_set/prefixes/prefix/state/ip_prefix (oc-inet:ip-prefix)

        YANG Description: The prefix member in CIDR notation -- while the
        prefix may be either IPv4 or IPv6, most
        implementations require all members of the prefix set
        to be the same address family. Mixing address types in
        the same prefix set is likely to cause an error.
        """
        return self.__ip_prefix

    def _set_ip_prefix(self, v, load=False):
        """
        Setter method for ip_prefix, mapped from YANG variable /routing_policy/defined_sets/prefix_sets/prefix_set/prefixes/prefix/state/ip_prefix (oc-inet:ip-prefix)
        If this variable is read-only (config: false) in the
        source YANG file, then _set_ip_prefix is considered as a private
        method. Backends looking to populate this variable should
        do so via calling thisObj._set_ip_prefix() directly.

        YANG Description: The prefix member in CIDR notation -- while the
        prefix may be either IPv4 or IPv6, most
        implementations require all members of the prefix set
        to be the same address family. Mixing address types in
        the same prefix set is likely to cause an error.
        """
        if hasattr(v, "_utype"):
            v = v._utype(v)
        try:
            t = YANGDynClass(v, base=[RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])/(([0-9])|([1-2][0-9])|(3[0-2]))'}),RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9a-fA-F]{1,4}:){7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:))/(12[0-8]|1[0-1][0-9]|[1-9][0-9]|[0-9])'}),], is_leaf=True, yang_name="ip-prefix", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/routing-policy', defining_module='openconfig-routing-policy', yang_type='oc-inet:ip-prefix', is_config=False)
        except (TypeError, ValueError):
            raise ValueError({
                'error-string': """ip_prefix must be of a type compatible with oc-inet:ip-prefix""",
                'defined-type': "oc-inet:ip-prefix",
                'generated-type': """YANGDynClass(base=[RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])/(([0-9])|([1-2][0-9])|(3[0-2]))'}),RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9a-fA-F]{1,4}:){7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:))/(12[0-8]|1[0-1][0-9]|[1-9][0-9]|[0-9])'}),], is_leaf=True, yang_name="ip-prefix", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/routing-policy', defining_module='openconfig-routing-policy', yang_type='oc-inet:ip-prefix', is_config=False)""",
            })
        self.__ip_prefix = t
        if hasattr(self, '_set'):
            self._set()

    def _unset_ip_prefix(self):
        self.__ip_prefix = YANGDynClass(base=[RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])/(([0-9])|([1-2][0-9])|(3[0-2]))'}),RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9a-fA-F]{1,4}:){7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:))/(12[0-8]|1[0-1][0-9]|[1-9][0-9]|[0-9])'}),], is_leaf=True, yang_name="ip-prefix", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/routing-policy', defining_module='openconfig-routing-policy', yang_type='oc-inet:ip-prefix', is_config=False)

    def _get_masklength_range(self):
        """
        Getter method for masklength_range, mapped from YANG variable /routing_policy/defined_sets/prefix_sets/prefix_set/prefixes/prefix/state/masklength_range (string)

        YANG Description: Defines a range for the masklength, or 'exact' if
        the prefix has an exact length.

        Example: 10.3.192.0/21 through 10.3.192.0/24 would be
        expressed as prefix: 10.3.192.0/21,
        masklength-range: 21..24.

        Example: 10.3.192.0/21 would be expressed as
        prefix: 10.3.192.0/21,
        masklength-range: exact
        """
        return self.__masklength_range

    def _set_masklength_range(self, v, load=False):
        """
        Setter method for masklength_range, mapped from YANG variable /routing_policy/defined_sets/prefix_sets/prefix_set/prefixes/prefix/state/masklength_range (string)
        If this variable is read-only (config: false) in the
        source YANG file, then _set_masklength_range is considered as a private
        method. Backends looking to populate this variable should
        do so via calling thisObj._set_masklength_range() directly.

        YANG Description: Defines a range for the masklength, or 'exact' if
        the prefix has an exact length.

        Example: 10.3.192.0/21 through 10.3.192.0/24 would be
        expressed as prefix: 10.3.192.0/21,
        masklength-range: 21..24.

        Example: 10.3.192.0/21 would be expressed as
        prefix: 10.3.192.0/21,
        masklength-range: exact
        """
        if hasattr(v, "_utype"):
            v = v._utype(v)
        try:
            t = YANGDynClass(v, base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '([0-9]+\\.\\.[0-9]+)|exact'}), is_leaf=True, yang_name="masklength-range", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/routing-policy', defining_module='openconfig-routing-policy', yang_type='string', is_config=False)
        except (TypeError, ValueError):
            raise ValueError({
                'error-string': """masklength_range must be of a type compatible with string""",
                'defined-type': "string",
                'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '([0-9]+\\.\\.[0-9]+)|exact'}), is_leaf=True, yang_name="masklength-range", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/routing-policy', defining_module='openconfig-routing-policy', yang_type='string', is_config=False)""",
            })
        self.__masklength_range = t
        if hasattr(self, '_set'):
            self._set()

    def _unset_masklength_range(self):
        self.__masklength_range = YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '([0-9]+\\.\\.[0-9]+)|exact'}), is_leaf=True, yang_name="masklength-range", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/routing-policy', defining_module='openconfig-routing-policy', yang_type='string', is_config=False)

    ip_prefix = __builtin__.property(_get_ip_prefix)
    masklength_range = __builtin__.property(_get_masklength_range)

    _pyangbind_elements = OrderedDict([('ip_prefix', ip_prefix), ('masklength_range', masklength_range)])
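The `masklength-range` leaf above restricts values to `<low>..<high>` or the literal `exact`. The check below is a standalone sketch of that restriction, reusing the same pattern string as the generated `restriction_dict`; it is an illustration only, not part of the generated bindings, and it uses `re.fullmatch` whereas pyangbind's internal matching details may differ.

```python
import re

# Pattern copied from the masklength-range restriction above:
# either a "low..high" range of decimal masklengths, or the token "exact".
MASKLENGTH_RANGE_RE = re.compile(r"([0-9]+\.\.[0-9]+)|exact")


def is_valid_masklength_range(value):
    """Return True if ``value`` satisfies the masklength-range pattern."""
    return MASKLENGTH_RANGE_RE.fullmatch(value) is not None
```

For example, `is_valid_masklength_range("21..24")` and `is_valid_masklength_range("exact")` accept, while a dash-separated range such as `"21-24"` is rejected.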

# cscl_tests/examples/smt_qfbv_solver/unit/test_sexp_parser.py
import unittest
import cscl_examples.smt_qfbv_solver.sexp_parser as sep
class TestLexSExp(unittest.TestCase):
    def test_returns_empty_seq_for_empty_string(self):
        result = list(sep.lex_sexp(""))
        self.assertEqual(result, [], "Unexpected result " + str(result))
    def test_returns_empty_seq_for_whitespace_string(self):
        result = list(sep.lex_sexp(" \t\n \r"))
        self.assertEqual(result, [], "Unexpected result " + str(result))
    def test_lex_single_open_paren(self):
        result = list(sep.lex_sexp("("))
        self.assertEqual(result, ["("], "Unexpected result " + str(result))
    def test_lex_single_close_paren(self):
        result = list(sep.lex_sexp(")"))
        self.assertEqual(result, [")"], "Unexpected result " + str(result))
    def test_lex_word(self):
        result = list(sep.lex_sexp("foo"))
        self.assertEqual(result, ["foo"], "Unexpected result " + str(result))
    def test_lex_word_with_parens(self):
        result = list(sep.lex_sexp("(foo)"))
        self.assertEqual(result, ["(", "foo", ")"], "Unexpected result " + str(result))
    def test_lex_word_with_nested_parens(self):
        result = list(sep.lex_sexp("(foo ( bar))"))
        self.assertEqual(result, ["(", "foo", "(", "bar", ")", ")"], "Unexpected result " + str(result))
class TestParseSExp(unittest.TestCase):
def test_returns_empty_list_for_empty_string(self):
result = sep.parse_sexp(iter([]))
self.assertEqual(result, [], "Unexpected result " + str(result))
def test_parse_word(self):
result = sep.parse_sexp(iter(["foo"]))
self.assertEqual(result, ["foo"], "Unexpected result " + str(result))
def test_parse_two_words(self):
result = sep.parse_sexp(iter(["foo", "bar"]))
self.assertEqual(result, ["foo", "bar"], "Unexpected result " + str(result))
def test_parse_list_expr(self):
result = sep.parse_sexp(iter(["(", "foo", ")"]))
self.assertEqual(result, [["foo"]], "Unexpected result " + str(result))
def test_parse_two_list_exprs(self):
result = sep.parse_sexp(iter(["(", "foo", ")", "(", "bar", ")"]))
self.assertEqual(result, [["foo"], ["bar"]], "Unexpected result " + str(result))
def test_parse_nested_list_exprs(self):
result = sep.parse_sexp(iter(["(", "foo", "1", "2", "(", "bar", "(", "baz", "0", ")", "bam",
")", ")", "(", "bar", ")"]))
self.assertEqual(result, [["foo", "1", "2", ["bar", ["baz", "0"], "bam"]], ["bar"]],
"Unexpected result " + str(result))
def test_refuses_malformed_sexp(self):
with self.assertRaises(ValueError):
sep.parse_sexp(iter(["(", "foo", "(", "bar", ")"]))
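The assertions above fully pin down the expected lexer and parser behavior. A minimal reference implementation consistent with those tests — a sketch inferred from the assertions, not the actual `cscl_examples.smt_qfbv_solver.sexp_parser` module — could look like:

```python
def lex_sexp(text):
    """Yield '(' and ')' as single tokens plus whitespace-separated words."""
    word = ""
    for ch in text:
        if ch in "()" or ch.isspace():
            if word:
                yield word
                word = ""
            if ch in "()":
                yield ch
        else:
            word += ch
    if word:
        yield word


def parse_sexp(tokens):
    """Build nested lists from a token iterator; raise ValueError on unbalanced parens."""
    stack = [[]]
    for token in tokens:
        if token == "(":
            stack.append([])
        elif token == ")":
            if len(stack) == 1:
                raise ValueError("Unbalanced ')'")
            closed = stack.pop()
            stack[-1].append(closed)
        else:
            stack[-1].append(token)
    if len(stack) != 1:
        raise ValueError("Unbalanced '('")
    return stack[0]
```

With these definitions, `list(lex_sexp("(foo ( bar))"))` yields exactly the token sequence the lexer tests expect, and an unclosed `(` raises the `ValueError` that `test_refuses_malformed_sexp` checks for.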
f4c24673a2c1bfecd8e89654e5ae63e1e5b67072 | 11,608 | py | Python | app/tests/unit_tests/test_database_wrapper.py | regg00/docker-xmedius-adsync | 738bdb82d37173c9363d69c8148a95c43043aeba | [
"MIT"
] | null | null | null | app/tests/unit_tests/test_database_wrapper.py | regg00/docker-xmedius-adsync | 738bdb82d37173c9363d69c8148a95c43043aeba | [
"MIT"
] | null | null | null | app/tests/unit_tests/test_database_wrapper.py | regg00/docker-xmedius-adsync | 738bdb82d37173c9363d69c8148a95c43043aeba | [
"MIT"
] | null | null | null |

#////////////////////////////////////////////////////////////////////////////
# Copyright (c) 2012 Sagemcom Canada Permission to use this work
# for any purpose must be obtained in writing from Sagemcom Canada
# 5252 de Maisonneuve Blvd. West, suite 400, Montreal, Quebec H4A 3S5
#////////////////////////////////////////////////////////////////////////////
import os
import unittest
from mock import patch, Mock, MagicMock
from libs.sql.database_wrapper import DatabaseWrapper
import sqlite3
@patch.object(DatabaseWrapper, '_create_database', Mock())
@patch.object(DatabaseWrapper, '_create_tables', Mock())
def build_database_mock():
if os.path.exists("database.db"):
os.remove("database.db")
return DatabaseWrapper()
def build_and_init_database_mock(table_func):
mock = build_database_mock()
mock._create_database()
getattr(mock, table_func)()
return mock
def reset_database(database):
database.terminate()
if os.path.exists("database.db"):
os.remove("database.db")
def connect_and_fetch(query):
conn_handle = sqlite3.connect("database.db")
cursor = conn_handle.cursor()
cursor.execute(query)
content = cursor.fetchall()
cursor.close()
conn_handle.close()
return content
class TestCreateDatabase(unittest.TestCase):
def setUp(self):
self.database_mock = build_database_mock()
def tearDown(self):
reset_database(self.database_mock)
def test_success(self):
self.database_mock._create_database()
self.assertTrue(os.path.exists("database.db"))
class TestCreateAllTables(unittest.TestCase):
def setUp(self):
self.database_mock = build_database_mock()
self.database_mock._create_retry_table = MagicMock()
self.database_mock._create_users_table = MagicMock()
self.database_mock._create_usn_table = MagicMock()
self.database_mock._create_fax_assign_table = MagicMock()
self.database_mock._create_fax_unassign_table = MagicMock()
def tearDown(self):
reset_database(self.database_mock)
def test_success(self):
self.database_mock._create_tables()
class TestCreateUsnTable(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_usn_table')
def tearDown(self):
reset_database(self.database_mock)
def test_create_usn_table(self):
if not self.database_mock._execute_and_fetch("SELECT name FROM sqlite_master WHERE type='table' AND name='usnInfos'"):
self.fail()
class TestCreateUserTable(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_users_table')
def tearDown(self):
reset_database(self.database_mock)
def test_create_users_table(self):
if not self.database_mock._execute_and_fetch("SELECT name FROM sqlite_master WHERE type='table' AND name='usersInfos'"):
self.fail()
class TestCreateRetryTable(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_retry_table')
def tearDown(self):
reset_database(self.database_mock)
def test_create_retry_table(self):
if not self.database_mock._execute_and_fetch("SELECT name FROM sqlite_master WHERE type='table' AND name='retryInfos'"):
self.fail()
class TestGetLastSuccessfulUsn(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_usn_table')
def tearDown(self):
reset_database(self.database_mock)
def test_success(self):
self.database_mock._execute_and_fetch = MagicMock(return_value=[(10,)])
self.assertEqual(10, self.database_mock.get_last_successful_usn())
def test_fail(self):
        self.database_mock._execute_and_fetch = Mock(side_effect=Exception)
self.assertRaises(Exception, self.database_mock.get_last_successful_usn)
class TestSetLastSuccessfulUsn(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_usn_table')
def tearDown(self):
reset_database(self.database_mock)
@patch.object(DatabaseWrapper, '_cursor', None, create=True)
def test_success(self):
self.database_mock.set_last_successful_usn(10)
self.assertEqual([(10,)], self.database_mock._execute_and_fetch("SELECT lastSuccessfulUSN FROM usnInfos"))
def test_fail(self):
        self.database_mock.set_last_successful_usn = Mock(side_effect=Exception)
self.assertRaises(Exception, self.database_mock.set_last_successful_usn, 0)
class TestAddUser(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_users_table')
def tearDown(self):
reset_database(self.database_mock)
def test_add(self):
amount_before = len(self.database_mock._execute_and_fetch('SELECT * FROM usersInfos'))
self.database_mock.add_user('1', 0, 0, 0)
amount_after = len(self.database_mock._execute_and_fetch('SELECT * FROM usersInfos'))
self.assertEqual(amount_after, amount_before + 1)
def test_add_twice_same_user(self):
self.database_mock.add_user('0', 0, 0, 0)
        self.assertRaises(Exception, self.database_mock.add_user, '0', 0, 0, 0)
def test_wrong_parameters(self):
self.assertRaises(Exception, self.database_mock.add_user, Mock(), None, None)
class TestModifyUser(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_users_table')
def tearDown(self):
reset_database(self.database_mock)
def test_modify_fax(self):
try:
self.database_mock.add_user('0', 0, 0, 0)
        except Exception:
query = 'INSERT INTO usersInfos VALUES (?,?,?,?)'
self.database_mock._cursor.execute(query, ['0', 0, 0, 0])
self.database_mock._connectionHandler.commit()
self.database_mock.modify_fax('0', 123)
entry = self.database_mock._execute_and_fetch('SELECT * FROM usersInfos')
self.assertEqual(123, entry[0][2])
def test_modify_unexisting_user(self):
self.assertRaises(Exception, self.database_mock.modify_fax, -1, None)
def test_wrong_parameters(self):
self.assertRaises(Exception, self.database_mock.modify_fax, Mock(), None)
class TestRemoveUser(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_users_table')
def tearDown(self):
reset_database(self.database_mock)
def test_remove_user(self):
try:
self.database_mock.add_user('0', 0, 0, 0)
        except Exception:
query = 'INSERT INTO usersInfos VALUES (?,?,?,?)'
self.database_mock._cursor.execute(query, ['0', 0, 0, 0])
self.database_mock._connectionHandler.commit()
amount_before = len(self.database_mock._execute_and_fetch('SELECT * FROM usersInfos'))
try:
self.database_mock.remove_user('0')
        except Exception:
self.fail()
amount_after = len(self.database_mock._execute_and_fetch('SELECT * FROM usersInfos'))
self.assertEqual(amount_after, amount_before - 1)
def test_remove_twice_same_user(self):
try:
self.database_mock.add_user('1', 0, 0, 0)
        except Exception:
query = 'INSERT INTO usersInfos VALUES (?,?,?,?)'
self.database_mock._cursor.execute(query, ['1', 0, 0, 0])
self.database_mock._connectionHandler.commit()
self.database_mock.remove_user('1')
self.assertRaises(Exception, self.database_mock.remove_user, '1')
def test_wrong_parameters(self):
self.assertRaises(Exception, self.database_mock.remove_user, Mock())
class TestGetUserInfos(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_users_table')
def tearDown(self):
reset_database(self.database_mock)
def test_with_results(self):
try:
self.database_mock.add_user('0', 0, 0, 0)
        except Exception:
query = 'INSERT INTO usersInfos VALUES (?,?,?,?)'
self.database_mock._cursor.execute(query, ['0', 0, 0, 0])
self.database_mock._connectionHandler.commit()
self.assertEqual({'fax': 0, 'id': 0}, self.database_mock.get_user_infos('0'))
def test_no_results(self):
self.assertEqual({}, self.database_mock.get_user_infos("TEST"))
class TestAddRetryEntry(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_retry_table')
def tearDown(self):
reset_database(self.database_mock)
def test_add(self):
amount_before = len(self.database_mock._execute_and_fetch('SELECT * FROM retryInfos'))
self.database_mock.add_retry_entry('0')
amount_after = len(self.database_mock._execute_and_fetch('SELECT * FROM retryInfos'))
self.assertEqual(amount_after, amount_before + 1)
def test_add_twice_same_user(self):
amount_before = len(self.database_mock._execute_and_fetch('SELECT * FROM retryInfos'))
self.database_mock.add_retry_entry('1')
self.database_mock.add_retry_entry('1')
amount_after = len(self.database_mock._execute_and_fetch('SELECT * FROM retryInfos'))
self.assertEqual(amount_after, amount_before + 1)
def test_wrong_parameters(self):
self.assertRaises(Exception, self.database_mock.add_retry_entry, Mock(), None, None)
class TestRemoveRetryEntry(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_retry_table')
def tearDown(self):
reset_database(self.database_mock)
def test_remove_user(self):
try:
self.database_mock.add_retry_entry('0')
        except Exception:
query = 'INSERT INTO retryInfos VALUES (?)'
self.database_mock._cursor.execute(query, ['0'])
self.database_mock._connectionHandler.commit()
amount_before = len(self.database_mock._execute_and_fetch('SELECT * FROM retryInfos'))
self.database_mock.remove_retry_entry('0')
amount_after = len(self.database_mock._execute_and_fetch('SELECT * FROM retryInfos'))
self.assertEqual(amount_after, amount_before - 1)
def test_remove_twice_same_user(self):
try:
self.database_mock.add_retry_entry('0')
        except Exception:
query = 'INSERT INTO retryInfos VALUES (?)'
self.database_mock._cursor.execute(query, ['0'])
self.database_mock._connectionHandler.commit()
self.database_mock.remove_retry_entry('0')
self.assertRaises(Exception, self.database_mock.remove_retry_entry, '0')
def test_wrong_parameters(self):
        self.assertRaises(Exception, self.database_mock.remove_retry_entry, Mock())
class TestGetRetryEntries(unittest.TestCase):
def setUp(self):
self.database_mock = build_and_init_database_mock('_create_retry_table')
def tearDown(self):
reset_database(self.database_mock)
def test_with_results(self):
try:
self.database_mock.add_retry_entry('0')
        except Exception:
query = 'INSERT INTO retryInfos VALUES (?)'
self.database_mock._cursor.execute(query, ['0'])
self.database_mock._connectionHandler.commit()
self.assertEqual(['0'], self.database_mock.get_retry_entries())
def test_no_results(self):
self.assertEqual([], self.database_mock.get_retry_entries())
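These tests exercise a small sqlite3-backed surface: `_execute_and_fetch`, `_cursor`, `_connectionHandler`, and `terminate`. A minimal stand-in consistent with that usage — an illustrative sketch with assumed names, not the real `DatabaseWrapper` — might look like:

```python
import sqlite3


class MiniDatabaseWrapper:
    """Sketch of the wrapper surface the tests above rely on (names assumed)."""

    def __init__(self, path=":memory:"):
        self._connectionHandler = sqlite3.connect(path)
        self._cursor = self._connectionHandler.cursor()

    def _execute_and_fetch(self, query, parameters=()):
        # Execute one statement and return all rows it produced.
        self._cursor.execute(query, parameters)
        return self._cursor.fetchall()

    def terminate(self):
        self._cursor.close()
        self._connectionHandler.close()
```

Using `:memory:` keeps illustrative runs from leaving a `database.db` file behind — the same cleanup chore `reset_database` performs for the on-disk tests.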
f4e7f4519377fcdbf915e2e388bdef90a1825f4a | 20,165 | py | Python | replications/Angrist_Krueger_1991/auxiliary/tables.py | JonathanWillnow/ose-course-data-science | a8aea30e3811fbac35eec141f86e6b48047e69a3 | [
"MIT"
] | null | null | null | replications/Angrist_Krueger_1991/auxiliary/tables.py | JonathanWillnow/ose-course-data-science | a8aea30e3811fbac35eec141f86e6b48047e69a3 | [
"MIT"
] | null | null | null | replications/Angrist_Krueger_1991/auxiliary/tables.py | JonathanWillnow/ose-course-data-science | a8aea30e3811fbac35eec141f86e6b48047e69a3 | [
"MIT"
] | null | null | null |

def create_table_qob(results, outcome_variables=None):
table = """
<table>
<thead>
<tr>
<th>Outcome variable</th>
<th>Birth cohort</th>
<th>Mean</th>
<th colspan="3">Quarter-of-birth effect</th>
<th>F-test</th>
</tr>
<tr>
<th></th><th></th><th></th>
<th>I</th>
<th>II</th>
<th>III</th>
<th>[P-value]</th>
</tr>
</thead>
<tbody>
"""
if outcome_variables:
for out_var, rslt in zip(outcome_variables, results):
table += create_table_row_qob(out_var, rslt["cohort"], rslt["mean"], rslt["ols"])
else:
for rslt in results:
table += create_table_row_qob(rslt["var"], rslt["cohort"], rslt["mean"], rslt["ols"])
table += """
</tbody>
</table>
"""
return table
def create_table_row_qob(outcome_variable, cohort, mean, ols):
table_row = f"""
<tr>
<td>{outcome_variable}</td>
<td>{cohort}</td>
<td>{mean:5.2f}</td>
<td>{ols.params['DUMMY_QOB_1']:6.3f}</td>
<td>{ols.params['DUMMY_QOB_2']:6.3f}</td>
<td>{ols.params['DUMMY_QOB_3']:6.3f}</td>
<td>{ols.fvalue:6.1f}</td>
</tr>
<tr>
<td></td><td></td><td></td>
<td>({ols.bse['DUMMY_QOB_1']:5.3f})</td>
<td>({ols.bse['DUMMY_QOB_2']:5.3f})</td>
<td>({ols.bse['DUMMY_QOB_3']:5.3f})</td>
<td>[{ols.f_pvalue:6.4f}]</td>
</tr>
"""
return table_row
def create_table_wald_estimates(title, results):
table = f"""
<table>
<thead>
<tr>
<th colspan="4">
{title}
</th>
</tr>
<tr>
<th></th>
<th>(1)<br>Born<br>in 1st quarter<br>of year</th>
<th>(2)<br>Born in 2nd,<br>3rd, or 4th<br>quarter of year</th>
<th>(3)<br>Difference<br>(std.error)<br>(1) - (2)</th>
</tr>
</thead>
<tbody>
<tr>
<td>ln (wkly. wage)</td>
<td>{results['wage_1st']:6.4f}</td>
<td>{results['wage_other']:6.4f}</td>
<td>{results['wage_diff']:6.5f}</td>
</tr>
<tr>
<td></td><td></td><td></td>
<td>({results['wage_err']:6.5f})</td>
</tr>
<tr>
<td>Education</td>
<td>{results['educ_1st']:6.4f}</td>
<td>{results['educ_other']:6.4f}</td>
<td>{results['educ_diff']:6.5f}</td>
</tr>
<tr>
<td></td><td></td><td></td>
<td>({results['educ_err']:6.5f})</td>
</tr>
<tr>
<td>Wald est. of return to education</td>
<td></td><td></td>
<td>{results['wald_est']:6.5f}</td>
</tr>
<tr>
<td></td><td></td><td></td>
<td>({results['wald_err']:6.5f})</td>
</tr>
<tr>
<td>OLS return to education</td>
<td></td><td></td>
<td>{results['ols_est']:6.5f}</td>
</tr>
<tr>
<td></td><td></td><td></td>
<td>({results['ols_err']:6.5f})</td>
</tr>
</tbody>
</table>
"""
return table
def create_table_4_5_6(results):
table = """
<table>
<thead>
<tr>
<th></th>
"""
table += "\n".join([f"<th>({i})</th>" for i in range(1, 9)])
table += """
</tr>
<tr>
<th>Independent variable</th>
"""
table += "\n".join([f"<th>{method}</th>" for method in ["OLS", "TSLS"] * 4])
table += """
</tr>
</thead>
<tbody>
<tr>
<td>Years of education</td>
"""
table += "\n".join([f'<td>{rslt.params["EDUC"]:5.4f}</td>' for rslt in results])
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join([f'<td>({rslt.bse["EDUC"]:5.4f})</td>' for rslt in results])
table += """
</tr>
<tr>
<td>Race (1 = black)</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["RACE"]:6.4f}</td>' if "RACE" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["RACE"]:6.4f})</td>' if "RACE" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td>SMSA (1 = center city)</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["SMSA"]:6.4f}</td>' if "SMSA" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["SMSA"]:6.4f})</td>' if "SMSA" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td>Married (1 = married)</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["MARRIED"]:6.4f}</td>' if "MARRIED" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["MARRIED"]:6.4f})</td>' if "MARRIED" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td>9 Year-of-birth dummies</td>
"""
table += "\n".join(["<td>Yes</td>"] * 8)
table += """
</tr>
<tr>
<td>8 Region-of-residence dummies</td>
"""
table += "\n".join(["<td>No</td>"] * 4 + ["<td>Yes</td>"] * 4)
table += """
</tr>
<tr>
<td>Age</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["AGEQ"]:6.4f}</td>' if "AGEQ" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["AGEQ"]:6.4f})</td>' if "AGEQ" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td>Age Squared</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["AGESQ"]:6.4f}</td>' if "AGESQ" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["AGESQ"]:6.4f})</td>' if "AGESQ" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
</tbody>
</table>
"""
return table
def create_table_7_8(results, race=True):
table = """
<table>
<thead>
<tr>
<th></th>
"""
table += "\n".join([f"<th>({i})</th>" for i in range(1, 9)])
table += """
</tr>
<tr>
<th>Independent variable</th>
"""
table += "\n".join([f"<th>{method}</th>" for method in ["OLS", "TSLS"] * 4])
table += """
</tr>
</thead>
<tbody>
<tr>
<td>Years of education</td>
"""
table += "\n".join([f'<td>{rslt.params["EDUC"]:5.4f}</td>' for rslt in results])
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join([f'<td>({rslt.bse["EDUC"]:5.4f})</td>' for rslt in results])
if race:
table += """
</tr>
<tr>
<td>Race (1 = black)</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["RACE"]:6.4f}</td>' if "RACE" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["RACE"]:6.4f})</td>' if "RACE" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td>SMSA (1 = center city)</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["SMSA"]:6.4f}</td>' if "SMSA" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["SMSA"]:6.4f})</td>' if "SMSA" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td>Married (1 = married)</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["MARRIED"]:6.4f}</td>' if "MARRIED" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["MARRIED"]:6.4f})</td>' if "MARRIED" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td>9 Year-of-birth dummies</td>
"""
table += "\n".join(["<td>Yes</td>"] * 8)
table += """
</tr>
<tr>
<td>8 Region-of-residence dummies</td>
"""
table += "\n".join(["<td>No</td>"] * 4 + ["<td>Yes</td>"] * 4)
if race:
table += """
</tr>
<tr>
<td>50 State-of-birth dummies</td>
"""
else:
table += """
</tr>
<tr>
<td>49 State-of-birth dummies</td>
"""
table += "\n".join(["<td>Yes</td>"] * 8)
table += """
</tr>
<tr>
<td>Age</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["AGEQ"]:6.4f}</td>' if "AGEQ" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["AGEQ"]:6.4f})</td>' if "AGEQ" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td>Age Squared</td>
"""
table += "\n".join(
[
f'<td>{rslt.params["AGESQ"]:6.4f}</td>' if "AGESQ" in rslt.params else "<td>-</td>"
for rslt in results
]
)
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join(
[
f'<td>({rslt.bse["AGESQ"]:6.4f})</td>' if "AGESQ" in rslt.bse else "<td></td>"
for rslt in results
]
)
table += """
</tr>
</tbody>
</table>
"""
return table
def create_table_mstly_hrmlss_ecnmtrcs_4_6_2(tsls, liml, f_test):
table = """
<table>
<thead>
<tr>
<th></th>
<th>(1)</th>
<th>(2)</th>
<th>(3)</th>
<th>(4)</th>
<th>(5)</th>
<th>(6)</th>
</tr>
</thead>
<tbody>
<tr>
<td>2SLS</td>
"""
table += "\n".join([f'<td>{rslt.params["EDUC"]:5.4f}</td>' for rslt in tsls])
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join([f'<td>({rslt.bse["EDUC"]:5.4f})</td>' for rslt in tsls])
table += """
</tr>
<tr>
<td>LIML</td>
"""
table += "\n".join([f'<td>{rslt.params["EDUC"]:5.4f}</td>' for rslt in liml])
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join([f'<td>({rslt.std_errors["EDUC"]:5.4f})</td>' for rslt in liml])
table += """
</tr>
<tr>
        <td>F-statistic<br>(excluded instruments)</td>
"""
table += "\n".join(
[f"<td>{f.fvalue[0][0]:5.4f}</td>" if f is not None else "<td></td>" for f in f_test]
)
table += """
</tr>
<tr><td></td><td></td><td></td><td></td><td></td><td></td><td></td></tr>
<tr>
<td>Controls</td>
<td></td><td></td><td></td><td></td><td></td><td></td>
</tr>
<tr>
<td>Year of birth</td>
<td>x</td><td>x</td><td>x</td><td>x</td><td>x</td><td>x</td>
</tr>
<tr>
<td>State of birth</td>
<td></td><td></td><td></td><td></td><td>x</td><td>x</td>
</tr>
<tr>
<td>Age, age squared</td>
<td></td><td>x</td><td></td><td>x</td><td></td><td>x</td>
</tr>
<tr><td></td><td></td><td></td><td></td><td></td><td></td><td></td></tr>
<tr>
<td>Excluded instruments</td>
<td></td><td></td><td></td><td></td><td></td>
</tr>
<tr>
<td>Quarter-of-birth dummies</td>
<td>x</td><td>x</td><td></td><td></td><td></td><td></td>
</tr>
<tr>
<td>Quarter of birth * year of birth</td>
<td></td><td></td><td>x</td><td>x</td><td>x</td><td>x</td>
</tr>
<tr>
        <td>Quarter of birth * state of birth</td>
<td></td><td></td><td></td><td></td><td>x</td><td>x</td>
</tr>
<tr>
<td>Number of Excluded instruments</td>
<td>3</td><td>2</td><td>30</td><td>28</td><td>180</td><td>178</td>
</tr>
</tbody>
</table>
"""
return table
def create_weak_instruments_table_1(results, f_test, partial_rsquared):
table = """
<table>
<thead>
<tr>
<th></th>
<th>(1)</th>
<th>(2)</th>
<th>(3)</th>
<th>(4)</th>
<th>(5)</th>
<th>(6)</th>
</tr>
<tr>
<th></th>
<th>OLS</th>
<th>IV</th>
<th>OLS</th>
<th>IV</th>
<th>OLS</th>
<th>IV</th>
</tr>
</thead>
<tbody>
<tr>
<td>Coefficient</td>
"""
table += "\n".join([f'<td>{rslt.params["EDUC"]:5.4f}</td>' for rslt in results])
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join([f'<td>({rslt.bse["EDUC"]:5.4f})</td>' for rslt in results])
table += """
</tr>
<tr>
<td>F(excluded instruments)</td>
"""
table += "\n".join(
[f"<td>{f.fvalue[0][0]:5.4f}</td>" if f is not None else "<td></td>" for f in f_test]
)
table += """
</tr>
<tr>
        <td>Partial R squared (excluded instruments, x100)</td>
"""
table += "\n".join(
[
f"<td>{r_sq * 100:5.4f}</td>" if r_sq is not None else "<td></td>"
for r_sq in partial_rsquared
]
)
table += """
</tr>
<tr><td></td><td></td><td></td><td></td><td></td><td></td><td></td></tr>
<tr>
<td>Age Control Variables</td>
<td></td><td></td><td></td><td></td><td></td><td></td>
</tr>
<tr>
<td>Age, age squared</td>
<td>x</td><td>x</td><td></td><td></td><td>x</td><td>x</td>
</tr>
<tr>
<td>9 Year of birth dummies</td>
<td></td><td></td><td>x</td><td>x</td><td>x</td><td>x</td>
</tr>
<tr><td></td><td></td><td></td><td></td><td></td><td></td><td></td></tr>
<tr>
<td>Excluded instruments</td>
<td></td><td></td><td></td><td></td><td></td><td></td>
</tr>
<tr>
<td>Quarter-of-birth dummies</td>
<td></td><td>x</td><td></td><td>x</td><td></td><td>x</td>
</tr>
<tr>
<td>Quarter of birth * year of birth</td>
<td></td><td></td><td></td><td></td><td>x</td><td>x</td>
</tr>
<tr>
<td>Number of Excluded instruments</td>
<td></td><td>3</td><td></td><td>30</td><td></td><td>28</td>
</tr>
</tbody>
</table>
"""
return table
def create_weak_instruments_table_2(results, f_test, partial_rsquared):
table = """
<table>
<thead>
<tr>
<th></th>
<th>(1)</th>
<th>(2)</th>
<th>(3)</th>
<th>(4)</th>
</tr>
<tr>
<th></th>
<th>OLS</th>
<th>IV</th>
<th>OLS</th>
<th>IV</th>
</tr>
</thead>
<tbody>
<tr>
<td>Coefficient</td>
"""
table += "\n".join([f'<td>{rslt.params["EDUC"]:5.4f}</td>' for rslt in results])
table += """
</tr>
<tr>
<td></td>
"""
table += "\n".join([f'<td>({rslt.bse["EDUC"]:5.4f})</td>' for rslt in results])
table += """
</tr>
<tr>
<td>F(excluded instruments)</td>
"""
table += "\n".join(
[f"<td>{f.fvalue[0][0]:5.4f}</td>" if f is not None else "<td></td>" for f in f_test]
)
table += """
</tr>
<tr>
        <td>Partial R squared (excluded instruments, x100)</td>
"""
table += "\n".join(
[
f"<td>{r_sq * 100:5.4f}</td>" if r_sq is not None else "<td></td>"
for r_sq in partial_rsquared
]
)
table += """
</tr>
<tr>
<td>Age Control Variables</td>
<td></td><td></td><td></td><td></td>
</tr>
<tr>
<td>Age, age squared</td>
<td></td><td></td><td>x</td><td>x</td>
</tr>
<tr>
<td>9 Year of birth dummies</td>
<td>x</td><td>x</td><td>x</td><td>x</td>
</tr>
<tr><td></td><td></td><td></td><td></td><td></td></tr>
<tr>
<td>Excluded instruments</td>
<td></td><td></td><td></td><td></td>
</tr>
<tr>
<td>Quarter-of-birth dummies</td>
<td></td><td>x</td><td></td><td>x</td>
</tr>
<tr>
<td>Quarter of birth * year of birth</td>
<td></td><td>x</td><td></td><td>x</td>
</tr>
<tr>
<td>Quarter of birth * state of birth</td>
<td></td><td>x</td><td></td><td>x</td>
</tr>
<tr>
<td>Number of Excluded instruments</td>
<td></td><td>180</td><td></td><td>178</td>
</tr>
</tbody>
</table>
"""
return table
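Each builder consumes a statsmodels-style results object through its `params`, `bse`, `fvalue`, and `f_pvalue` attributes. The snippet below shows the pattern with `create_table_row_qob` copied verbatim for a self-contained demo, plus a `SimpleNamespace` stand-in for a fitted model (the coefficients are invented purely for illustration):

```python
from types import SimpleNamespace


def create_table_row_qob(outcome_variable, cohort, mean, ols):
    # Copied from above so the demo runs on its own.
    table_row = f"""
    <tr>
        <td>{outcome_variable}</td>
        <td>{cohort}</td>
        <td>{mean:5.2f}</td>
        <td>{ols.params['DUMMY_QOB_1']:6.3f}</td>
        <td>{ols.params['DUMMY_QOB_2']:6.3f}</td>
        <td>{ols.params['DUMMY_QOB_3']:6.3f}</td>
        <td>{ols.fvalue:6.1f}</td>
    </tr>
    <tr>
        <td></td><td></td><td></td>
        <td>({ols.bse['DUMMY_QOB_1']:5.3f})</td>
        <td>({ols.bse['DUMMY_QOB_2']:5.3f})</td>
        <td>({ols.bse['DUMMY_QOB_3']:5.3f})</td>
        <td>[{ols.f_pvalue:6.4f}]</td>
    </tr>
    """
    return table_row


# Hypothetical regression result exposing only the attributes the builder reads.
fake_ols = SimpleNamespace(
    params={"DUMMY_QOB_1": -0.124, "DUMMY_QOB_2": -0.086, "DUMMY_QOB_3": -0.015},
    bse={"DUMMY_QOB_1": 0.017, "DUMMY_QOB_2": 0.017, "DUMMY_QOB_3": 0.016},
    fvalue=24.9,
    f_pvalue=0.0001,
)

row = create_table_row_qob("Total years of education", "1930-1939", 12.79, fake_ols)
```

Because only attribute access is used, any object with matching `params`/`bse`/`fvalue`/`f_pvalue` attributes works — which is also why a real statsmodels `RegressionResults` slots in directly.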
f4fe7b1cf3c5e32d24b98aea60de084f39d352ff | 83 | py | Python | planga/__init__.py | ResiliaDev/planga-python | a70438c3b9b1e2749ca72eb93614cf9d1d78a610 | [
"MIT"
] | 1 | 2018-10-15T17:05:42.000Z | 2018-10-15T17:05:42.000Z | planga/__init__.py | ResiliaDev/planga-python | a70438c3b9b1e2749ca72eb93614cf9d1d78a610 | [
"MIT"
] | 1 | 2018-10-15T14:36:02.000Z | 2018-10-15T15:20:37.000Z | planga/__init__.py | ResiliaDev/planga-python | a70438c3b9b1e2749ca72eb93614cf9d1d78a610 | [
"MIT"
] | null | null | null |

name = "planga"
from .planga import Planga
from .planga import PlangaConfiguration
5201bd1fcc554c866e99809329bae8c08f5fa41d | 3,310 | py | Python | wildlifelicensing/apps/customer_management/tests.py | jawaidm/wildlifelicensing | 87e8e9ab163e0d7bbb0c7a654a13ce8a4d8fcf82 | [
"Apache-2.0"
] | null | null | null | wildlifelicensing/apps/customer_management/tests.py | jawaidm/wildlifelicensing | 87e8e9ab163e0d7bbb0c7a654a13ce8a4d8fcf82 | [
"Apache-2.0"
] | 11 | 2019-03-19T02:03:11.000Z | 2019-05-31T07:20:59.000Z | wildlifelicensing/apps/customer_management/tests.py | jawaidm/wildlifelicensing | 87e8e9ab163e0d7bbb0c7a654a13ce8a4d8fcf82 | [
"Apache-2.0"
] | 2 | 2020-08-10T10:17:10.000Z | 2021-10-31T23:20:53.000Z | from django.shortcuts import reverse
from wildlifelicensing.apps.main.tests import helpers as helpers
class LookupViewTest(helpers.BasePermissionViewTestCase):
view_url = reverse('wl_customer_management:customer_lookup')
@property
def permissions(self):
return {
'get': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
},
}
class CustomerEditDetailsViewTest(helpers.BasePermissionViewTestCase):
def setUp(self):
self.view_url = reverse('wl_customer_management:edit_customer_details',
args=[self.customer.pk])
@property
def permissions(self):
return {
'get': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
},
'post': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
}
}
class CustomerEditProfileViewTest(helpers.BasePermissionViewTestCase):
def setUp(self):
self.view_url = reverse('wl_customer_management:edit_customer_profile',
args=[self.customer.pk])
@property
def permissions(self):
return {
'get': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
},
'post': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
}
}
class CustomerApplicationViewTest(helpers.BasePermissionViewTestCase):
def setUp(self):
self.view_url = reverse('wl_customer_management:data_applications',
args=[self.customer.pk])
@property
def permissions(self):
return {
'get': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
},
'post': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
}
}
class CustomerLicenceViewTest(helpers.BasePermissionViewTestCase):
def setUp(self):
self.view_url = reverse('wl_customer_management:data_licences',
args=[self.customer.pk])
@property
def permissions(self):
return {
'get': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
},
'post': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
}
}
class CustomerReturnViewTest(helpers.BasePermissionViewTestCase):
def setUp(self):
self.view_url = reverse('wl_customer_management:data_returns',
args=[self.customer.pk])
@property
def permissions(self):
return {
'get': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
},
'post': {
'allowed': [self.officer],
'forbidden': [self.customer, self.assessor],
}
        }
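Every test class above (except the `get`-only `LookupViewTest`) declares the same officer-only `permissions` dict. One way to remove that repetition — a sketch only; the hypothetical `OfficerOnlyPermissionsMixin` is not part of the original code — is to hoist the dict into a mixin:

```python
class OfficerOnlyPermissionsMixin:
    """Hypothetical mixin: officers may GET and POST, customers and assessors may not."""

    @property
    def permissions(self):
        rule = {
            'allowed': [self.officer],
            'forbidden': [self.customer, self.assessor],
        }
        return {'get': dict(rule), 'post': dict(rule)}


# Stand-in role attributes demonstrate the mixin without the Django test harness.
class DemoViewTest(OfficerOnlyPermissionsMixin):
    officer, customer, assessor = "officer", "customer", "assessor"
```

Each view test class would then only set `view_url` in `setUp`, while `LookupViewTest`, whose `get`-only dict differs, would keep its own property.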
def func(a, b):
    return a + b
# Test the DEC instructions (DEC, DEX, DEY)
import pytest

from bitey.computer.computer import Computer


def build_computer():
    computer = None
    with open("chip/6502.json") as f:
        chip_data = f.read()
        computer = Computer.build_from_json(chip_data)
        return computer
    return None


# module scope means run once per test module
@pytest.fixture(scope="module")
def setup():
    computer = build_computer()
    yield computer


def test_dex(setup):
    "Test DEX that doesn't wrap"
    computer = setup
    computer.reset()
    computer.cpu.registers["X"].set(0x02)
    # The DEX instruction
    computer.memory.write(0x00, 0xCA)
    computer.memory.write(0x01, 0xCA)
    computer.cpu.registers["PC"].set(0x00)
    computer.cpu.get_next_instruction(computer.memory)
    computer.cpu.execute_instruction(computer.memory)
    assert computer.cpu.registers["X"].get() == 0x01
    assert computer.cpu.flags["Z"].status is False


def test_dex_zero_flag_set(setup):
    "Test DEX that setting to zero sets the Z flag"
    computer = setup
    computer.reset()
    computer.cpu.registers["X"].set(0x01)
    # The DEX instruction
    computer.memory.write(0x00, 0xCA)
    computer.memory.write(0x01, 0xCA)
    computer.cpu.registers["PC"].set(0x00)
    computer.cpu.get_next_instruction(computer.memory)
    computer.cpu.execute_instruction(computer.memory)
    assert computer.cpu.registers["X"].get() == 0x00
    assert computer.cpu.flags["Z"].status is True


def test_dex_wrap_zero_flag_not_set(setup):
    "Test DEX that wrapping does not set the Z flag"
    computer = setup
    computer.reset()
    computer.cpu.registers["X"].set(0x00)
    # The DEX instruction
    computer.memory.write(0x00, 0xCA)
    computer.memory.write(0x01, 0xCA)
    computer.cpu.registers["PC"].set(0x00)
    computer.cpu.get_next_instruction(computer.memory)
    computer.cpu.execute_instruction(computer.memory)
    assert computer.cpu.registers["X"].get() == 0xFF
    assert computer.cpu.flags["Z"].status is False


def test_dey(setup):
    "Test DEY that doesn't wrap"
    computer = setup
    computer.reset()
    computer.cpu.registers["Y"].set(0x02)
    # The DEY instruction
    computer.memory.write(0x00, 0x88)
    computer.memory.write(0x01, 0x88)
    computer.cpu.registers["PC"].set(0x00)
    computer.cpu.get_next_instruction(computer.memory)
    computer.cpu.execute_instruction(computer.memory)
    assert computer.cpu.registers["Y"].get() == 0x01
    assert computer.cpu.flags["Z"].status is False


def test_dey_zero_flag_set(setup):
    "Test DEY that setting to zero sets the Z flag"
    computer = setup
    computer.reset()
    computer.cpu.registers["Y"].set(0x01)
    # The DEY instruction
    computer.memory.write(0x00, 0x88)
    computer.memory.write(0x01, 0x88)
    computer.cpu.registers["PC"].set(0x00)
    computer.cpu.get_next_instruction(computer.memory)
    computer.cpu.execute_instruction(computer.memory)
    assert computer.cpu.registers["Y"].get() == 0x00
    assert computer.cpu.flags["Z"].status is True


def test_dey_wraps_zero_flag_not_set(setup):
    "Test DEY that wrapping doesn't set the Z flag"
    computer = setup
    computer.reset()
    computer.cpu.registers["Y"].set(0x00)
    # The DEY instruction
    computer.memory.write(0x00, 0x88)
    computer.memory.write(0x01, 0x88)
    computer.cpu.registers["PC"].set(0x00)
    computer.cpu.get_next_instruction(computer.memory)
    computer.cpu.execute_instruction(computer.memory)
    assert computer.cpu.registers["Y"].get() == 0xFF
    assert computer.cpu.flags["Z"].status is False


def test_dec_zeropage(setup):
    "Test DEC that doesn't set the zero flag"
    computer = setup
    computer.reset()
    # The DEC instruction
    computer.memory.write(0x00, 0xC6)
    # The zero page address to decrement
    computer.memory.write(0x01, 0x10)
    # The value to decrement
    computer.memory.write(0x10, 0x02)
    computer.cpu.registers["PC"].set(0x00)
    computer.cpu.get_next_instruction(computer.memory)
    computer.cpu.execute_instruction(computer.memory)
    assert computer.memory.read(0x10) == 0x01
    assert computer.cpu.flags["Z"].status is False


def test_dec_zeropage_zero_flag_set(setup):
    "Test DEC that decrementing to zero sets the Z flag"
    computer = setup
    computer.reset()
    # The DEC instruction
    computer.memory.write(0x00, 0xC6)
    # The zero page address to decrement
    computer.memory.write(0x01, 0x10)
    # The value to decrement
    computer.memory.write(0x10, 0x01)
    computer.cpu.registers["PC"].set(0x00)
    computer.cpu.get_next_instruction(computer.memory)
    computer.cpu.execute_instruction(computer.memory)
    assert computer.memory.read(0x10) == 0x00
    assert computer.cpu.flags["Z"].status is True
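The DEX/DEY/DEC tests above all rely on the same 6502 rule: an 8-bit decrement wraps (0x00 - 1 yields 0xFF), and only a result of exactly zero sets the Z flag. A register-free sketch of that rule (the masking here is standard 6502 behaviour, not taken from bitey's internals):

```python
def dec8(value):
    """Decrement an 8-bit value with wraparound; return (result, z_flag)."""
    result = (value - 1) & 0xFF  # wrap within one byte
    z_flag = result == 0x00      # Z is set only on an exact zero result
    return result, z_flag

print(dec8(0x02))  # (1, False)
print(dec8(0x01))  # (0, True)
print(dec8(0x00))  # (255, False) -- the wrap case, Z stays clear
```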
# This script runs the Bucket script,
# calls other functions and manages all the stuff.
# Python libraries
from importlib import import_module as require
# Error message
from ErrorMessage import Error
# Comments
from DellComments import Remove
# Multiclasses
from ImportLibStf import GetTasks, SupervisorDad
# Variable functions
from IntFunctions import *
from StrFunctions import *
from BolFunctions import *
from FltFunctions import *
from LstFunctions import *
from LoopsBreaker import *
# Logic door
from LogicDoorLib import LogicBlk

# Declared variables
actI = {"": "int"}
actS = {"": "str"}
actB = {"": "bol"}
actF = {"": "flt"}
actL = {"": "lst"}
# Security variables
safeI = {"": "int"}
safeS = {"": "str"}
safeB = {"": "bol"}
safeF = {"": "flt"}
safeL = {"": "lst"}
# Task data
actT = {"": "tsk"}
# Libraries added
Libs = {"": "lib"}
# If-Maybe-Else system
ifBk = {"status": False, "block": "", "inside": False}
def Runner(lines, path, name):
    # In a class
    onClss = False
    # In a function
    onFunc = False
    # In a loop
    onLoop = False
    # In a task, but not running it
    onTask = False
    # Fix-Lines -----------------------------------------------------------------#
    # Remove comments
    for l in range(0, len(lines)):
        if "--" in lines[l]: lines[l] = Remove(lines[l])
    # Remove strange chars
    lines = [l.replace("\n", "") for l in lines]
    lines = [l.replace("\t", "") for l in lines]
    lines = [l.replace("    ", "") for l in lines]  # drop space indentation (tabs handled above)
    lines = [l.strip() for l in lines]
    # Non usable lines
    lines = [l for l in lines if l != ""]
    # To-Lib --------------------------------------------------------------------#
    # Other libraries
    for i in range(0, len(lines)):
        # In class
        if lines[i].startswith("#"):
            if lines[i] == "#" + name + " in Bucket:": onClss = True
        # Math functions
        if lines[i] == "[to MathF]" and onClss == False: Libs.update({"mathF": require("toMathFLib")})
        elif lines[i] == "[to MathF]": Error("[Compiler Error] Lib called after class declaration.", "lin", lines[i])
        # Add your library's module [0] below
        # ----------Here----------#
    onClss = False
    # Multiclass ----------------------------------------------------------------#
    for l in range(0, len(lines) - 1):
        # Import lines here
        if lines[l].startswith("with ") and onFunc == False:
            clin = lines[l].replace("with ", "")
            word = clin.split()
            # with name[0] in[1] Sandbox.[2]
            # Missed keyword
            try:
                if word[1] != "in": Error("[Syntax Error] Missed \"in\" keyword.", "lin", lines[l])
                if word[2] != "Sandbox.": Error("[Syntax Error] Missed \"Sandbox.\" keyword.", "lin", lines[l])
            except:
                Error("[Syntax Error] Missed keywords.", "lin", lines[l])
            # Get script
            try:
                # Get current path
                if path != name: path = path.replace(name + ".bk", "")
                else: path = ""
                fileD = open(path + word[0].replace("#", "") + ".bk")
                lData = fileD.readlines()
                fileD.close()
            # No file with this name
            except FileNotFoundError:
                Error("Bucket cannot import a nonexistent file. Check the name and try again.", "sys")
            # Import tasks from the other script
            lines = lines[:l] + GetTasks(lData, word[0]) + lines[l + 1:]
        # Interface
        if lines[l].startswith("dad ") and onClss == False:
            # Running *in* a dad class
            if lines[l].endswith(":"): Error("Interfaces cannot be executed.", "sys")
            clin = lines[l].replace("dad ", "")
            dadN = clin.replace(".", "")
            # Get script
            try:
                # Get current path
                if path != name: path = path.replace(name + ".bk", "")
                else: path = ""
                fileD = open(path + dadN.replace("#", "") + ".bk")
                lData = fileD.readlines()
                fileD.close()
            # No file with this name
            except FileNotFoundError:
                Error("Dad file not found. Check the name and try again.", "sys")
            # Get all requirements
            needL = SupervisorDad(lData, dadN)
            # Check that each has been met
            for i in range(0, len(needL)):
                obligation = needL[i]
                result = any(line.startswith(obligation) for line in lines)
                if result == False: Error("[Dad error] your dad forces you to have a [" + obligation + "] line.", "sys")
    # Task-Stuff ----------------------------------------------------------------#
    taskKey = {"s": 0, "e": 0, "n": "", "v": ""}
    # Find tasks
    for l in range(0, len(lines)):
        # Start of a task block
        if "task:" in lines[l]:
            word = lines[l].split()
            # name[0] task:[1] arg[2]
            # Get name and line
            taskKey["s"] = l
            taskKey["n"] = word[0]
            # Get argument name
            if len(word) > 2: taskKey["v"] = word[2]
        # End of a task block
        if lines[l] == "end t.":
            # Set the last line
            taskKey["e"] = l
            actT.update({taskKey["n"]: taskKey})
            # Reset params
            taskKey = {"s": 0, "e": 0, "n": "", "v": ""}
    # ---------------------------------------------------------------------------#
    # Loop index
    loop = {"q": "", "e": ""}
    # Run the lines
    for l in range(0, len(lines)):
        # Editable line
        clin = lines[l]
        # Skip task lines
        if len(actT) > 1:
            for task in actT:
                if task == "": continue
                if l == actT[task]["s"]: onTask = True
                if l == actT[task]["e"]: onTask = False
        # Class
        if lines[l].startswith("#"):
            if lines[l] == "#" + name + " in Bucket:": onClss = True
            else: Error("[Syntax Error] The class name must be the same as the file.", "lin", lines[l])
        if lines[l] == "close.": onClss = False
        # Main function
        if lines[l] == "bucket open:":
            if onClss == True: onFunc = True
            else: Error("[Syntax Error] Main out of class.", "lin", lines[l])
        if lines[l] == "end m.": onFunc = False
        # Task
        if lines[l].endswith("task:"):
            if onClss == True: onFunc = True
            else: Error("[Syntax Error] Task out of class.", "lin", lines[l])
        if lines[l] == "end t.": onFunc = False
        # End loop
        if lines[l] == "end l.": onLoop = False
        # In Main or in a Task
        if onFunc == True and (onLoop == False and onTask == False):
            # To-Lib --------------------------------------------------------------#
            # Math functions
            if "mathF" in Libs: clin = Libs["mathF"].Main(lines[l], actI, actF)
            # Add your library's module [1] below
            # ----------Here----------#
            # If-Maybe-Else ---------------------------------------------------------#
            # If block
            if clin.startswith("if "):
                word = clin.split()
                # Missed keyword
                if word[len(word) - 1] == "do:": clin = clin.replace(" do:", "")
                else: Error("[Syntax Error] Missed \"do:\" keyword.", "lin", lines[l])
                # Rechange
                scentence = clin
                # For two args
                outOne = ""
                outTwo = ""
                # Keyword index
                index = ""
                joint = ""
                # Set the current logic block
                ifBk["block"] = "if"
                # Say we are in a sentence
                ifBk["inside"] = True
                # And connector
                if " and " in clin:
                    # Store the connective
                    joint = "and"
                    # Break args
                    index = clin.find(" and ")
                    clin = scentence[:index]
                # Or connector
                if " or " in clin:
                    # Store the connective
                    joint = "or"
                    # Break args
                    index = clin.find(" or ")
                    clin = scentence[:index]
                # Break
                clin = clin.replace("if ", "")
                args = clin.split()
                # if var[0] cond[1] val[2] do:
                # An int statement
                if args[0] in actI: outOne = LogicBlk(clin, lines[l], actI)
                # A str statement
                elif args[0] in actS: outOne = LogicBlk(clin, lines[l], actS)
                # A bol statement
                elif args[0] in actB: outOne = LogicBlk(clin, lines[l], actB)
                # A flt statement
                elif args[0] in actF: outOne = LogicBlk(clin, lines[l], actF)
                # Unknown var
                else: Error("This is not a declared variable.", "lin", lines[l])
                # Last arg
                if index != "":
                    # Get member
                    clin = scentence[index + len(joint) + 2:]
                    # Break
                    args = clin.split()
                    # An int statement
                    if args[0] in actI: outTwo = LogicBlk(clin, lines[l], actI)
                    # A str statement
                    elif args[0] in actS: outTwo = LogicBlk(clin, lines[l], actS)
                    # A bol statement
                    elif args[0] in actB: outTwo = LogicBlk(clin, lines[l], actB)
                    # A flt statement
                    elif args[0] in actF: outTwo = LogicBlk(clin, lines[l], actF)
                    # Both need to be true
                    if joint == "and":
                        if outOne and outTwo: ifBk["status"] = True
                        else: ifBk["status"] = False
                    # One of them needs to be true
                    if joint == "or":
                        if outOne or outTwo: ifBk["status"] = True
                        else: ifBk["status"] = False
                # Just one sentence
                else: ifBk["status"] = outOne
            # Maybe block
            if clin.startswith("maybe ") and ifBk["status"] == False:
                word = clin.split()
                # Missed keyword
                if word[len(word) - 1] == "do:": clin = clin.replace(" do:", "")
                else: Error("[Syntax Error] Missed \"do:\" keyword.", "lin", lines[l])
                # Rechange
                scentence = clin
                # For two args
                outOne = ""
                outTwo = ""
                # Keyword index
                index = ""
                joint = ""
                # Set the current logic block
                ifBk["block"] = "maybe"
                # Say we are in a sentence
                ifBk["inside"] = True
                # And connector
                if " and " in clin:
                    # Store the connective
                    joint = "and"
                    # Break args
                    index = clin.find(" and ")
                    clin = scentence[:index]
                # Or connector
                if " or " in clin:
                    # Store the connective
                    joint = "or"
                    # Break args
                    index = clin.find(" or ")
                    clin = scentence[:index]
                # Break
                clin = clin.replace("maybe ", "")
                args = clin.split()
                # maybe var[0] cond[1] val[2] do:
                # An int statement
                if args[0] in actI: outOne = LogicBlk(clin, lines[l], actI)
                # A str statement
                elif args[0] in actS: outOne = LogicBlk(clin, lines[l], actS)
                # A bol statement
                elif args[0] in actB: outOne = LogicBlk(clin, lines[l], actB)
                # A flt statement
                elif args[0] in actF: outOne = LogicBlk(clin, lines[l], actF)
                # Unknown var
                else: Error("This is not a declared variable.", "lin", lines[l])
                # Last arg
                if index != "":
                    # Get member
                    clin = scentence[index + len(joint) + 2:]
                    # Break
                    args = clin.split()
                    # An int statement
                    if args[0] in actI: outTwo = LogicBlk(clin, lines[l], actI)
                    # A str statement
                    elif args[0] in actS: outTwo = LogicBlk(clin, lines[l], actS)
                    # A bol statement
                    elif args[0] in actB: outTwo = LogicBlk(clin, lines[l], actB)
                    # A flt statement
                    elif args[0] in actF: outTwo = LogicBlk(clin, lines[l], actF)
                    # Both need to be true
                    if joint == "and":
                        if outOne and outTwo: ifBk["status"] = True
                        else: ifBk["status"] = False
                    # One of them needs to be true
                    if joint == "or":
                        if outOne or outTwo: ifBk["status"] = True
                        else: ifBk["status"] = False
                # Just one sentence
                else: ifBk["status"] = outOne
            # Else block
            if clin.startswith("else ") and ifBk["status"] == False:
                # Set the current logic block
                ifBk["block"] = "else"
                # Say we are in a sentence
                ifBk["inside"] = True
            # Out of the sentence
            if clin == "end.": ifBk["inside"] = False
            # Enabled system
            if clin.startswith("> ") and ifBk["inside"] == True:
                working = False
                # True door
                if ifBk["block"] == "if" or ifBk["block"] == "maybe":
                    if ifBk["status"] == True: working = True
                # False door
                elif ifBk["block"] == "else" and ifBk["status"] == False: working = True
                # Actions
                if working == True: clin = clin.replace("> ", "")
                # Error
                if clin.startswith("if"): Error("[Syntax Error] Bucket does not support ifs inside ifs.", "lin", lines[l])
            # -----------------------------------------------------------------------#
            # Declare int
            if clin.startswith("int "): actI.update(NewInt(clin, actI))
            # Declare str
            if clin.startswith("str "): actS.update(NewStr(clin, actS))
            # Declare bol
            if clin.startswith("bol "): actB.update(NewBol(clin, actB))
            # Declare flt
            if clin.startswith("flt "): actF.update(NewFlt(clin, actF))
            # Declare lst
            if clin.startswith("lst "): actL.update(NewLst(clin, actL))
            # Set
            if clin.startswith("set "):
                # Remove keyword (use "word", not "name", to avoid shadowing the Runner parameter)
                word = clin.replace(clin[0:4], "")
                word = word.split()
                # set name[0] to[1] vall[2] index[3]
                # A list value
                if word[2] in actL:
                    # Update index
                    if word[3] in actI: word[3] = str(actI[word[3]])
                    if word[3] in actS: word[3] = str(actS[word[3]])
                    if word[3] in actB: word[3] = str(actB[word[3]])
                    if word[3] in actF: word[3] = str(actF[word[3]])
                    # Fix bol definition
                    if word[3] == "True": word[3] = "yes"
                    if word[3] == "False": word[3] = "not"
                    # Key error
                    if not word[3] in actL[word[2]]: Error("[Compiler Error] Index out of range.", "lin", lines[l])
                    # Update an int
                    if word[0] in actI: actI.update({word[0]: actL[word[2]][word[3]]})
                    # Update a str
                    elif word[0] in actS: actS.update({word[0]: actL[word[2]][word[3]]})
                    # Update a bol
                    elif word[0] in actB: actB.update({word[0]: actL[word[2]][word[3]]})
                    # Update a flt
                    elif word[0] in actF: actF.update({word[0]: actL[word[2]][word[3]]})
                    # Unknown variable
                    else: Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
                # Default value
                else:
                    # Update an int
                    if word[0] in actI: actI.update(SetInt(clin, actI))
                    # Update a str
                    elif word[0] in actS: actS.update(SetStr(clin, actS))
                    # Update a bol
                    elif word[0] in actB: actB.update(SetBol(clin, actB))
                    # Update a flt
                    elif word[0] in actF: actF.update(SetFlt(clin, actF))
                    # Unknown variable
                    else: Error("This is not a declared variable.", "lin", lines[l])
            # Make
            if clin.startswith("make "):
                # Remove keyword
                clin = clin.replace(clin[0:5], "")
                args = clin.split()
                # make var[0] ~[1] val[2]
                # Change an int
                if args[0] in actI: actI.update(Make(clin, lines[l], actI))
                # Change a str
                elif args[0] in actS: actS.update(Make(clin, lines[l], actS))
                # Change a flt
                elif args[0] in actF: actF.update(Make(clin, lines[l], actF))
                # bol or something else
                else: Error("[Compiler Error] This variable is probably a bol. Bucket cannot change this.", "lin", lines[l])
            # Rise
            if clin.startswith("rise "):
                # Remove keyword
                vall = clin.replace(clin[0:5], "")
                indx = vall.find(" in ")
                trgt = vall[:indx]
                # Cannot rise strings
                if trgt in actS: Error("[Compiler Error] A str cannot be summed with a number.", "lin", lines[l])
                # Cannot rise booleans
                elif trgt in actB: Error("[Compiler Error] A bol cannot be summed with a number.", "lin", lines[l])
                # An int
                elif trgt in actI: actI.update(Rise(clin, actI))
                # A flt
                elif trgt in actF: actF.update(Rise(clin, actF))
                else: Error("[Compiler Error] This variable does not exist.", "lin", lines[l])
            # Down
            if clin.startswith("down "):
                # Remove keyword
                clin = clin.replace(clin[0:5], "")
                index = clin.find(" in ")
                trgt = clin[:index]
                # Cannot make down strings
                if trgt in actS: Error("[Compiler Error] A str cannot be decreased by a number.", "lin", lines[l])
                # Cannot make down booleans
                elif trgt in actB: Error("[Compiler Error] A bol cannot be decreased by a number.", "lin", lines[l])
                # An int
                elif trgt in actI: actI.update(Down(clin, actI))
                # A flt
                elif trgt in actF: actF.update(Down(clin, actF))
                else: Error("[Compiler Error] This variable does not exist.", "lin", lines[l])
            # Round
            if clin.startswith("round "):
                # Remove keyword
                clin = clin.replace(clin[0:6], "")
                # Round the value if it exists
                value = RoundT(clin, lines[l], actF)
                # This flt has the same name as an int
                if clin in actI: Error("[Compiler Error] You already have an int with the same name as your flt.", "lin", lines[l])
                # All right
                else:
                    actI.update({clin: value})
                    del actF[clin]
            # Convert
            if clin.startswith("convert "):
                # Remove keyword
                clin = clin.replace(clin[0:8], "")
                # To int
                if clin.endswith(" to int"):
                    # Remove keyword
                    clin = clin.replace(" to int", "")
                    # A string
                    if clin in actS:
                        actI.update({clin: CToInt(actS[clin], lines[l])})
                        del actS[clin]
                    # A boolean
                    if clin in actB:
                        actI.update({clin: CToInt(actB[clin], lines[l])})
                        del actB[clin]
                    # A float
                    if clin in actF:
                        actI.update({clin: CToInt(actF[clin], lines[l])})
                        del actF[clin]
                # To str
                if clin.endswith(" to str"):
                    # Remove keyword
                    clin = clin.replace(" to str", "")
                    # An integer
                    if clin in actI:
                        actS.update({clin: CToStr(actI[clin])})
                        del actI[clin]
                    # A boolean
                    elif clin in actB:
                        actS.update({clin: CToStr(actB[clin])})
                        del actB[clin]
                    # A float
                    elif clin in actF:
                        actS.update({clin: CToStr(actF[clin])})
                        del actF[clin]
                    else: Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
                # To bol
                if clin.endswith(" to bol"):
                    # Remove keyword
                    clin = clin.replace(" to bol", "")
                    # An integer
                    if clin in actI:
                        actB.update({clin: CToBol(actI[clin], lines[l])})
                        del actI[clin]
                    # A string
                    if clin in actS:
                        actB.update({clin: CToBol(actS[clin], lines[l])})
                        del actS[clin]
                    # A float
                    elif clin in actF:
                        actB.update({clin: CToBol(actF[clin], lines[l])})
                        del actF[clin]
                    else: Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
                # To flt
                if clin.endswith(" to flt"):
                    # Remove keyword
                    clin = clin.replace(" to flt", "")
                    # An integer
                    if clin in actI:
                        actF.update({clin: CToFlt(actI[clin], lines[l])})
                        del actI[clin]
                    # A string
                    if clin in actS:
                        actF.update({clin: CToFlt(actS[clin], lines[l])})
                        del actS[clin]
                    # A boolean
                    elif clin in actB: Error("[Compiler Error] Bucket does not understand booleans as floats.", "lin", lines[l])
            # Basic BIn/BOut
            if clin.startswith("show "):
                # Remove keyword
                clin = clin.replace(clin[0:5], "")
                # Special codes
                if "\\s" in clin: clin = clin.replace("\\s", " ")
                if "\\q" in clin: clin = clin.replace("\\q", "\"")
                if "\\n" in clin: clin = clin.replace("\\n", "\n")
                if "\\t" in clin: clin = clin.replace("\\t", "\t")
                # With arg
                if "with " in clin:
                    # Get the keyword index
                    index = clin.index(" with ")
                    strng = clin[:index]
                    argmt = clin[index + len(" with "):]
                    # Fix text value
                    if strng in actS: strng = actS[strng]
                    elif strng.startswith("'") and strng.endswith("'"): strng = strng.replace("'", "")
                    else: Error("[Compiler Error] Cannot print a non string.", "lin", lines[l])
                    # Replace by an int
                    if "[i]" in strng and IsAInt(argmt, actI) == True:
                        # Number
                        try:
                            int(argmt)
                            print(strng.replace("[i]", argmt))
                        # Variable
                        except ValueError:
                            print(strng.replace("[i]", str(actI[argmt])))
                    # Replace by a str
                    elif "[s]" in strng and IsAStr(argmt, actS) == True:
                        # Text
                        if argmt.startswith("'"):
                            # Empty
                            if argmt == "''": argmt = ""
                            # Remove the apostrophes
                            else: argmt = argmt[1:len(argmt) - 1]
                            print(strng.replace("[s]", argmt))
                        # Variable
                        else: print(strng.replace("[s]", actS[argmt]))
                    # Replace by a bol
                    elif "[b]" in strng and IsABol(argmt, actB) == True:
                        # Value
                        if argmt == "yes": print(strng.replace("[b]", "yes"))
                        elif argmt == "not": print(strng.replace("[b]", "not"))
                        # Variable
                        else:
                            if actB[argmt] == True: print(strng.replace("[b]", "yes"))
                            if actB[argmt] == False: print(strng.replace("[b]", "not"))
                    # Replace by a flt
                    elif "[f]" in strng and IsAFlt(argmt, actF) == True:
                        # Value
                        if argmt.endswith(".f"): argmt = argmt.replace(".f", "")
                        # Number
                        try:
                            float(argmt)
                            print(strng.replace("[f]", argmt))
                        # Variable
                        except ValueError:
                            print(strng.replace("[f]", str(actF[argmt])))
                    # Replace by a lst
                    elif "[l]" in strng and IsALst(argmt, actL) == True:
                        # Variable
                        if argmt in actL:
                            # Get the list from actL
                            argmt = actL[argmt]
                            # Initial visible list
                            toShow = "["
                            # Get just the values
                            for item in argmt:
                                if "'" in item: item = item.replace("'", "")
                                if item == "": continue
                                elif item == max(argmt.keys()): toShow = toShow + str(argmt[item]) + "]"
                                else: toShow = toShow + str(argmt[item]) + ", "
                            print(strng.replace("[l]", toShow))
                        # Value
                        else: print(strng.replace("[l]", argmt))
                    # Without keyword
                    else: Error("[Compiler Error] Situation keyword not found.", "lin", lines[l])
                # Is a str value
                elif clin.startswith("'") and clin.endswith("'"):
                    # Empty
                    if clin == "''": clin = ""
                    # Remove the apostrophes
                    else: clin = clin[1:len(clin) - 1]
                    print(clin)
                # Is another str
                elif clin in actS: print(actS[clin])
                # Input
                elif clin.startswith("sand"): print(Sand(clin, lines[l], actS))
                # Error
                else: Error("[Compiler Error] Bucket cannot show this.", "lin", lines[l])
            # List functions
            # fin
            if clin.startswith("fin "):
                # Remove keyword
                clin = clin.replace(clin[0:4], "")
                # Get list
                word = clin.split()
                # fin item[0] in[1] list[2] to[3] item[4]
                # Needs an int to get a str index
                if word[4] in actI and IsAStr(word[2], actS): actI.update({word[4]: Fin(clin, lines[l], actS, actL)})
                # Needs a str to get a lst index
                elif word[4] in actS and IsALst(word[2], actL): actS.update({word[4]: Fin(clin, lines[l], actS, actL)})
                # Something else
                else: Error("[Compiler Error] Type error or unknown variables.", "lin", lines[l])
            # add
            if clin.startswith("add "):
                # Remove keyword
                clin = clin.replace(clin[0:4], "")
                # Get list
                word = clin.split()
                # add item[0] on[1] list[2] as[3] value[4]
                if word[2] in actL: actL[word[2]].update(Add(clin, lines[l], actL))
                else: Error("[Compiler Error] This list does not exist.", "lin", lines[l])
            # del
            if clin.startswith("del "):
                # Remove keyword
                clin = clin.replace(clin[0:4], "")
                # Get list
                word = clin.split()
                # del item[0] of[1] list[2]
                if word[2] in actL: actL.update(Del(clin, lines[l], actL))
                else: Error("[Compiler Error] This list does not exist.", "lin", lines[l])
            # siz
            if clin.startswith("siz "):
                # Remove keyword
                clin = clin.replace(clin[0:4], "")
                # For the index
                ledr = clin
                ledr = ledr.replace("of ", "")
                # siz of list to var
                index = ledr.find(" to ")
                aList = ledr[:index]
                vName = ledr[index + 4:]
                # This list exists
                if aList in actL:
                    if vName in actI: actI.update(Siz(clin, lines[l], actL[aList]))
                    else: Error("[Compiler Error] Siz returns an int, so it needs an int to assign to.", "lin", lines[l])
                elif aList in actS:
                    if vName in actI: actI.update(Siz(clin, lines[l], actS[aList]))
                    else: Error("[Compiler Error] Siz returns an int, so it needs an int to assign to.", "lin", lines[l])
                else: Error("[Compiler Error] This list does not exist.", "lin", lines[l])
            # brk
            if clin.startswith("brk "):
                clin = clin.replace(clin[0:4], "")
                word = clin.split()
                # brk string[0] in[1] word[2] to[3] var[4]
                # Unknown vars
                if not word[0] in actS: Error("[Compiler Error] This string does not exist.", "lin", lines[l])
                if not word[4] in actL: Error("[Compiler Error] Cannot assign to a nonexistent list.", "lin", lines[l])
                # The list exists
                else: actL.update(Brk(clin, lines[l], actS))
            # Loops
            # For loop
            if clin.startswith("for "):
                # for x times if y ~ z do:
                loop = NewForLoop(l, lines, actI)
                # Returns:
                # max count|question|variable|start/end line
                # No return
                if loop == None: Error("[Syntax Error] No loop ender.", "lin", lines[l])
                # Variable type
                # Check an int
                if loop["v"] in actI: loop["v"] = actI
                # Check a str
                elif loop["v"] in actS: loop["v"] = actS
                # Check a bol
                elif loop["v"] in actB: loop["v"] = actB
                # Check a flt
                elif loop["v"] in actF: loop["v"] = actF
                # Nonexistent
                else: Error("[Compiler Error] This is not a variable to check.", "lin", lines[l])
                # For count
                for c in range(0, loop["m"]):
                    # For every line
                    for i in range(loop["s"], loop["e"]):
                        # Question
                        quest = LogicBlk(loop["q"], lines[l], loop["v"])
                        # If True then ...
                        if quest == True:
                            # Run another Runner
                            LittleRun(i, lines)
                            # Update var
                            if loop["v"][""] == "int": loop["v"] = actI
                            if loop["v"][""] == "str": loop["v"] = actS
                            if loop["v"][""] == "bol": loop["v"] = actB
                            if loop["v"][""] == "flt": loop["v"] = actF
            # Every loop
            if clin.startswith("every "):
                # every item in list do:
                loop = NewEveryLoop(l, lines, actL)
                # Returns:
                # var name| var list| start l| end line|
                # No return
                if loop == None: Error("[Syntax Error] No loop ender.", "lin", lines[l])
                # For count
                # Remove the type index
                List = {"": ""}
                List.update(loop["l"])
                del List[""]
                # for every item in list do:
                for c in List:
                    indx = loop["l"][c]
                    # Create a temporary variable
                    if type(indx) == int: actI.update({loop["v"]: indx})
                    elif type(indx) == str: actS.update({loop["v"]: indx})
                    elif type(indx) == bool: actB.update({loop["v"]: indx})
                    elif type(indx) == float: actF.update({loop["v"]: indx})
                    # For every line
                    for i in range(loop["s"], loop["e"]):
                        # Run another Runner
                        LittleRun(i, lines)
                # Remove the temporary variable
                if loop["v"] in actI: del actI[loop["v"]]
                if loop["v"] in actS: del actS[loop["v"]]
                if loop["v"] in actB: del actB[loop["v"]]
                if loop["v"] in actF: del actF[loop["v"]]
                onLoop = True
#Tasks
#call
if clin.startswith("call ") : #
clin = clin.replace(clin[0:5], "")
#call name: arg to var
index = clin.find(" to ")
taskD = clin[:index]
vName = clin[index + 4:]
argmt = None
#Task configuration
if ":" in taskD : #
dataT = taskD.split(":")
#task[0] arg[1]
try : #
tName = dataT[0]
argmt = dataT[1]
tName = tName.strip()
argmt = argmt.strip()
#
except :
Error("[Syntax Error] no argment gived.", "lin", lines[l])
#
#Just name
else : tName = taskD
#Get task data
if tName in actT : taskD = actT[tName]
else : Error("[Compiler Error] Trying to call a unknown task.", "lin", lines[l])
#To del the temporary var
varT = "nil"
#Variables beafore task
safeI, safeS, safeB, safeF = actI, actS, actB, actF
#argment
if argmt != None : #
#No argment
if taskD["v"] == "" : Error("[Compiler Error] This task do not requires a argment.", "lin", lines[l])
#Add a temporary int
if IsAInt(argmt, actI) == True : #
varT = "int"
actI.update({taskD["v"] : actI[argmt]})
#
#Add a temporary str
if IsAStr(argmt, actS) == True : #
varT = "str"
actS.update({taskD["v"] : actS[argmt]})
#
#Add a temporary bol
if IsABol(argmt, actB) == True : #
varT = "bool"
actB.update({taskD["v"] : actB[argmt]})
#
#Add a temporary flt
if IsAFlt(argmt, actF) == True : #
varT = "float"
actF.update({taskD["v"] : actF[argmt]})
#
#
#Run
for i in range(taskD["s"], taskD["e"]) : #
retrn = LittleRun(i, lines)
#
#Dell task variables
#Int's
for i in actI :
if not i in safeI : del actI[i]
#Str's
for i in actS :
if not i in safeS : del actS[i]
#Bol's
for i in actB :
if not i in safeB : del actB[i]
#Fts's
for i in actF :
if not i in safeF : del actF[i]
#Remove the temporary var
if argmt != None : #
#Remove a int
if varT == "int" : del actI[taskD["v"]]
if varT == "str" : del actS[taskD["v"]]
if varT == "bool" : del actB[taskD["v"]]
if varT == "float" : del actF[taskD["v"]]
#
#No return
if retrn == None :
if not vName == "self." : Error("[Compiler Error] Tasks must return a value.", "lin", lines[taskD["e"]])
#Return value
else : #
#Get the return type name as a string (e.g. "int", "str")
typ = type(retrn).__name__
#Check if var exists
#An int
if vName in actI : #
if typ == "int" : actI.update({vName : retrn})
#Not the same type
else : Error("[Compiler Error] This variable does not accept the return type.", "lin", lines[l])
#
#A str
elif vName in actS : #
if typ == "str" : actS.update({vName : retrn})
#Not the same type
else : Error("[Compiler Error] This variable does not accept the return type.", "lin", lines[l])
#
#A bol
elif vName in actB : #
if typ == "bool" : actB.update({vName : retrn})
#Not the same type
else : Error("[Compiler Error] This variable does not accept the return type.", "lin", lines[l])
#
#A flt
elif vName in actF : #
if typ == "float" : actF.update({vName : retrn})
#Not the same type
else : Error("[Compiler Error] This variable does not accept the return type.", "lin", lines[l])
#
#Unknown variable
else : Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
#
#
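The `call` handler above snapshots the four variable tables before running a task and deletes whatever the task created afterwards. In Python that pattern only works if the snapshot is a copy rather than an alias, and if the cleanup iterates over a frozen key list; a minimal standalone sketch of the idea (names are illustrative, not the interpreter's own):

```python
def run_with_scope(variables, task):
    """Run task(variables), then discard any keys the task added."""
    snapshot = dict(variables)      # a copy; `snapshot = variables` would alias
    task(variables)
    for key in list(variables):     # freeze the keys before deleting
        if key not in snapshot:
            del variables[key]

scope = {"x": 1}
run_with_scope(scope, lambda v: v.update({"tmp": 99, "x": 2}))
print(scope)  # {'x': 2} -- the temporary is gone, the mutation survives
```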
#Strange line
keywords = ["int ", "str ", "bol ", "flt ", "lst ",
"set ", "make ", "rise ", "down ",
"convert ","show ", "fin ", "add ",
"del ", "siz ","brk ", "for ", "every ",
"call", "if ", "maybe ", "else ", "> "]
endwords = ["close.", "end m.", "end t.", "end l.", "end."]
#Check
true_one = any([lines[l].startswith(key) for key in keywords])
true_two = lines[l] in endwords
#Inside the main function, but not on its opening line
if onFunc == True and lines[l] != "bucket open:" : #
#Does not start with any keyword and does not end a block
if true_one == False and true_two == False : #
#Not a task
if not "task:" in lines[l] :
Error("[Syntax Error] Strange command.", "lin", lines[l])
#
#
#
#Out of block
elif onFunc == False : #
#Libraries
if lines[l] == "[to Basic]" : continue
elif lines[l].startswith("[to ") and lines[l].endswith("]") : continue
#Class stuff
elif lines[l].startswith("dad ") : continue
#Main class
elif lines[l].endswith("in Bucket:") or clin == "close." : continue
#One function
elif lines[l] == "bucket open:" or clin == "end m." : continue
#One Task
elif "task:" in lines[l] or clin == "end t." : continue
#Out of system block
elif onTask == False : Error("[Syntax Error] Line out of a block.", "lin", lines[l])
#
#
#No "close." line
if lines[len(lines) - 1] != "close." : Error("[Syntax Error] Your script did not close the class block.", "lin", lines[len(lines) - 1])
#End script
print("\n--------------------\n")
input("Press enter to exit.\n")
#
def LittleRun(l, lines) : #
#Editable line
clin = lines[l]
#To-Lib----------------------------------------------------------------------------------------------------#
#Math Functions
if "mathF" in Libs : clin = Libs["mathF"].Main(lines[l], actI, actF)
#Add your library's module [2] below
#----------Here----------#
#If-Or-Else----------------------------------------------------------------------------------------------------#
#If block
if clin.startswith("if ") : #
word = clin.split()
#Require the closing "do:" keyword
if word[len(word) - 1] == "do:" : clin = clin.replace(" do:", "")
else : Error("[Syntax Error] Missing \"do:\" keyword.", "lin", lines[l])
#Rechange
scentence = clin
#For two args
outOne = ""
outTwo = ""
#Keyword index
index = ""
joint = ""
#Set the current logic block
ifBk["block"] = "if"
#Flag that we are inside a sentence
ifBk["inside"] = True
#And connector
if " and " in clin:
#Store the connective
joint = "and"
#Break args
index = clin.find(" and ")
clin = scentence[:index]
#Or connector
if " or " in clin :
#Store the connective
joint = "or"
#Break args
index = clin.find(" or ")
clin = scentence[:index]
#Break
clin = clin.replace("if ", "")
args = clin.split()
#if var[0] cond[1] val[2] do:
#An int statement
if args[0] in actI : outOne = LogicBlk(clin, lines[l], actI)
#A str statement
elif args[0] in actS : outOne = LogicBlk(clin, lines[l], actS)
#A bol statement
elif args[0] in actB : outOne = LogicBlk(clin, lines[l], actB)
#A flt statement
elif args[0] in actF : outOne = LogicBlk(clin, lines[l], actF)
#Unknown var
else : Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
#Last arg
if index != "" : #
#Get member
clin = scentence[index + len(joint) + 2:]
#Break
args = clin.split()
#An int statement
if args[0] in actI : outTwo = LogicBlk(clin, lines[l], actI)
#A str statement
elif args[0] in actS : outTwo = LogicBlk(clin, lines[l], actS)
#A bol statement
elif args[0] in actB : outTwo = LogicBlk(clin, lines[l], actB)
#A flt statement
elif args[0] in actF : outTwo = LogicBlk(clin, lines[l], actF)
#Both need to be true
if joint == "and" : #
if outOne and outTwo : ifBk["status"] = True
else : ifBk["status"] = False
#
#One of these needs to be true
if joint == "or" : #
if outOne or outTwo : ifBk["status"] = True
else : ifBk["status"] = False
#
#
#Just one sentence
else : ifBk["status"] = outOne
#
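The block above splits an `if ... and ... do:` sentence around its first connective and evaluates the two halves separately. A stripped-down sketch of that split logic (a hypothetical helper, not part of the interpreter):

```python
def split_condition(sentence):
    """Split "A and B" / "A or B" into (left, joint, right).

    Returns (sentence, None, None) when no connective is present.
    """
    for joint in (" and ", " or "):
        index = sentence.find(joint)
        if index != -1:
            return sentence[:index], joint.strip(), sentence[index + len(joint):]
    return sentence, None, None

print(split_condition("x is 1 and y is 2"))  # ('x is 1', 'and', 'y is 2')
```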
#Maybe Block
if clin.startswith("maybe ") and ifBk["status"] == False : #
word = clin.split()
#Require the closing "do:" keyword
if word[len(word) - 1] == "do:" : clin = clin.replace(" do:", "")
else : Error("[Syntax Error] Missing \"do:\" keyword.", "lin", lines[l])
#Rechange
scentence = clin
#For two args
outOne = ""
outTwo = ""
#Keyword index
index = ""
joint = ""
#Set the current logic block
ifBk["block"] = "maybe"
#Flag that we are inside a sentence
ifBk["inside"] = True
#And connector
if " and " in clin:
#Store the connective
joint = "and"
#Break args
index = clin.find(" and ")
clin = scentence[:index]
#Or connector
if " or " in clin :
#Store the connective
joint = "or"
#Break args
index = clin.find(" or ")
clin = scentence[:index]
#Break
clin = clin.replace("maybe ", "")
args = clin.split()
#maybe var[0] cond[1] val[2] do:
#An int statement
if args[0] in actI : outOne = LogicBlk(clin, lines[l], actI)
#A str statement
elif args[0] in actS : outOne = LogicBlk(clin, lines[l], actS)
#A bol statement
elif args[0] in actB : outOne = LogicBlk(clin, lines[l], actB)
#A flt statement
elif args[0] in actF : outOne = LogicBlk(clin, lines[l], actF)
#Unknown var
else : Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
#Last arg
if index != "" : #
#Get member
clin = scentence[index + len(joint) + 2:]
#Break
args = clin.split()
#An int statement
if args[0] in actI : outTwo = LogicBlk(clin, lines[l], actI)
#A str statement
elif args[0] in actS : outTwo = LogicBlk(clin, lines[l], actS)
#A bol statement
elif args[0] in actB : outTwo = LogicBlk(clin, lines[l], actB)
#A flt statement
elif args[0] in actF : outTwo = LogicBlk(clin, lines[l], actF)
#Both need to be true
if joint == "and" : #
if outOne and outTwo : ifBk["status"] = True
else : ifBk["status"] = False
#
#One of these needs to be true
if joint == "or" : #
if outOne or outTwo : ifBk["status"] = True
else : ifBk["status"] = False
#
#
#Just one sentence
else : ifBk["status"] = outOne
#
#Else block
if clin.startswith("else ") and ifBk["status"] == False : #
#Set the current logic block
ifBk["block"] = "else"
#Flag that we are inside a sentence
ifBk["inside"] = True
#
#Out of sentence
if clin == "end." : ifBk["inside"] = False
#Enabled system
if clin.startswith("> ") and ifBk["inside"] == True : #
working = False
#True door
if ifBk["block"] == "if" or ifBk["block"] == "maybe" :
if ifBk["status"] == True : working = True
#False door
elif ifBk["block"] == "else" and ifBk["status"] == False : working = True
#Actions
if working == True : clin = clin.replace("> ", "")
#Error
if clin.startswith("if") : Error("[Syntax Error] Bucket does not support ifs inside ifs.", "lin", lines[l])
#
#------------------------------------------------------------------------------------------------------#
#Declare int
if clin.startswith("int ") : actI.update(NewInt(clin, actI))
#Declare str
if clin.startswith("str ") : actS.update(NewStr(clin, actS))
#Declare bol
if clin.startswith("bol ") : actB.update(NewBol(clin, actB))
#Declare flt
if clin.startswith("flt ") : actF.update(NewFlt(clin, actF))
#Declare lst
if clin.startswith("lst ") : actL.update(NewLst(clin, actL))
#set
if clin.startswith("set ") : #
#Remove keyword
name = clin.replace(clin[0:4], "")
name = name.split()
#set name[0] to[1] vall[2] index[3]
#A list value
if name[2] in actL : #
#Update index
if name[3] in actI : name[3] = str(actI[name[3]])
if name[3] in actS : name[3] = str(actS[name[3]])
if name[3] in actB : name[3] = str(actB[name[3]])
if name[3] in actF : name[3] = str(actF[name[3]])
#Fix bol definition
if name[3] == "True" : name[3] = "yes"
if name[3] == "False" : name[3] = "not"
#Key error
if not name[3] in actL[name[2]] : Error("[Compiler Error] Index out of range.", "lin", lines[l])
#Update a int
if name[0] in actI : actI.update({name[0] : actL[name[2]][name[3]]})
#Update a str
elif name[0] in actS : actS.update({name[0] : actL[name[2]][name[3]]})
#Update a bol
elif name[0] in actB : actB.update({name[0] : actL[name[2]][name[3]]})
#Update a flt
elif name[0] in actF : actF.update({name[0] : actL[name[2]][name[3]]})
#Unknown variable
else : Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
#
#Default value
else : #
#Update a int
if name[0] in actI : actI.update(SetInt(clin, actI))
#Update a str
elif name[0] in actS : actS.update(SetStr(clin, actS))
#Update a bol
elif name[0] in actB : actB.update(SetBol(clin, actB))
#Update a flt
elif name[0] in actF : actF.update(SetFlt(clin, actF))
#Unknown variable
else : Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
#
#
#Make
if clin.startswith("make ") : #
#Remove keyword
clin = clin.replace(clin[0:5], "")
args = clin.split()
#make var[0] ~[1] val[2]
#Change a int
if args[0] in actI : actI.update(Make(clin, lines[l], actI))
#Change a str
elif args[0] in actS : actS.update(Make(clin, lines[l], actS))
#Change a flt
elif args[0] in actF : actF.update(Make(clin, lines[l], actF))
#bol or else
else : Error("[Compiler Error] This variable is probably a bol. Bucket cannot change this.", "lin", lines[l])
#
#Rise
if clin.startswith("rise ") : #
#Remove keyword
vall = clin.replace(clin[0:5], "")
indx = vall.find(" in ")
name = vall[:indx]
#Cannot rise strings
if name in actS : Error("[Compiler Error] A str cannot be summed with a number.", "lin", lines[l])
#Cannot rise booleans
elif name in actB : Error("[Compiler Error] A bol cannot be summed with a number.", "lin", lines[l])
#An int
elif name in actI : actI.update(Rise(clin, actI))
#A flt
elif name in actF : actF.update(Rise(clin, actF))
else : Error("[Compiler Error] This variable does not exist.", "lin", lines[l])
#
#Down
if clin.startswith("down ") : #
#Remove keyword
clin = clin.replace(clin[0:5], "")
index = clin.find(" in ")
name = clin[:index]
#Cannot decrease strings
if name in actS : Error("[Compiler Error] A str cannot be decreased by a number.", "lin", lines[l])
#Cannot decrease booleans
elif name in actB : Error("[Compiler Error] A bol cannot be decreased by a number.", "lin", lines[l])
#An int
elif name in actI : actI.update(Down(clin, actI))
#A flt
elif name in actF : actF.update(Down(clin, actF))
else : Error("[Compiler Error] This variable does not exist.", "lin", lines[l])
#
#Round
if clin.startswith("round ") : #
#Remove keyword
clin = clin.replace(clin[0:6], "")
#Round value if exists
value = RoundT(clin, lines[l], actF)
#This flt has the same name as an int
if value in actI : Error("[Compiler Error] You already have an int with the same name as your flt.", "lin", lines[l])
#All good
else : #
actI.update({clin : value})
del actF[clin]
#
#
#Convert
if clin.startswith("convert ") : #
#Remove keyword
clin = clin.replace(clin[0:8], "")
#To int
if clin.endswith(" to int") : #
#Remove keyword
clin = clin.replace(" to int", "")
#A string
if clin in actS : #
actI.update({clin : CToInt(actS[clin], lines[l])})
del actS[clin]
#
#A boolean
if clin in actB : #
actI.update({clin : CToInt(actB[clin], lines[l])})
del actB[clin]
#
#A float
if clin in actF : #
actI.update({clin : CToInt(actF[clin], lines[l])})
del actF[clin]
#
#
#To str
if clin.endswith(" to str") : #
#Remove keyword
clin = clin.replace(" to str", "")
#An int
if clin in actI : #
actS.update({clin : CToStr(actI[clin])})
del actI[clin]
#
#A boolean
elif clin in actB : #
actS.update({clin : CToStr(actB[clin])})
del actB[clin]
#
#A float
elif clin in actF : #
actS.update({clin : CToStr(actF[clin])})
del actF[clin]
#
else : Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
#
#To bol
if clin.endswith(" to bol") : #
#Remove keyword
clin = clin.replace(" to bol", "")
#An int
if clin in actI : #
actB.update({clin : CToBol(actI[clin], lines[l])})
del actI[clin]
#
#A string
elif clin in actS : #
actB.update({clin : CToBol(actS[clin], lines[l])})
del actS[clin]
#
#A float
elif clin in actF : #
actB.update({clin : CToBol(actF[clin], lines[l])})
del actF[clin]
#
else : Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
#
#To flt
if clin.endswith(" to flt") : #
#Remove keyword
clin = clin.replace(" to flt", "")
#An int
if clin in actI : #
actF.update({clin : CToFlt(actI[clin], lines[l])})
del actI[clin]
#
#A string
elif clin in actS : #
actF.update({clin : CToFlt(actS[clin], lines[l])})
del actS[clin]
#
#A boolean
elif clin in actB : Error("[Compiler Error] Bucket does not understand booleans as floats.", "lin", lines[l])
#
#
#Basic BIn/BOut
if clin.startswith("show ") : #
#Remove keyword
clin = clin.replace(clin[0:5], "")
#Special codes
if "\\s" in clin : clin = clin.replace("\\s", " ")
if "\\q" in clin : clin = clin.replace("\\q", "\"")
if "\\n" in clin : clin = clin.replace("\\n", "\n")
if "\\t" in clin : clin = clin.replace("\\t", "\t")
#With Arg
if "with " in clin : #
#Get the keyword index
index = clin.index(" with ")
strng = clin[:index]
argmt = clin[index + len(" with "):]
#Fix text value
if strng in actS : strng = actS[strng]
elif strng.startswith("'") and strng.endswith("'") : strng = strng.replace("'", "")
else : Error("[Compiler Error] Cannot print a non string.", "lin", lines[l])
#Replace by an int
if "[i]" in strng and IsAInt(argmt, actI) == True : #
#Number
try : #
int(argmt)
print(strng.replace("[i]", argmt))
#
#Variable
except ValueError :
print(strng.replace("[i]", str(actI[argmt])))
#
#Replace by a str
elif "[s]" in strng and IsAStr(argmt, actS) == True : #
#Text
if argmt.startswith("'") : #
#Empty
if argmt == "''" : argmt = ""
#Remove the apostrophes
else : argmt = argmt[1:len(argmt) - 1]
print(strng.replace("[s]", argmt))
#
#Variable
else : print(strng.replace("[s]", actS[argmt]))
#
#Replace by a bol
elif "[b]" in strng and IsABol(argmt, actB) == True : #
#Value
if argmt == "yes" : print(strng.replace("[b]", "yes"))
elif argmt == "not" : print(strng.replace("[b]", "not"))
#Variable
else : #
if actB[argmt] == True : print(strng.replace("[b]", "yes"))
if actB[argmt] == False : print(strng.replace("[b]", "not"))
#
#
#Replace by a flt
elif "[f]" in strng and IsAFlt(argmt, actF) == True : #
#Value
if argmt.endswith(".f") : argmt = argmt.replace(".f", "")
#Number
try : #
float(argmt)
print(strng.replace("[f]", argmt))
#
#Variable
except ValueError :
print(strng.replace("[f]", str(actF[argmt])))
#
#Replace by a lst
elif "[l]" in strng and IsALst(argmt, actL) == True : #
#Variable
if argmt in actL :
#Get list from actL
argmt = actL[argmt]
#Initial visible list
toShow = "["
#Get just the values
for item in argmt : #
if "'" in item : item = item.replace("'", "")
if item == "" : continue
elif item == max(argmt.keys()) : toShow = toShow + str(argmt[item]) + "]"
else : toShow = toShow + str(argmt[item]) + ", "
#
print(strng.replace("[l]", toShow))
#
#Value
else : print(strng.replace("[l]", argmt))
#
#Without Keyword
else : Error("[Compiler Error] Substitution keyword not found.", "lin", lines[l])
#
#Is a str value
elif clin.startswith("'") and clin.endswith("'") : #
#Empty
if clin == "''" : clin = ""
#Remove the apostrophes
else : clin = clin[1:len(clin) - 1]
print(clin)
#
#Is other str
elif clin in actS : print(actS[clin])
#Input
elif clin.startswith("sand") : print(Sand(clin, lines[l], actS))
#Error
else : Error("[Compiler Error] Bucket cannot show this.", "lin", lines[l])
#
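The `show` handler above expands Bucket's escape codes (`\s`, `\q`, `\n`, `\t`) before printing. Assuming those four codes are the full set, the substitutions can be written as one table-driven pass:

```python
ESCAPES = {"\\s": " ", "\\q": "\"", "\\n": "\n", "\\t": "\t"}

def expand_escapes(text):
    """Replace every Bucket escape code with its literal character."""
    for code, char in ESCAPES.items():
        text = text.replace(code, char)
    return text

print(expand_escapes("hello\\sworld"))  # hello world
```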
#List functions
#fin
if clin.startswith("fin ") : #
#Remove keyword
clin = clin.replace(clin[0:4], "")
#Get list
word = clin.split()
#find item[0] in[1] list[2] to[3] item[4]
#Need an int to get a str index
if word[4] in actI and IsAStr(word[2], actS) : actI.update({word[4] : Fin(clin, lines[l], actS, actL)})
#Need a str to get lst index
elif word[4] in actS and IsALst(word[2], actL) : actS.update({word[4] : Fin(clin, lines[l], actS, actL)})
#Something else
else : Error("[Compiler Error] Type error or unknown variables.", "lin", lines[l])
#
#add
if clin.startswith("add ") : #
#Remove keyword
clin = clin.replace(clin[0:4], "")
#Get list
word = clin.split()
#add item[0] on[1] list[2] as[3] value[4]
if word[2] in actL : actL[word[2]].update(Add(clin, lines[l], actL))
else : Error("[Compiler Error] This list does not exist.", "lin", lines[l])
#
#del
if clin.startswith("del ") : #
#Remove keyword
clin = clin.replace(clin[0:4], "")
#Get list
word = clin.split()
#del item[0] of[1] list[2]
if word[2] in actL : actL.update(Del(clin, lines[l], actL))
else : Error("[Compiler Error] This list does not exist.", "lin", lines[l])
#
#siz
if clin.startswith("siz ") : #
#Remove keyword
clin = clin.replace(clin[0:4], "")
#For index
ledr = clin
ledr = ledr.replace("of ", "")
#siz of list to var
index = ledr.find(" to ")
aList = ledr[:index]
vName = ledr[index + 4:]
#This list exists
if aList in actL : #
if vName in actI : actI.update(Siz(clin, lines[l], actL[aList]))
else : Error("[Compiler Error] Siz function returns an int, so it needs an int to assign.", "lin", lines[l])
#
elif aList in actS : #
if vName in actI : actI.update(Siz(clin, lines[l], actS[aList]))
else : Error("[Compiler Error] Siz function returns an int, so it needs an int to assign.", "lin", lines[l])
#
else : Error("[Compiler Error] This list does not exist.", "lin", lines[l])
#
#brk
if clin.startswith("brk ") : #
clin = clin.replace(clin[0:4], "")
word = clin.split()
#brk string[0] in[1] word[2] to[3] var[4]
#Unknown vars
if not word[0] in actS : Error("[Compiler Error] This string does not exist.", "lin", lines[l])
if not word[4] in actL : Error("[Compiler Error] Cannot assign to a nonexistent list.", "lin", lines[l])
#List exists
else : actL.update(Brk(clin, lines[l], actS))
#
#loops
#For Loop
if clin.startswith("for ") : #
#for x times if y ~ z do:
loop = NewForLoop(l, lines, actI)
#Return :
#max count|question|variable|start/end line
#No return
if loop == None : Error("[Syntax Error] No loop ender.", "lin", lines[l])
#variable type
#Check a int
if loop["v"] in actI : loop["v"] = actI
#Check a str
elif loop["v"] in actS : loop["v"] = actS
#Check a bol
elif loop["v"] in actB : loop["v"] = actB
#Check a flt
elif loop["v"] in actF : loop["v"] = actF
#Non existent
else : Error("[Compiler Error] This is not a variable to check.", "lin", lines[l])
#For count
for c in range(0, loop["m"]) : #
#For every line
for i in range(loop["s"], loop["e"]) : #
#Question
quest = LogicBlk(loop["q"], lines[l], loop["v"])
#If True then ...
if quest == True : #
#Run other Runner
LittleRun(i, lines)
#Update var
if loop["v"][""] == "int" : loop["v"] = actI
if loop["v"][""] == "str" : loop["v"] = actS
if loop["v"][""] == "bol" : loop["v"] = actB
if loop["v"][""] == "flt" : loop["v"] = actF
#
#
#
#
#Every Loop
if clin.startswith("every ") : #
#every item in list do:
loop = NewEveryLoop(l, lines, actL)
#Return :
#var name| var list| start l| end line|
#No return
if loop == None : Error("[Syntax Error] No loop ender.", "lin", lines[l])
#For count
#Remove type index
List = {"" : "",}
List.update(loop["l"])
del List[""]
#for every item in list do:
for c in List : #
indx = loop["l"][c]
#Create temporary variable
if type(indx) == int : actI.update({loop["v"] : indx})
elif type(indx) == str : actS.update({loop["v"] : indx})
elif type(indx) == bool : actB.update({loop["v"] : indx})
elif type(indx) == float : actF.update({loop["v"] : indx})
#For every line
for i in range(loop["s"], loop["e"]) : #
#Run other Runner
LittleRun(i, lines)
#
#
#Remove temporary variable
if loop["v"] in actI : del actI[loop["v"]]
if loop["v"] in actS : del actS[loop["v"]]
if loop["v"] in actB : del actB[loop["v"]]
if loop["v"] in actF : del actF[loop["v"]]
#
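The `every` loop above binds each list item into one of four typed variable tables by checking its Python type. A compact sketch of that dispatch (the table layout is illustrative, mirroring actI/actS/actB/actF):

```python
def bind_typed(tables, name, value):
    """Store value under name in the table matching its exact type."""
    for pytype, table in tables.items():
        if type(value) is pytype:   # exact match, so True does not land in int
            table[name] = value
            return table
    raise TypeError(f"unsupported value type: {type(value).__name__}")

tables = {int: {}, str: {}, bool: {}, float: {}}
bind_typed(tables, "item", 3)
bind_typed(tables, "item2", "hi")
print(tables[int])  # {'item': 3}
```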
#Tasks
#call
if clin.startswith("call ") : #
clin = clin.replace(clin[0:5], "")
#call name: arg to var
index = clin.find(" to ")
taskD = clin[:index]
vName = clin[index + 4:]
argmt = None
#Task configuration
if ":" in taskD : #
dataT = taskD.split(":")
#task[0] arg[1]
try : #
tName = dataT[0]
argmt = dataT[1]
tName = tName.strip()
argmt = argmt.strip()
#
except IndexError :
Error("[Syntax Error] No argument given.", "lin", lines[l])
#
#Just name
else : tName = taskD
#Get task data
if tName in actT : taskD = actT[tName]
else : Error("[Compiler Error] Trying to call an unknown task.", "lin", lines[l])
#To delete the temporary var
varT = "nil"
#Snapshot the variable tables before the task (copies, not aliases)
safeI, safeS, safeB, safeF = dict(actI), dict(actS), dict(actB), dict(actF)
#Argument
if argmt != None : #
#Task takes no argument
if taskD["v"] == "" : Error("[Compiler Error] This task does not take an argument.", "lin", lines[l])
#Add a temporary int
if IsAInt(argmt, actI) == True : #
varT = "int"
actI.update({taskD["v"] : actI[argmt]})
#
#Add a temporary str
if IsAStr(argmt, actS) == True : #
varT = "str"
actS.update({taskD["v"] : actS[argmt]})
#
#Add a temporary bol
if IsABol(argmt, actB) == True : #
varT = "bool"
actB.update({taskD["v"] : actB[argmt]})
#
#Add a temporary flt
if IsAFlt(argmt, actF) == True : #
varT = "float"
actF.update({taskD["v"] : actF[argmt]})
#
#
#Run
for i in range(taskD["s"], taskD["e"]) : #
retrn = LittleRun(i, lines)
#
#Delete task variables
#Int's
for i in list(actI) :
if not i in safeI : del actI[i]
#Str's
for i in list(actS) :
if not i in safeS : del actS[i]
#Bol's
for i in list(actB) :
if not i in safeB : del actB[i]
#Flt's
for i in list(actF) :
if not i in safeF : del actF[i]
#Remove the temporary var
if argmt != None : #
#Remove an int
if varT == "int" : del actI[taskD["v"]]
if varT == "str" : del actS[taskD["v"]]
if varT == "bool" : del actB[taskD["v"]]
if varT == "float" : del actF[taskD["v"]]
#
#No return
if retrn == None :
if not vName == "self." : Error("[Compiler Error] Tasks must return a value.", "lin", lines[taskD["e"]])
#Return value
else : #
#Get the return type name as a string (e.g. "int", "str")
typ = type(retrn).__name__
#Check if var exists
#An int
if vName in actI : #
if typ == "int" : actI.update({vName : retrn})
#Not the same type
else : Error("[Compiler Error] This variable does not accept the return type.", "lin", lines[l])
#
#A str
elif vName in actS : #
if typ == "str" : actS.update({vName : retrn})
#Not the same type
else : Error("[Compiler Error] This variable does not accept the return type.", "lin", lines[l])
#
#A bol
elif vName in actB : #
if typ == "bool" : actB.update({vName : retrn})
#Not the same type
else : Error("[Compiler Error] This variable does not accept the return type.", "lin", lines[l])
#
#A flt
elif vName in actF : #
if typ == "float" : actF.update({vName : retrn})
#Not the same type
else : Error("[Compiler Error] This variable does not accept the return type.", "lin", lines[l])
#
#Unknown variable
else : Error("[Compiler Error] This is not a declared variable.", "lin", lines[l])
#
#
#Task return
if clin.startswith("return ") : #
clin = clin.replace("return ", "")
#Return variable value
if clin in actI : return actI[clin]
elif clin in actS : return actS[clin]
elif clin in actB : return actB[clin]
elif clin in actF : return actF[clin]
else : #
try : #
if clin.startswith("'") and clin.endswith("'") : return clin.replace("'", "")
elif clin == "yes" : return True
elif clin == "not" : return False
elif clin.endswith(".f") : return float(clin.replace(".f", ""))
else : return int(clin)
#
except ValueError :
Error("[Compiler Error] This is not a declared variable or a valid literal.", "lin", lines[l])
#
#
# | 31.769876 | 140 | 0.392128 | 7,793 | 79,520 | 4.000642 | 0.049403 | 0.038875 | 0.025115 | 0.031049 | 0.880938 | 0.876672 | 0.873047 | 0.86644 | 0.86644 | 0.863746 | 0 | 0.007702 | 0.479137 | 79,520 | 2,503 | 141 | 31.769876 | 0.74502 | 0.121793 | 0 | 0.870075 | 0 | 0.060703 | 0.105166 | 0.000362 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00213 | false | 0 | 0.01278 | 0 | 0.014909 | 0.035144 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
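The task `call` handler in `LittleRun` dispatches on the name of the returned value's runtime type. `str(type(x))` yields `"<class 'int'>"` and has to be trimmed, while `type(x).__name__` gives the bare name directly; the two agree for the built-in types Bucket uses:

```python
def type_name_via_str(value):
    """Trim "<class 'int'>" down to "int" by string surgery."""
    typ = str(type(value))
    typ = typ.replace("'", "").replace("<class ", "")
    return typ[:-1]

def type_name_direct(value):
    """The idiomatic equivalent."""
    return type(value).__name__

for sample in (1, "a", True, 1.5):
    assert type_name_via_str(sample) == type_name_direct(sample)
print(type_name_direct(1.5))  # float
```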
5239cc9530ab8fd2718b2358e076f2bf5ba10b96 | 6,407 | py | Python | loldib/getratings/models/NA/na_kayle/na_kayle_sup.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | ["Apache-2.0"] | null | null | null

from getratings.models.ratings import Ratings
class NA_Kayle_Sup_Aatrox(Ratings):
pass
class NA_Kayle_Sup_Ahri(Ratings):
pass
class NA_Kayle_Sup_Akali(Ratings):
pass
class NA_Kayle_Sup_Alistar(Ratings):
pass
class NA_Kayle_Sup_Amumu(Ratings):
pass
class NA_Kayle_Sup_Anivia(Ratings):
pass
class NA_Kayle_Sup_Annie(Ratings):
pass
class NA_Kayle_Sup_Ashe(Ratings):
pass
class NA_Kayle_Sup_AurelionSol(Ratings):
pass
class NA_Kayle_Sup_Azir(Ratings):
pass
class NA_Kayle_Sup_Bard(Ratings):
pass
class NA_Kayle_Sup_Blitzcrank(Ratings):
pass
class NA_Kayle_Sup_Brand(Ratings):
pass
class NA_Kayle_Sup_Braum(Ratings):
pass
class NA_Kayle_Sup_Caitlyn(Ratings):
pass
class NA_Kayle_Sup_Camille(Ratings):
pass
class NA_Kayle_Sup_Cassiopeia(Ratings):
pass
class NA_Kayle_Sup_Chogath(Ratings):
pass
class NA_Kayle_Sup_Corki(Ratings):
pass
class NA_Kayle_Sup_Darius(Ratings):
pass
class NA_Kayle_Sup_Diana(Ratings):
pass
class NA_Kayle_Sup_Draven(Ratings):
pass
class NA_Kayle_Sup_DrMundo(Ratings):
pass
class NA_Kayle_Sup_Ekko(Ratings):
pass
class NA_Kayle_Sup_Elise(Ratings):
pass
class NA_Kayle_Sup_Evelynn(Ratings):
pass
class NA_Kayle_Sup_Ezreal(Ratings):
pass
class NA_Kayle_Sup_Fiddlesticks(Ratings):
pass
class NA_Kayle_Sup_Fiora(Ratings):
pass
class NA_Kayle_Sup_Fizz(Ratings):
pass
class NA_Kayle_Sup_Galio(Ratings):
pass
class NA_Kayle_Sup_Gangplank(Ratings):
pass
class NA_Kayle_Sup_Garen(Ratings):
pass
class NA_Kayle_Sup_Gnar(Ratings):
pass
class NA_Kayle_Sup_Gragas(Ratings):
pass
class NA_Kayle_Sup_Graves(Ratings):
pass
class NA_Kayle_Sup_Hecarim(Ratings):
pass
class NA_Kayle_Sup_Heimerdinger(Ratings):
pass
class NA_Kayle_Sup_Illaoi(Ratings):
pass
class NA_Kayle_Sup_Irelia(Ratings):
pass
class NA_Kayle_Sup_Ivern(Ratings):
pass
class NA_Kayle_Sup_Janna(Ratings):
pass
class NA_Kayle_Sup_JarvanIV(Ratings):
pass
class NA_Kayle_Sup_Jax(Ratings):
pass
class NA_Kayle_Sup_Jayce(Ratings):
pass
class NA_Kayle_Sup_Jhin(Ratings):
pass
class NA_Kayle_Sup_Jinx(Ratings):
pass
class NA_Kayle_Sup_Kalista(Ratings):
pass
class NA_Kayle_Sup_Karma(Ratings):
pass
class NA_Kayle_Sup_Karthus(Ratings):
pass
class NA_Kayle_Sup_Kassadin(Ratings):
pass
class NA_Kayle_Sup_Katarina(Ratings):
pass
class NA_Kayle_Sup_Kayle(Ratings):
pass
class NA_Kayle_Sup_Kayn(Ratings):
pass
class NA_Kayle_Sup_Kennen(Ratings):
pass
class NA_Kayle_Sup_Khazix(Ratings):
pass
class NA_Kayle_Sup_Kindred(Ratings):
pass
class NA_Kayle_Sup_Kled(Ratings):
pass
class NA_Kayle_Sup_KogMaw(Ratings):
pass
class NA_Kayle_Sup_Leblanc(Ratings):
pass
class NA_Kayle_Sup_LeeSin(Ratings):
pass
class NA_Kayle_Sup_Leona(Ratings):
pass
class NA_Kayle_Sup_Lissandra(Ratings):
pass
class NA_Kayle_Sup_Lucian(Ratings):
pass
class NA_Kayle_Sup_Lulu(Ratings):
pass
class NA_Kayle_Sup_Lux(Ratings):
pass
class NA_Kayle_Sup_Malphite(Ratings):
pass
class NA_Kayle_Sup_Malzahar(Ratings):
pass
class NA_Kayle_Sup_Maokai(Ratings):
pass
class NA_Kayle_Sup_MasterYi(Ratings):
pass
class NA_Kayle_Sup_MissFortune(Ratings):
pass
class NA_Kayle_Sup_MonkeyKing(Ratings):
pass
class NA_Kayle_Sup_Mordekaiser(Ratings):
pass
class NA_Kayle_Sup_Morgana(Ratings):
pass
class NA_Kayle_Sup_Nami(Ratings):
pass
class NA_Kayle_Sup_Nasus(Ratings):
pass
class NA_Kayle_Sup_Nautilus(Ratings):
pass
class NA_Kayle_Sup_Nidalee(Ratings):
pass
class NA_Kayle_Sup_Nocturne(Ratings):
pass
class NA_Kayle_Sup_Nunu(Ratings):
pass
class NA_Kayle_Sup_Olaf(Ratings):
pass
class NA_Kayle_Sup_Orianna(Ratings):
pass
class NA_Kayle_Sup_Ornn(Ratings):
pass
class NA_Kayle_Sup_Pantheon(Ratings):
pass
class NA_Kayle_Sup_Poppy(Ratings):
pass
class NA_Kayle_Sup_Quinn(Ratings):
pass
class NA_Kayle_Sup_Rakan(Ratings):
pass
class NA_Kayle_Sup_Rammus(Ratings):
pass
class NA_Kayle_Sup_RekSai(Ratings):
pass
class NA_Kayle_Sup_Renekton(Ratings):
pass
class NA_Kayle_Sup_Rengar(Ratings):
pass
class NA_Kayle_Sup_Riven(Ratings):
pass
class NA_Kayle_Sup_Rumble(Ratings):
pass
class NA_Kayle_Sup_Ryze(Ratings):
pass
class NA_Kayle_Sup_Sejuani(Ratings):
pass
class NA_Kayle_Sup_Shaco(Ratings):
pass
class NA_Kayle_Sup_Shen(Ratings):
pass
class NA_Kayle_Sup_Shyvana(Ratings):
pass
class NA_Kayle_Sup_Singed(Ratings):
pass
class NA_Kayle_Sup_Sion(Ratings):
pass
class NA_Kayle_Sup_Sivir(Ratings):
pass
class NA_Kayle_Sup_Skarner(Ratings):
pass
class NA_Kayle_Sup_Sona(Ratings):
pass
class NA_Kayle_Sup_Soraka(Ratings):
pass
class NA_Kayle_Sup_Swain(Ratings):
pass
class NA_Kayle_Sup_Syndra(Ratings):
pass
class NA_Kayle_Sup_TahmKench(Ratings):
pass
class NA_Kayle_Sup_Taliyah(Ratings):
pass
class NA_Kayle_Sup_Talon(Ratings):
pass
class NA_Kayle_Sup_Taric(Ratings):
pass
class NA_Kayle_Sup_Teemo(Ratings):
pass
class NA_Kayle_Sup_Thresh(Ratings):
pass
class NA_Kayle_Sup_Tristana(Ratings):
pass
class NA_Kayle_Sup_Trundle(Ratings):
pass
class NA_Kayle_Sup_Tryndamere(Ratings):
pass
class NA_Kayle_Sup_TwistedFate(Ratings):
pass
class NA_Kayle_Sup_Twitch(Ratings):
pass
class NA_Kayle_Sup_Udyr(Ratings):
pass
class NA_Kayle_Sup_Urgot(Ratings):
pass
class NA_Kayle_Sup_Varus(Ratings):
pass
class NA_Kayle_Sup_Vayne(Ratings):
pass
class NA_Kayle_Sup_Veigar(Ratings):
pass
class NA_Kayle_Sup_Velkoz(Ratings):
pass
class NA_Kayle_Sup_Vi(Ratings):
pass
class NA_Kayle_Sup_Viktor(Ratings):
pass
class NA_Kayle_Sup_Vladimir(Ratings):
pass
class NA_Kayle_Sup_Volibear(Ratings):
pass
class NA_Kayle_Sup_Warwick(Ratings):
pass
class NA_Kayle_Sup_Xayah(Ratings):
pass
class NA_Kayle_Sup_Xerath(Ratings):
pass
class NA_Kayle_Sup_XinZhao(Ratings):
pass
class NA_Kayle_Sup_Yasuo(Ratings):
pass
class NA_Kayle_Sup_Yorick(Ratings):
pass
class NA_Kayle_Sup_Zac(Ratings):
pass
class NA_Kayle_Sup_Zed(Ratings):
pass
class NA_Kayle_Sup_Ziggs(Ratings):
pass
class NA_Kayle_Sup_Zilean(Ratings):
pass
class NA_Kayle_Sup_Zyra(Ratings):
pass
524fdcf00eeddfc231d100a0bb9410349d0d15a4 | 268,231 | py | Python | sdk/textanalytics/azure-ai-textanalytics/azure/ai/textanalytics/_generated/v2022_03_01_preview/models/_models_py3.py | kashifkhan/azure-sdk-for-python | 9c28b76e89b0855e41bd12d5b4a59b51acd47eec | ["MIT"] | null | null | null

# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
import datetime
from typing import Any, Dict, List, Optional, Union
from azure.core.exceptions import HttpResponseError
import msrest.serialization
from ._text_analytics_client_enums import *
class AnalyzeTextTask(msrest.serialization.Model):
"""AnalyzeTextTask.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: AnalyzeTextEntityLinkingInput, AnalyzeTextEntityRecognitionInput, AnalyzeTextKeyPhraseExtractionInput, AnalyzeTextLanguageDetectionInput, AnalyzeTextPiiEntitiesRecognitionInput, AnalyzeTextSentimentAnalysisInput.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis tasks. Constant filled by server.
Possible values include: "SentimentAnalysis", "EntityRecognition", "PiiEntityRecognition",
"KeyPhraseExtraction", "LanguageDetection", "EntityLinking".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskKind
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
}
_subtype_map = {
'kind': {'EntityLinking': 'AnalyzeTextEntityLinkingInput', 'EntityRecognition': 'AnalyzeTextEntityRecognitionInput', 'KeyPhraseExtraction': 'AnalyzeTextKeyPhraseExtractionInput', 'LanguageDetection': 'AnalyzeTextLanguageDetectionInput', 'PiiEntityRecognition': 'AnalyzeTextPiiEntitiesRecognitionInput', 'SentimentAnalysis': 'AnalyzeTextSentimentAnalysisInput'}
}
def __init__(
self,
**kwargs
):
"""
"""
super(AnalyzeTextTask, self).__init__(**kwargs)
self.kind = None # type: Optional[str]
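# A minimal usage sketch (hypothetical values; ``MultiLanguageAnalysisInput`` and
# the concrete sub-classes are defined elsewhere in this module). Constructing a
# sub-class fixes the ``kind`` discriminator, and msrest's ``serialize()`` emits
# it alongside the mapped JSON keys from ``_attribute_map``:
#
#     task = AnalyzeTextEntityLinkingInput(
#         analysis_input=MultiLanguageAnalysisInput(documents=[...]),
#     )
#     body = task.serialize()  # e.g. {'kind': 'EntityLinking', 'analysisInput': {...}}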
class AnalyzeTextEntityLinkingInput(AnalyzeTextTask):
"""AnalyzeTextEntityLinkingInput.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis tasks. Constant filled by server.
Possible values include: "SentimentAnalysis", "EntityRecognition", "PiiEntityRecognition",
"KeyPhraseExtraction", "LanguageDetection", "EntityLinking".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskKind
:ivar analysis_input:
:vartype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:ivar parameters: Supported parameters for an Entity Linking task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'analysis_input': {'key': 'analysisInput', 'type': 'MultiLanguageAnalysisInput'},
'parameters': {'key': 'parameters', 'type': 'EntityLinkingTaskParameters'},
}
def __init__(
self,
*,
analysis_input: Optional["MultiLanguageAnalysisInput"] = None,
parameters: Optional["EntityLinkingTaskParameters"] = None,
**kwargs
):
"""
:keyword analysis_input:
:paramtype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:keyword parameters: Supported parameters for an Entity Linking task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingTaskParameters
"""
super(AnalyzeTextEntityLinkingInput, self).__init__(**kwargs)
self.kind = 'EntityLinking' # type: str
self.analysis_input = analysis_input
self.parameters = parameters
class AnalyzeTextEntityRecognitionInput(AnalyzeTextTask):
"""AnalyzeTextEntityRecognitionInput.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis tasks. Constant filled by server.
Possible values include: "SentimentAnalysis", "EntityRecognition", "PiiEntityRecognition",
"KeyPhraseExtraction", "LanguageDetection", "EntityLinking".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskKind
:ivar analysis_input:
:vartype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:ivar parameters: Supported parameters for an Entity Recognition task.
:vartype parameters: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'analysis_input': {'key': 'analysisInput', 'type': 'MultiLanguageAnalysisInput'},
'parameters': {'key': 'parameters', 'type': 'EntitiesTaskParameters'},
}
def __init__(
self,
*,
analysis_input: Optional["MultiLanguageAnalysisInput"] = None,
parameters: Optional["EntitiesTaskParameters"] = None,
**kwargs
):
"""
:keyword analysis_input:
:paramtype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:keyword parameters: Supported parameters for an Entity Recognition task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesTaskParameters
"""
super(AnalyzeTextEntityRecognitionInput, self).__init__(**kwargs)
self.kind = 'EntityRecognition' # type: str
self.analysis_input = analysis_input
self.parameters = parameters
class AnalyzeTextJobsInput(msrest.serialization.Model):
"""AnalyzeTextJobsInput.
All required parameters must be populated in order to send to Azure.
:ivar display_name: Optional display name for the analysis job.
:vartype display_name: str
:ivar analysis_input: Required.
:vartype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:ivar tasks: Required. The set of tasks to execute on the input documents.
:vartype tasks: list[~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTask]
"""
_validation = {
'analysis_input': {'required': True},
'tasks': {'required': True},
}
_attribute_map = {
'display_name': {'key': 'displayName', 'type': 'str'},
'analysis_input': {'key': 'analysisInput', 'type': 'MultiLanguageAnalysisInput'},
'tasks': {'key': 'tasks', 'type': '[AnalyzeTextLROTask]'},
}
def __init__(
self,
*,
analysis_input: "MultiLanguageAnalysisInput",
tasks: List["AnalyzeTextLROTask"],
display_name: Optional[str] = None,
**kwargs
):
"""
:keyword display_name: Optional display name for the analysis job.
:paramtype display_name: str
:keyword analysis_input: Required.
:paramtype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:keyword tasks: Required. The set of tasks to execute on the input documents.
:paramtype tasks: list[~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTask]
"""
super(AnalyzeTextJobsInput, self).__init__(**kwargs)
self.display_name = display_name
self.analysis_input = analysis_input
self.tasks = tasks
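# A minimal sketch of assembling a jobs payload (hypothetical values; task
# sub-classes such as ``EntitiesLROTask`` appear further down in this module):
#
#     jobs_input = AnalyzeTextJobsInput(
#         analysis_input=MultiLanguageAnalysisInput(documents=[...]),
#         tasks=[EntitiesLROTask(task_name="entities")],
#         display_name="my-analysis",
#     )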
class AnalyzeTextJobStatistics(msrest.serialization.Model):
"""AnalyzeTextJobStatistics.
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
"""
_attribute_map = {
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
}
def __init__(
self,
*,
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
"""
super(AnalyzeTextJobStatistics, self).__init__(**kwargs)
self.statistics = statistics
class TasksState(msrest.serialization.Model):
"""TasksState.
All required parameters must be populated in order to send to Azure.
:ivar tasks: Required.
:vartype tasks: ~azure.ai.textanalytics.v2022_03_01_preview.models.TasksStateTasks
"""
_validation = {
'tasks': {'required': True},
}
_attribute_map = {
'tasks': {'key': 'tasks', 'type': 'TasksStateTasks'},
}
def __init__(
self,
*,
tasks: "TasksStateTasks",
**kwargs
):
"""
:keyword tasks: Required.
:paramtype tasks: ~azure.ai.textanalytics.v2022_03_01_preview.models.TasksStateTasks
"""
super(TasksState, self).__init__(**kwargs)
self.tasks = tasks
class JobState(msrest.serialization.Model):
"""JobState.
All required parameters must be populated in order to send to Azure.
:ivar display_name:
:vartype display_name: str
:ivar created_date_time: Required.
:vartype created_date_time: ~datetime.datetime
:ivar expiration_date_time:
:vartype expiration_date_time: ~datetime.datetime
:ivar job_id: Required.
:vartype job_id: str
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar errors:
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Error]
:ivar next_link:
:vartype next_link: str
"""
_validation = {
'created_date_time': {'required': True},
'job_id': {'required': True},
'last_update_date_time': {'required': True},
'status': {'required': True},
}
_attribute_map = {
'display_name': {'key': 'displayName', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'expiration_date_time': {'key': 'expirationDateTime', 'type': 'iso-8601'},
'job_id': {'key': 'jobId', 'type': 'str'},
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'errors': {'key': 'errors', 'type': '[Error]'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
*,
created_date_time: datetime.datetime,
job_id: str,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
display_name: Optional[str] = None,
expiration_date_time: Optional[datetime.datetime] = None,
errors: Optional[List["Error"]] = None,
next_link: Optional[str] = None,
**kwargs
):
"""
:keyword display_name:
:paramtype display_name: str
:keyword created_date_time: Required.
:paramtype created_date_time: ~datetime.datetime
:keyword expiration_date_time:
:paramtype expiration_date_time: ~datetime.datetime
:keyword job_id: Required.
:paramtype job_id: str
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword errors:
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Error]
:keyword next_link:
:paramtype next_link: str
"""
super(JobState, self).__init__(**kwargs)
self.display_name = display_name
self.created_date_time = created_date_time
self.expiration_date_time = expiration_date_time
self.job_id = job_id
self.last_update_date_time = last_update_date_time
self.status = status
self.errors = errors
self.next_link = next_link
class AnalyzeTextJobState(JobState, TasksState, AnalyzeTextJobStatistics):
"""AnalyzeTextJobState.
All required parameters must be populated in order to send to Azure.
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar tasks: Required.
:vartype tasks: ~azure.ai.textanalytics.v2022_03_01_preview.models.TasksStateTasks
:ivar display_name:
:vartype display_name: str
:ivar created_date_time: Required.
:vartype created_date_time: ~datetime.datetime
:ivar expiration_date_time:
:vartype expiration_date_time: ~datetime.datetime
:ivar job_id: Required.
:vartype job_id: str
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar errors:
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Error]
:ivar next_link:
:vartype next_link: str
"""
_validation = {
'tasks': {'required': True},
'created_date_time': {'required': True},
'job_id': {'required': True},
'last_update_date_time': {'required': True},
'status': {'required': True},
}
_attribute_map = {
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'tasks': {'key': 'tasks', 'type': 'TasksStateTasks'},
'display_name': {'key': 'displayName', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'expiration_date_time': {'key': 'expirationDateTime', 'type': 'iso-8601'},
'job_id': {'key': 'jobId', 'type': 'str'},
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'errors': {'key': 'errors', 'type': '[Error]'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
*,
tasks: "TasksStateTasks",
created_date_time: datetime.datetime,
job_id: str,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
statistics: Optional["RequestStatistics"] = None,
display_name: Optional[str] = None,
expiration_date_time: Optional[datetime.datetime] = None,
errors: Optional[List["Error"]] = None,
next_link: Optional[str] = None,
**kwargs
):
"""
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword tasks: Required.
:paramtype tasks: ~azure.ai.textanalytics.v2022_03_01_preview.models.TasksStateTasks
:keyword display_name:
:paramtype display_name: str
:keyword created_date_time: Required.
:paramtype created_date_time: ~datetime.datetime
:keyword expiration_date_time:
:paramtype expiration_date_time: ~datetime.datetime
:keyword job_id: Required.
:paramtype job_id: str
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword errors:
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Error]
:keyword next_link:
:paramtype next_link: str
"""
super(AnalyzeTextJobState, self).__init__(display_name=display_name, created_date_time=created_date_time, expiration_date_time=expiration_date_time, job_id=job_id, last_update_date_time=last_update_date_time, status=status, errors=errors, next_link=next_link, tasks=tasks, statistics=statistics, **kwargs)
self.statistics = statistics
self.tasks = tasks
self.display_name = display_name
self.created_date_time = created_date_time
self.expiration_date_time = expiration_date_time
self.job_id = job_id
self.last_update_date_time = last_update_date_time
self.status = status
self.errors = errors
self.next_link = next_link
class AnalyzeTextKeyPhraseExtractionInput(AnalyzeTextTask):
"""AnalyzeTextKeyPhraseExtractionInput.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis tasks. Constant filled by server.
Possible values include: "SentimentAnalysis", "EntityRecognition", "PiiEntityRecognition",
"KeyPhraseExtraction", "LanguageDetection", "EntityLinking".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskKind
:ivar analysis_input:
:vartype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:ivar parameters: Supported parameters for a Key Phrase Extraction task.
:vartype parameters: ~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'analysis_input': {'key': 'analysisInput', 'type': 'MultiLanguageAnalysisInput'},
'parameters': {'key': 'parameters', 'type': 'KeyPhraseTaskParameters'},
}
def __init__(
self,
*,
analysis_input: Optional["MultiLanguageAnalysisInput"] = None,
parameters: Optional["KeyPhraseTaskParameters"] = None,
**kwargs
):
"""
:keyword analysis_input:
:paramtype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:keyword parameters: Supported parameters for a Key Phrase Extraction task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseTaskParameters
"""
super(AnalyzeTextKeyPhraseExtractionInput, self).__init__(**kwargs)
self.kind = 'KeyPhraseExtraction' # type: str
self.analysis_input = analysis_input
self.parameters = parameters
class AnalyzeTextLanguageDetectionInput(AnalyzeTextTask):
"""AnalyzeTextLanguageDetectionInput.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis tasks. Constant filled by server.
Possible values include: "SentimentAnalysis", "EntityRecognition", "PiiEntityRecognition",
"KeyPhraseExtraction", "LanguageDetection", "EntityLinking".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskKind
:ivar analysis_input:
:vartype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageDetectionAnalysisInput
:ivar parameters: Supported parameters for a Language Detection task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageDetectionTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'analysis_input': {'key': 'analysisInput', 'type': 'LanguageDetectionAnalysisInput'},
'parameters': {'key': 'parameters', 'type': 'LanguageDetectionTaskParameters'},
}
def __init__(
self,
*,
analysis_input: Optional["LanguageDetectionAnalysisInput"] = None,
parameters: Optional["LanguageDetectionTaskParameters"] = None,
**kwargs
):
"""
:keyword analysis_input:
:paramtype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageDetectionAnalysisInput
:keyword parameters: Supported parameters for a Language Detection task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageDetectionTaskParameters
"""
super(AnalyzeTextLanguageDetectionInput, self).__init__(**kwargs)
self.kind = 'LanguageDetection' # type: str
self.analysis_input = analysis_input
self.parameters = parameters
class TaskState(msrest.serialization.Model):
"""TaskState.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
"""
super(TaskState, self).__init__(**kwargs)
self.last_update_date_time = last_update_date_time
self.status = status
class TaskIdentifier(msrest.serialization.Model):
"""Base task object.
:ivar task_name:
:vartype task_name: str
"""
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
"""
super(TaskIdentifier, self).__init__(**kwargs)
self.task_name = task_name
class AnalyzeTextLROResult(TaskIdentifier, TaskState):
"""AnalyzeTextLROResult.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: CustomEntityRecognitionLROResult, CustomMultiLabelClassificationLROResult, CustomSingleLabelClassificationLROResult, EntityLinkingLROResult, EntityRecognitionLROResult, ExtractiveSummarizationLROResult, HealthcareLROResult, KeyPhraseExtractionLROResult, PiiEntityRecognitionLROResult, SentimentLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
}
_subtype_map = {
'kind': {'CustomEntityRecognitionLROResults': 'CustomEntityRecognitionLROResult', 'CustomMultiLabelClassificationLROResults': 'CustomMultiLabelClassificationLROResult', 'CustomSingleLabelClassificationLROResults': 'CustomSingleLabelClassificationLROResult', 'EntityLinkingLROResults': 'EntityLinkingLROResult', 'EntityRecognitionLROResults': 'EntityRecognitionLROResult', 'ExtractiveSummarizationLROResults': 'ExtractiveSummarizationLROResult', 'HealthcareLROResults': 'HealthcareLROResult', 'KeyPhraseExtractionLROResults': 'KeyPhraseExtractionLROResult', 'PiiEntityRecognitionLROResults': 'PiiEntityRecognitionLROResult', 'SentimentAnalysisLROResults': 'SentimentLROResult'}
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
"""
super(AnalyzeTextLROResult, self).__init__(task_name=task_name, last_update_date_time=last_update_date_time, status=status, **kwargs)
self.last_update_date_time = last_update_date_time
self.status = status
self.kind = 'AnalyzeTextLROResult' # type: str
self.task_name = task_name
class AnalyzeTextLROTask(TaskIdentifier):
"""AnalyzeTextLROTask.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: CustomEntitiesLROTask, CustomMultiLabelClassificationLROTask, CustomSingleLabelClassificationLROTask, EntityLinkingLROTask, EntitiesLROTask, ExtractiveSummarizationLROTask, HealthcareLROTask, KeyPhraseLROTask, PiiLROTask, SentimentAnalysisLROTask.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
}
_subtype_map = {
'kind': {'CustomEntityRecognition': 'CustomEntitiesLROTask', 'CustomMultiLabelClassification': 'CustomMultiLabelClassificationLROTask', 'CustomSingleLabelClassification': 'CustomSingleLabelClassificationLROTask', 'EntityLinking': 'EntityLinkingLROTask', 'EntityRecognition': 'EntitiesLROTask', 'ExtractiveSummarization': 'ExtractiveSummarizationLROTask', 'Healthcare': 'HealthcareLROTask', 'KeyPhraseExtraction': 'KeyPhraseLROTask', 'PiiEntityRecognition': 'PiiLROTask', 'SentimentAnalysis': 'SentimentAnalysisLROTask'}
}
def __init__(
self,
*,
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
"""
super(AnalyzeTextLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'AnalyzeTextLROTask' # type: str
class AnalyzeTextPiiEntitiesRecognitionInput(AnalyzeTextTask):
"""AnalyzeTextPiiEntitiesRecognitionInput.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis tasks. Constant filled by server.
Possible values include: "SentimentAnalysis", "EntityRecognition", "PiiEntityRecognition",
"KeyPhraseExtraction", "LanguageDetection", "EntityLinking".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskKind
:ivar analysis_input:
:vartype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:ivar parameters: Supported parameters for a PII Entities Recognition task.
:vartype parameters: ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'analysis_input': {'key': 'analysisInput', 'type': 'MultiLanguageAnalysisInput'},
'parameters': {'key': 'parameters', 'type': 'PiiTaskParameters'},
}
def __init__(
self,
*,
analysis_input: Optional["MultiLanguageAnalysisInput"] = None,
parameters: Optional["PiiTaskParameters"] = None,
**kwargs
):
"""
:keyword analysis_input:
:paramtype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:keyword parameters: Supported parameters for a PII Entities Recognition task.
:paramtype parameters: ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiTaskParameters
"""
super(AnalyzeTextPiiEntitiesRecognitionInput, self).__init__(**kwargs)
self.kind = 'PiiEntityRecognition' # type: str
self.analysis_input = analysis_input
self.parameters = parameters
class AnalyzeTextSentimentAnalysisInput(AnalyzeTextTask):
"""AnalyzeTextSentimentAnalysisInput.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis tasks. Constant filled by server.
Possible values include: "SentimentAnalysis", "EntityRecognition", "PiiEntityRecognition",
"KeyPhraseExtraction", "LanguageDetection", "EntityLinking".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskKind
:ivar analysis_input:
:vartype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:ivar parameters: Supported parameters for a Sentiment Analysis task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentAnalysisTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'analysis_input': {'key': 'analysisInput', 'type': 'MultiLanguageAnalysisInput'},
'parameters': {'key': 'parameters', 'type': 'SentimentAnalysisTaskParameters'},
}
def __init__(
self,
*,
analysis_input: Optional["MultiLanguageAnalysisInput"] = None,
parameters: Optional["SentimentAnalysisTaskParameters"] = None,
**kwargs
):
"""
:keyword analysis_input:
:paramtype analysis_input:
~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageAnalysisInput
:keyword parameters: Supported parameters for a Sentiment Analysis task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentAnalysisTaskParameters
"""
super(AnalyzeTextSentimentAnalysisInput, self).__init__(**kwargs)
self.kind = 'SentimentAnalysis' # type: str
self.analysis_input = analysis_input
self.parameters = parameters
class AnalyzeTextTaskResult(msrest.serialization.Model):
"""AnalyzeTextTaskResult.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: EntityLinkingTaskResult, EntitiesTaskResult, KeyPhraseTaskResult, LanguageDetectionTaskResult, PiiTaskResult, SentimentTaskResult.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis task results. Constant filled by
server. Possible values include: "SentimentAnalysisResults", "EntityRecognitionResults",
"PiiEntityRecognitionResults", "KeyPhraseExtractionResults", "LanguageDetectionResults",
"EntityLinkingResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskResultsKind
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
}
_subtype_map = {
'kind': {'EntityLinkingResults': 'EntityLinkingTaskResult', 'EntityRecognitionResults': 'EntitiesTaskResult', 'KeyPhraseExtractionResults': 'KeyPhraseTaskResult', 'LanguageDetectionResults': 'LanguageDetectionTaskResult', 'PiiEntityRecognitionResults': 'PiiTaskResult', 'SentimentAnalysisResults': 'SentimentTaskResult'}
}
def __init__(
self,
**kwargs
):
"""
"""
super(AnalyzeTextTaskResult, self).__init__(**kwargs)
self.kind = None # type: Optional[str]
class ClassificationResult(msrest.serialization.Model):
"""ClassificationResult.
All required parameters must be populated in order to send to Azure.
:ivar category: Required. Classification type.
:vartype category: str
:ivar confidence_score: Required. Confidence score between 0 and 1 of the recognized class.
:vartype confidence_score: float
"""
_validation = {
'category': {'required': True},
'confidence_score': {'required': True},
}
_attribute_map = {
'category': {'key': 'category', 'type': 'str'},
'confidence_score': {'key': 'confidenceScore', 'type': 'float'},
}
def __init__(
self,
*,
category: str,
confidence_score: float,
**kwargs
):
"""
:keyword category: Required. Classification type.
:paramtype category: str
:keyword confidence_score: Required. Confidence score between 0 and 1 of the recognized class.
:paramtype confidence_score: float
"""
super(ClassificationResult, self).__init__(**kwargs)
self.category = category
self.confidence_score = confidence_score
class CustomEntitiesLROTask(AnalyzeTextLROTask):
"""Use custom models to ease the process of information extraction from unstructured documents like contracts or financial documents.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for a Custom Entities task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomEntitiesTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'CustomEntitiesTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["CustomEntitiesTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for a Custom Entities task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomEntitiesTaskParameters
"""
super(CustomEntitiesLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'CustomEntityRecognition' # type: str
self.parameters = parameters
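A sketch of the JSON task object this model serializes to, following its `_attribute_map` (`taskName`, `kind`, `parameters`); built with plain dicts rather than msrest, and the project/deployment names below are made-up placeholders:

```python
# Illustrative only: the wire shape of a CustomEntitiesLROTask. The 'kind'
# discriminator is the constant set in __init__, not caller-supplied.
task = {
    "taskName": "my-entities-task",        # optional task_name
    "kind": "CustomEntityRecognition",     # constant filled in __init__
    "parameters": {                        # CustomEntitiesTaskParameters
        "loggingOptOut": False,
        "project-name": "my-project",      # placeholder value
        "deployment-name": "my-deployment",  # placeholder value
    },
}
```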
class CustomResult(msrest.serialization.Model):
"""CustomResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar project_name: Required. This field indicates the project name for the model.
:vartype project_name: str
:ivar deployment_name: Required. This field indicates the deployment name for the model.
:vartype deployment_name: str
"""
_validation = {
'errors': {'required': True},
'project_name': {'required': True},
'deployment_name': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'project_name': {'key': 'projectName', 'type': 'str'},
'deployment_name': {'key': 'deploymentName', 'type': 'str'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
project_name: str,
deployment_name: str,
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword project_name: Required. This field indicates the project name for the model.
:paramtype project_name: str
:keyword deployment_name: Required. This field indicates the deployment name for the model.
:paramtype deployment_name: str
"""
super(CustomResult, self).__init__(**kwargs)
self.errors = errors
self.statistics = statistics
self.project_name = project_name
self.deployment_name = deployment_name
class CustomEntitiesResult(CustomResult):
"""CustomEntitiesResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar project_name: Required. This field indicates the project name for the model.
:vartype project_name: str
:ivar deployment_name: Required. This field indicates the deployment name for the model.
:vartype deployment_name: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.CustomEntitiesResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'project_name': {'required': True},
'deployment_name': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'project_name': {'key': 'projectName', 'type': 'str'},
'deployment_name': {'key': 'deploymentName', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[CustomEntitiesResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
project_name: str,
deployment_name: str,
documents: List["CustomEntitiesResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword project_name: Required. This field indicates the project name for the model.
:paramtype project_name: str
:keyword deployment_name: Required. This field indicates the deployment name for the model.
:paramtype deployment_name: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.CustomEntitiesResultDocumentsItem]
"""
super(CustomEntitiesResult, self).__init__(errors=errors, statistics=statistics, project_name=project_name, deployment_name=deployment_name, **kwargs)
self.documents = documents
class DocumentResult(msrest.serialization.Model):
"""DocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing the document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing the document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
"""
super(DocumentResult, self).__init__(**kwargs)
self.id = id
self.warnings = warnings
self.statistics = statistics
class EntitiesDocumentResult(DocumentResult):
"""EntitiesDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing the document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar entities: Required. Recognized entities in the document.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'entities': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'entities': {'key': 'entities', 'type': '[Entity]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
entities: List["Entity"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing the document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword entities: Required. Recognized entities in the document.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
super(EntitiesDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.entities = entities
class CustomEntitiesResultDocumentsItem(EntitiesDocumentResult):
"""CustomEntitiesResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing the document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar entities: Required. Recognized entities in the document.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'entities': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'entities': {'key': 'entities', 'type': '[Entity]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
entities: List["Entity"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing the document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword entities: Required. Recognized entities in the document.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
super(CustomEntitiesResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, entities=entities, **kwargs)
class TaskParameters(msrest.serialization.Model):
"""Base parameters object for a text analysis task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
"""
super(TaskParameters, self).__init__(**kwargs)
self.logging_opt_out = logging_opt_out
class CustomTaskParameters(TaskParameters):
"""Parameters object for a text analysis task using custom models.
All required parameters must be populated in order to send to Azure.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar project_name: Required.
:vartype project_name: str
:ivar deployment_name: Required.
:vartype deployment_name: str
"""
_validation = {
'project_name': {'required': True},
'deployment_name': {'required': True},
}
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'project_name': {'key': 'project-name', 'type': 'str'},
'deployment_name': {'key': 'deployment-name', 'type': 'str'},
}
def __init__(
self,
*,
project_name: str,
deployment_name: str,
logging_opt_out: Optional[bool] = False,
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword project_name: Required.
:paramtype project_name: str
:keyword deployment_name: Required.
:paramtype deployment_name: str
"""
super(CustomTaskParameters, self).__init__(logging_opt_out=logging_opt_out, **kwargs)
self.project_name = project_name
self.deployment_name = deployment_name
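Note that `CustomTaskParameters` maps its required attributes to hyphenated wire keys (`project-name`, `deployment-name`), unlike the camelCase keys used elsewhere in this API version. A plain-dict sketch of that mapping (the function name is hypothetical):

```python
# Illustrative only: mirrors CustomTaskParameters._attribute_map, where the
# wire keys are hyphenated rather than camelCase.
def custom_task_parameters_to_wire(project_name, deployment_name, logging_opt_out=False):
    return {
        "loggingOptOut": logging_opt_out,
        "project-name": project_name,        # note: hyphen, not 'projectName'
        "deployment-name": deployment_name,  # note: hyphen, not 'deploymentName'
    }

params = custom_task_parameters_to_wire("proj", "prod")
```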
class CustomEntitiesTaskParameters(CustomTaskParameters):
"""Supported parameters for a Custom Entities task.
All required parameters must be populated in order to send to Azure.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar project_name: Required.
:vartype project_name: str
:ivar deployment_name: Required.
:vartype deployment_name: str
:ivar string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:vartype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
_validation = {
'project_name': {'required': True},
'deployment_name': {'required': True},
}
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'project_name': {'key': 'project-name', 'type': 'str'},
'deployment_name': {'key': 'deployment-name', 'type': 'str'},
'string_index_type': {'key': 'stringIndexType', 'type': 'str'},
}
def __init__(
self,
*,
project_name: str,
deployment_name: str,
logging_opt_out: Optional[bool] = False,
string_index_type: Optional[Union[str, "StringIndexType"]] = "TextElements_v8",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword project_name: Required.
:paramtype project_name: str
:keyword deployment_name: Required.
:paramtype deployment_name: str
:keyword string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:paramtype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
super(CustomEntitiesTaskParameters, self).__init__(logging_opt_out=logging_opt_out, project_name=project_name, deployment_name=deployment_name, **kwargs)
self.string_index_type = string_index_type
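To see why `string_index_type` matters, note that the same text produces different entity offsets depending on how "characters" are counted. A small sketch contrasting Unicode code points with UTF-16 code units (grapheme counting, the `TextElements_v8` default, requires a grapheme-segmentation library and is omitted here):

```python
# Illustrative only: offset of 'b' differs between the UnicodeCodePoint and
# Utf16CodeUnit interpretations because U+1F4C4 is outside the BMP and
# occupies two UTF-16 code units.
s = "a\U0001F4C4b"  # 'a', then U+1F4C4 (a non-BMP emoji), then 'b'

codepoint_index_of_b = s.index("b")                                # UnicodeCodePoint
utf16_units = len(s.encode("utf-16-le")) // 2                      # total UTF-16 code units
utf16_index_of_b = len(s[:s.index("b")].encode("utf-16-le")) // 2  # Utf16CodeUnit
```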
class CustomEntityRecognitionLROResult(AnalyzeTextLROResult):
"""CustomEntityRecognitionLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.CustomEntitiesResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'CustomEntitiesResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "CustomEntitiesResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.CustomEntitiesResult
"""
super(CustomEntityRecognitionLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'CustomEntityRecognitionLROResults' # type: str
self.results = results
class CustomMultiLabelClassificationLROResult(AnalyzeTextLROResult):
"""CustomMultiLabelClassificationLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomMultiLabelClassificationResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'CustomMultiLabelClassificationResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "CustomMultiLabelClassificationResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomMultiLabelClassificationResult
"""
super(CustomMultiLabelClassificationLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'CustomMultiLabelClassificationLROResults' # type: str
self.results = results
class CustomMultiLabelClassificationLROTask(AnalyzeTextLROTask):
"""Use custom models to classify text into a multi-label taxonomy.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for a Custom Multi Classification task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomMultiLabelClassificationTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'CustomMultiLabelClassificationTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["CustomMultiLabelClassificationTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for a Custom Multi Classification task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomMultiLabelClassificationTaskParameters
"""
super(CustomMultiLabelClassificationLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'CustomMultiLabelClassification' # type: str
self.parameters = parameters
class CustomMultiLabelClassificationResult(CustomResult):
"""CustomMultiLabelClassificationResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar project_name: Required. This field indicates the project name for the model.
:vartype project_name: str
:ivar deployment_name: Required. This field indicates the deployment name for the model.
:vartype deployment_name: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.CustomMultiLabelClassificationResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'project_name': {'required': True},
'deployment_name': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'project_name': {'key': 'projectName', 'type': 'str'},
'deployment_name': {'key': 'deploymentName', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[CustomMultiLabelClassificationResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
project_name: str,
deployment_name: str,
documents: List["CustomMultiLabelClassificationResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword project_name: Required. This field indicates the project name for the model.
:paramtype project_name: str
:keyword deployment_name: Required. This field indicates the deployment name for the model.
:paramtype deployment_name: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.CustomMultiLabelClassificationResultDocumentsItem]
"""
super(CustomMultiLabelClassificationResult, self).__init__(errors=errors, statistics=statistics, project_name=project_name, deployment_name=deployment_name, **kwargs)
self.documents = documents
class MultiClassificationDocumentResult(DocumentResult):
"""MultiClassificationDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing the document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar class_property: Required.
:vartype class_property:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ClassificationResult]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'class_property': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'class_property': {'key': 'class', 'type': '[ClassificationResult]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
class_property: List["ClassificationResult"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing the document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword class_property: Required.
:paramtype class_property:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ClassificationResult]
"""
super(MultiClassificationDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.class_property = class_property
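The wire key here is `class`, which is a reserved word in Python, so the generator exposes it as `class_property` (see the `_attribute_map` above). A plain-dict sketch of the document-result shape, with made-up values:

```python
# Illustrative only: the JSON shape of a multi-classification document result.
# The 'class' array deserializes into the model's .class_property attribute.
doc = {
    "id": "1",
    "warnings": [],
    "class": [  # wire key 'class' <-> attribute 'class_property'
        {"category": "Sports", "confidenceScore": 0.95},
    ],
}
```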
class CustomMultiLabelClassificationResultDocumentsItem(MultiClassificationDocumentResult):
"""CustomMultiLabelClassificationResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing the document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar class_property: Required.
:vartype class_property:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ClassificationResult]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'class_property': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'class_property': {'key': 'class', 'type': '[ClassificationResult]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
class_property: List["ClassificationResult"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing the document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword class_property: Required.
:paramtype class_property:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ClassificationResult]
"""
super(CustomMultiLabelClassificationResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, class_property=class_property, **kwargs)
class CustomMultiLabelClassificationTaskParameters(CustomTaskParameters):
"""Supported parameters for a Custom Multi Classification task.
All required parameters must be populated in order to send to Azure.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar project_name: Required.
:vartype project_name: str
:ivar deployment_name: Required.
:vartype deployment_name: str
"""
_validation = {
'project_name': {'required': True},
'deployment_name': {'required': True},
}
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'project_name': {'key': 'project-name', 'type': 'str'},
'deployment_name': {'key': 'deployment-name', 'type': 'str'},
}
def __init__(
self,
*,
project_name: str,
deployment_name: str,
logging_opt_out: Optional[bool] = False,
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword project_name: Required.
:paramtype project_name: str
:keyword deployment_name: Required.
:paramtype deployment_name: str
"""
super(CustomMultiLabelClassificationTaskParameters, self).__init__(logging_opt_out=logging_opt_out, project_name=project_name, deployment_name=deployment_name, **kwargs)
class CustomSingleLabelClassificationLROResult(AnalyzeTextLROResult):
"""CustomSingleLabelClassificationLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomSingleLabelClassificationResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'CustomSingleLabelClassificationResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "CustomSingleLabelClassificationResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomSingleLabelClassificationResult
"""
super(CustomSingleLabelClassificationLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'CustomSingleLabelClassificationLROResults' # type: str
self.results = results
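# The following standalone sketch (illustrative names and plain dicts -- not
# the actual msrest machinery) shows the pattern behind the "kind" field each
# LRO result subclass pins in __init__: a deserializer can dispatch on that
# discriminator to pick the concrete model class for a polymorphic payload.
_KIND_REGISTRY = {}

def _register_kind(kind):
    """Associate a handler class with a 'kind' discriminator value."""
    def deco(cls):
        _KIND_REGISTRY[kind] = cls
        return cls
    return deco

@_register_kind("CustomSingleLabelClassificationLROResults")
class SingleLabelResult:
    """Toy stand-in for CustomSingleLabelClassificationLROResult."""
    def __init__(self, payload):
        self.results = payload["results"]

def _resolve_lro_result(payload):
    """Pick the concrete class from the payload's 'kind' field."""
    return _KIND_REGISTRY[payload["kind"]](payload)

resolved = _resolve_lro_result({
    "kind": "CustomSingleLabelClassificationLROResults",
    "results": {"projectName": "demo"},
})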
class CustomSingleLabelClassificationLROTask(AnalyzeTextLROTask):
"""Use custom models to classify text into single label taxonomy.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for a Custom Single Classification task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomSingleLabelClassificationTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'CustomSingleLabelClassificationTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["CustomSingleLabelClassificationTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for a Custom Single Classification task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.CustomSingleLabelClassificationTaskParameters
"""
super(CustomSingleLabelClassificationLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'CustomSingleLabelClassification' # type: str
self.parameters = parameters
class CustomSingleLabelClassificationResult(CustomResult):
"""CustomSingleLabelClassificationResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: if showStats=true was specified in the request this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar project_name: Required. This field indicates the project name for the model.
:vartype project_name: str
:ivar deployment_name: Required. This field indicates the deployment name for the model.
:vartype deployment_name: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.CustomSingleLabelClassificationResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'project_name': {'required': True},
'deployment_name': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'project_name': {'key': 'projectName', 'type': 'str'},
'deployment_name': {'key': 'deploymentName', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[CustomSingleLabelClassificationResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
project_name: str,
deployment_name: str,
documents: List["CustomSingleLabelClassificationResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: if showStats=true was specified in the request this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword project_name: Required. This field indicates the project name for the model.
:paramtype project_name: str
:keyword deployment_name: Required. This field indicates the deployment name for the model.
:paramtype deployment_name: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.CustomSingleLabelClassificationResultDocumentsItem]
"""
super(CustomSingleLabelClassificationResult, self).__init__(errors=errors, statistics=statistics, project_name=project_name, deployment_name=deployment_name, **kwargs)
self.documents = documents
class SingleClassificationDocumentResult(DocumentResult):
"""SingleClassificationDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: if showStats=true was specified in the request this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar class_property: Required.
:vartype class_property:
~azure.ai.textanalytics.v2022_03_01_preview.models.ClassificationResult
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'class_property': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'class_property': {'key': 'class', 'type': 'ClassificationResult'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
class_property: "ClassificationResult",
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: if showStats=true was specified in the request this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword class_property: Required.
:paramtype class_property:
~azure.ai.textanalytics.v2022_03_01_preview.models.ClassificationResult
"""
super(SingleClassificationDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.class_property = class_property
class CustomSingleLabelClassificationResultDocumentsItem(SingleClassificationDocumentResult):
"""CustomSingleLabelClassificationResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: if showStats=true was specified in the request this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar class_property: Required.
:vartype class_property:
~azure.ai.textanalytics.v2022_03_01_preview.models.ClassificationResult
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'class_property': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'class_property': {'key': 'class', 'type': 'ClassificationResult'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
class_property: "ClassificationResult",
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: if showStats=true was specified in the request this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword class_property: Required.
:paramtype class_property:
~azure.ai.textanalytics.v2022_03_01_preview.models.ClassificationResult
"""
super(CustomSingleLabelClassificationResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, class_property=class_property, **kwargs)
class CustomSingleLabelClassificationTaskParameters(CustomTaskParameters):
"""Supported parameters for a Custom Single Classification task.
All required parameters must be populated in order to send to Azure.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar project_name: Required.
:vartype project_name: str
:ivar deployment_name: Required.
:vartype deployment_name: str
"""
_validation = {
'project_name': {'required': True},
'deployment_name': {'required': True},
}
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'project_name': {'key': 'project-name', 'type': 'str'},
'deployment_name': {'key': 'deployment-name', 'type': 'str'},
}
def __init__(
self,
*,
project_name: str,
deployment_name: str,
logging_opt_out: Optional[bool] = False,
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword project_name: Required.
:paramtype project_name: str
:keyword deployment_name: Required.
:paramtype deployment_name: str
"""
super(CustomSingleLabelClassificationTaskParameters, self).__init__(logging_opt_out=logging_opt_out, project_name=project_name, deployment_name=deployment_name, **kwargs)
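# Standalone sketch (plain dicts, not the msrest serializer) of how an
# _attribute_map drives serialization: snake_case Python attributes are
# renamed to the wire keys the service expects. Note that this model uses
# hyphenated wire keys ('project-name', 'deployment-name'), unlike the
# camelCase keys used by most other models in this file.
EXAMPLE_ATTRIBUTE_MAP = {
    'logging_opt_out': 'loggingOptOut',
    'project_name': 'project-name',
    'deployment_name': 'deployment-name',
}

def to_wire(attrs, attribute_map):
    """Rename attributes to their wire keys, skipping unset (None) values."""
    return {wire: attrs[attr]
            for attr, wire in attribute_map.items()
            if attrs.get(attr) is not None}

wire_body = to_wire(
    {'logging_opt_out': False, 'project_name': 'myProject',
     'deployment_name': 'prod'},
    EXAMPLE_ATTRIBUTE_MAP,
)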
class DetectedLanguage(msrest.serialization.Model):
"""DetectedLanguage.
All required parameters must be populated in order to send to Azure.
:ivar name: Required. Long name of a detected language (e.g. English, French).
:vartype name: str
:ivar iso6391_name: Required. A two letter representation of the detected language according to
the ISO 639-1 standard (e.g. en, fr).
:vartype iso6391_name: str
:ivar confidence_score: Required. A confidence score between 0 and 1. Scores close to 1
indicate 100% certainty that the identified language is correct.
:vartype confidence_score: float
"""
_validation = {
'name': {'required': True},
'iso6391_name': {'required': True},
'confidence_score': {'required': True},
}
_attribute_map = {
'name': {'key': 'name', 'type': 'str'},
'iso6391_name': {'key': 'iso6391Name', 'type': 'str'},
'confidence_score': {'key': 'confidenceScore', 'type': 'float'},
}
def __init__(
self,
*,
name: str,
iso6391_name: str,
confidence_score: float,
**kwargs
):
"""
:keyword name: Required. Long name of a detected language (e.g. English, French).
:paramtype name: str
:keyword iso6391_name: Required. A two letter representation of the detected language according
to the ISO 639-1 standard (e.g. en, fr).
:paramtype iso6391_name: str
:keyword confidence_score: Required. A confidence score between 0 and 1. Scores close to 1
indicate 100% certainty that the identified language is correct.
:paramtype confidence_score: float
"""
super(DetectedLanguage, self).__init__(**kwargs)
self.name = name
self.iso6391_name = iso6391_name
self.confidence_score = confidence_score
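# Standalone sketch of the reverse direction: mapping a detectedLanguage JSON
# fragment back to snake_case attributes while enforcing the 'required' flags
# from _validation. Plain dicts stand in for the msrest deserializer; the
# helper names here are illustrative, not part of the SDK.
WIRE_TO_ATTR = {
    'name': 'name',
    'iso6391Name': 'iso6391_name',
    'confidenceScore': 'confidence_score',
}
REQUIRED_FIELDS = {'name', 'iso6391_name', 'confidence_score'}

def from_wire(payload):
    """Map wire keys to attribute names, raising if a required field is absent."""
    attrs = {attr: payload[wire]
             for wire, attr in WIRE_TO_ATTR.items() if wire in payload}
    missing = REQUIRED_FIELDS - attrs.keys()
    if missing:
        raise ValueError("missing required fields: %s" % sorted(missing))
    return attrs

detected = from_wire(
    {'name': 'English', 'iso6391Name': 'en', 'confidenceScore': 0.99})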
class DocumentError(msrest.serialization.Model):
"""DocumentError.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Document Id.
:vartype id: str
:ivar error: Required. Document Error.
:vartype error: ~azure.ai.textanalytics.v2022_03_01_preview.models.Error
"""
_validation = {
'id': {'required': True},
'error': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'error': {'key': 'error', 'type': 'Error'},
}
def __init__(
self,
*,
id: str,
error: "Error",
**kwargs
):
"""
:keyword id: Required. Document Id.
:paramtype id: str
:keyword error: Required. Document Error.
:paramtype error: ~azure.ai.textanalytics.v2022_03_01_preview.models.Error
"""
super(DocumentError, self).__init__(**kwargs)
self.id = id
self.error = error
class DocumentStatistics(msrest.serialization.Model):
"""if showStats=true was specified in the request this field will contain information about the document payload.
All required parameters must be populated in order to send to Azure.
:ivar characters_count: Required. Number of text elements recognized in the document.
:vartype characters_count: int
:ivar transactions_count: Required. Number of transactions for the document.
:vartype transactions_count: int
"""
_validation = {
'characters_count': {'required': True},
'transactions_count': {'required': True},
}
_attribute_map = {
'characters_count': {'key': 'charactersCount', 'type': 'int'},
'transactions_count': {'key': 'transactionsCount', 'type': 'int'},
}
def __init__(
self,
*,
characters_count: int,
transactions_count: int,
**kwargs
):
"""
:keyword characters_count: Required. Number of text elements recognized in the document.
:paramtype characters_count: int
:keyword transactions_count: Required. Number of transactions for the document.
:paramtype transactions_count: int
"""
super(DocumentStatistics, self).__init__(**kwargs)
self.characters_count = characters_count
self.transactions_count = transactions_count
class DocumentWarning(msrest.serialization.Model):
"""DocumentWarning.
All required parameters must be populated in order to send to Azure.
:ivar code: Required. Error code. Possible values include: "LongWordsInDocument",
"DocumentTruncated".
:vartype code: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.WarningCodeValue
:ivar message: Required. Warning message.
:vartype message: str
:ivar target_ref: A JSON pointer reference indicating the target object.
:vartype target_ref: str
"""
_validation = {
'code': {'required': True},
'message': {'required': True},
}
_attribute_map = {
'code': {'key': 'code', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
'target_ref': {'key': 'targetRef', 'type': 'str'},
}
def __init__(
self,
*,
code: Union[str, "WarningCodeValue"],
message: str,
target_ref: Optional[str] = None,
**kwargs
):
"""
:keyword code: Required. Error code. Possible values include: "LongWordsInDocument",
"DocumentTruncated".
:paramtype code: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.WarningCodeValue
:keyword message: Required. Warning message.
:paramtype message: str
:keyword target_ref: A JSON pointer reference indicating the target object.
:paramtype target_ref: str
"""
super(DocumentWarning, self).__init__(**kwargs)
self.code = code
self.message = message
self.target_ref = target_ref
class EntitiesLROTask(AnalyzeTextLROTask):
"""An object representing the task definition for an Entities Recognition task.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for an Entity Recognition task.
:vartype parameters: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'EntitiesTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["EntitiesTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for an Entity Recognition task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesTaskParameters
"""
super(EntitiesLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'EntityRecognition' # type: str
self.parameters = parameters
class PreBuiltResult(msrest.serialization.Model):
"""PreBuiltResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: if showStats=true was specified in the request this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: if showStats=true was specified in the request this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
"""
super(PreBuiltResult, self).__init__(**kwargs)
self.errors = errors
self.statistics = statistics
self.model_version = model_version
class EntitiesResult(PreBuiltResult):
"""EntitiesResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: if showStats=true was specified in the request this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[EntitiesResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
documents: List["EntitiesResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: if showStats=true was specified in the request this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesResultDocumentsItem]
"""
super(EntitiesResult, self).__init__(errors=errors, statistics=statistics, model_version=model_version, **kwargs)
self.documents = documents
class EntitiesResultDocumentsItem(EntitiesDocumentResult):
"""EntitiesResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: if showStats=true was specified in the request this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar entities: Required. Recognized entities in the document.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'entities': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'entities': {'key': 'entities', 'type': '[Entity]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
entities: List["Entity"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: if showStats=true was specified in the request this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword entities: Required. Recognized entities in the document.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
super(EntitiesResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, entities=entities, **kwargs)
class PreBuiltTaskParameters(TaskParameters):
"""Parameters object for a text analysis task using pre-built models.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
"""
super(PreBuiltTaskParameters, self).__init__(logging_opt_out=logging_opt_out, **kwargs)
self.model_version = model_version
class EntitiesTaskParameters(PreBuiltTaskParameters):
"""Supported parameters for an Entity Recognition task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
:ivar string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:vartype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'string_index_type': {'key': 'stringIndexType', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
string_index_type: Optional[Union[str, "StringIndexType"]] = "TextElements_v8",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
:keyword string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:paramtype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
super(EntitiesTaskParameters, self).__init__(logging_opt_out=logging_opt_out, model_version=model_version, **kwargs)
self.string_index_type = string_index_type
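# Why stringIndexType matters, as a standalone sketch: offsets into the same
# text differ depending on the counting unit. Python string indices count
# Unicode code points (roughly "UnicodeCodePoint"), while .NET/Java/JS count
# UTF-16 code units ("Utf16CodeUnit"); a character outside the Basic
# Multilingual Plane, like the pizza emoji below, takes one code point but
# two UTF-16 code units. The sample text is illustrative.
sample_text = "I love \U0001F355 pizza"
entity_text = "pizza"

# Offset in code points -- what Python slicing uses directly.
cp_offset = sample_text.index(entity_text)

# Offset in UTF-16 code units: the emoji needs a surrogate pair, so every
# position after it shifts by one extra unit.
utf16_offset = len(sample_text[:cp_offset].encode("utf-16-le")) // 2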
class EntitiesTaskResult(AnalyzeTextTaskResult):
"""EntitiesTaskResult.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis task results. Constant filled by
server. Possible values include: "SentimentAnalysisResults", "EntityRecognitionResults",
"PiiEntityRecognitionResults", "KeyPhraseExtractionResults", "LanguageDetectionResults",
"EntityLinkingResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesResult
"""
_validation = {
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'EntitiesResult'},
}
def __init__(
self,
*,
results: "EntitiesResult",
**kwargs
):
"""
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesResult
"""
super(EntitiesTaskResult, self).__init__(**kwargs)
self.kind = 'EntityRecognitionResults' # type: str
self.results = results
class Entity(msrest.serialization.Model):
"""Entity.
All required parameters must be populated in order to send to Azure.
:ivar text: Required. Entity text as it appears in the request.
:vartype text: str
:ivar category: Required. Entity type.
:vartype category: str
:ivar subcategory: (Optional) Entity subtype.
:vartype subcategory: str
:ivar offset: Required. Start position for the entity text. Use of different 'stringIndexType'
values can affect the offset returned.
:vartype offset: int
:ivar length: Required. Length for the entity text. Use of different 'stringIndexType' values
can affect the length returned.
:vartype length: int
:ivar confidence_score: Required. Confidence score between 0 and 1 of the extracted entity.
:vartype confidence_score: float
"""
_validation = {
'text': {'required': True},
'category': {'required': True},
'offset': {'required': True},
'length': {'required': True},
'confidence_score': {'required': True},
}
_attribute_map = {
'text': {'key': 'text', 'type': 'str'},
'category': {'key': 'category', 'type': 'str'},
'subcategory': {'key': 'subcategory', 'type': 'str'},
'offset': {'key': 'offset', 'type': 'int'},
'length': {'key': 'length', 'type': 'int'},
'confidence_score': {'key': 'confidenceScore', 'type': 'float'},
}
def __init__(
self,
*,
text: str,
category: str,
offset: int,
length: int,
confidence_score: float,
subcategory: Optional[str] = None,
**kwargs
):
"""
:keyword text: Required. Entity text as it appears in the request.
:paramtype text: str
:keyword category: Required. Entity type.
:paramtype category: str
:keyword subcategory: (Optional) Entity subtype.
:paramtype subcategory: str
:keyword offset: Required. Start position for the entity text. Use of different
'stringIndexType' values can affect the offset returned.
:paramtype offset: int
:keyword length: Required. Length for the entity text. Use of different 'stringIndexType'
values can affect the length returned.
:paramtype length: int
:keyword confidence_score: Required. Confidence score between 0 and 1 of the extracted entity.
:paramtype confidence_score: float
"""
super(Entity, self).__init__(**kwargs)
self.text = text
self.category = category
self.subcategory = subcategory
self.offset = offset
self.length = length
self.confidence_score = confidence_score
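# Standalone sketch of consuming an Entity payload's offset/length pair: under
# the assumption that the request used stringIndexType="UnicodeCodePoint", the
# offsets line up with Python's code-point string indices, so plain slicing
# recovers the entity text. The document and entity values are made up.
sample_document = "Bill Gates co-founded Microsoft."
sample_entity = {
    "text": "Microsoft",
    "category": "Organization",
    "offset": 22,
    "length": 9,
    "confidenceScore": 0.99,
}

entity_span = sample_document[
    sample_entity["offset"]:sample_entity["offset"] + sample_entity["length"]]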
class EntityLinkingLROResult(AnalyzeTextLROResult):
"""EntityLinkingLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'EntityLinkingResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "EntityLinkingResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingResult
"""
super(EntityLinkingLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'EntityLinkingLROResults' # type: str
self.results = results
class EntityLinkingLROTask(AnalyzeTextLROTask):
"""An object representing the task definition for an Entity Linking task.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for an Entity Linking task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'EntityLinkingTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["EntityLinkingTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for an Entity Linking task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingTaskParameters
"""
super(EntityLinkingLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'EntityLinking' # type: str
self.parameters = parameters
class EntityLinkingResult(PreBuiltResult):
"""EntityLinkingResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: if showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[EntityLinkingResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
documents: List["EntityLinkingResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: if showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingResultDocumentsItem]
"""
super(EntityLinkingResult, self).__init__(errors=errors, statistics=statistics, model_version=model_version, **kwargs)
self.documents = documents
class LinkedEntitiesDocumentResult(DocumentResult):
"""LinkedEntitiesDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: if showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar entities: Required. Recognized well-known entities in the document.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.LinkedEntity]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'entities': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'entities': {'key': 'entities', 'type': '[LinkedEntity]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
entities: List["LinkedEntity"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: if showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword entities: Required. Recognized well-known entities in the document.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.LinkedEntity]
"""
super(LinkedEntitiesDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.entities = entities
class EntityLinkingResultDocumentsItem(LinkedEntitiesDocumentResult):
"""EntityLinkingResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: if showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar entities: Required. Recognized well-known entities in the document.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.LinkedEntity]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'entities': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'entities': {'key': 'entities', 'type': '[LinkedEntity]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
entities: List["LinkedEntity"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: if showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword entities: Required. Recognized well-known entities in the document.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.LinkedEntity]
"""
super(EntityLinkingResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, entities=entities, **kwargs)
class EntityLinkingTaskParameters(PreBuiltTaskParameters):
"""Supported parameters for an Entity Linking task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
:ivar string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:vartype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'string_index_type': {'key': 'stringIndexType', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
string_index_type: Optional[Union[str, "StringIndexType"]] = "TextElements_v8",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
:keyword string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:paramtype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
super(EntityLinkingTaskParameters, self).__init__(logging_opt_out=logging_opt_out, model_version=model_version, **kwargs)
self.string_index_type = string_index_type
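# string_index_type changes how the service reports character offsets. The sketch
# below illustrates the difference between the "UnicodeCodePoint" and "Utf16CodeUnit"
# schemes; Python strings index by code point, and "TextElements_v8" grapheme counting
# would need a Unicode segmentation library, so it is omitted here.

```python
def offsets_of(text: str, target: str) -> dict:
    """Offset of `target` in `text` under two of the supported index schemes."""
    prefix = text[: text.index(target)]  # Python indexes by Unicode code point
    return {
        "UnicodeCodePoint": len(prefix),
        # A character outside the Basic Multilingual Plane (like this emoji)
        # takes two UTF-16 code units, so UTF-16 offsets can run ahead.
        "Utf16CodeUnit": len(prefix.encode("utf-16-le")) // 2,
    }

result = offsets_of("Hello 🌍 world", "world")
# The emoji is one code point but two UTF-16 code units, so the offsets differ by one.
```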
class EntityLinkingTaskResult(AnalyzeTextTaskResult):
"""EntityLinkingTaskResult.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis task results. Constant filled by
server. Possible values include: "SentimentAnalysisResults", "EntityRecognitionResults",
"PiiEntityRecognitionResults", "KeyPhraseExtractionResults", "LanguageDetectionResults",
"EntityLinkingResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingResult
"""
_validation = {
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'EntityLinkingResult'},
}
def __init__(
self,
*,
results: "EntityLinkingResult",
**kwargs
):
"""
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntityLinkingResult
"""
super(EntityLinkingTaskResult, self).__init__(**kwargs)
self.kind = 'EntityLinkingResults' # type: str
self.results = results
class EntityRecognitionLROResult(AnalyzeTextLROResult):
"""EntityRecognitionLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'EntitiesResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "EntitiesResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.EntitiesResult
"""
super(EntityRecognitionLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'EntityRecognitionLROResults' # type: str
self.results = results
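# The constant `kind` assigned in each __init__ is the polymorphic discriminator:
# a deserializer inspects the payload's `kind` field to pick the concrete subclass
# of AnalyzeTextLROResult. A simplified sketch of that dispatch; the registry below
# is illustrative and maps only a few of the kinds enumerated above.

```python
REGISTRY = {
    "EntityLinkingLROResults": "EntityLinkingLROResult",
    "EntityRecognitionLROResults": "EntityRecognitionLROResult",
    "ExtractiveSummarizationLROResults": "ExtractiveSummarizationLROResult",
}

def resolve_model(payload: dict) -> str:
    """Pick the model class name for a task result payload by its discriminator."""
    kind = payload.get("kind")
    if kind not in REGISTRY:
        raise ValueError(f"unsupported task result kind: {kind!r}")
    return REGISTRY[kind]

name = resolve_model({"kind": "EntityRecognitionLROResults", "status": "succeeded"})
```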
class Error(msrest.serialization.Model):
"""The error object.
All required parameters must be populated in order to send to Azure.
:ivar additional_properties: Unmatched properties from the message are deserialized to this
collection.
:vartype additional_properties: dict[str, any]
:ivar code: Required. One of a server-defined set of error codes. Possible values include:
"InvalidRequest", "InvalidArgument", "Unauthorized", "Forbidden", "NotFound",
"ProjectNotFound", "OperationNotFound", "AzureCognitiveSearchNotFound",
"AzureCognitiveSearchIndexNotFound", "TooManyRequests", "AzureCognitiveSearchThrottling",
"AzureCognitiveSearchIndexLimitReached", "InternalServerError", "ServiceUnavailable".
:vartype code: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.ErrorCode
:ivar message: Required. A human-readable representation of the error.
:vartype message: str
:ivar target: The target of the error.
:vartype target: str
:ivar details: An array of details about specific errors that led to this reported error.
:vartype details: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Error]
:ivar innererror: An object containing more specific information than the current object about
the error.
:vartype innererror: ~azure.ai.textanalytics.v2022_03_01_preview.models.InnerErrorModel
"""
_validation = {
'code': {'required': True},
'message': {'required': True},
}
_attribute_map = {
'additional_properties': {'key': '', 'type': '{object}'},
'code': {'key': 'code', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
'target': {'key': 'target', 'type': 'str'},
'details': {'key': 'details', 'type': '[Error]'},
'innererror': {'key': 'innererror', 'type': 'InnerErrorModel'},
}
def __init__(
self,
*,
code: Union[str, "ErrorCode"],
message: str,
additional_properties: Optional[Dict[str, Any]] = None,
target: Optional[str] = None,
details: Optional[List["Error"]] = None,
innererror: Optional["InnerErrorModel"] = None,
**kwargs
):
"""
:keyword additional_properties: Unmatched properties from the message are deserialized to this
collection.
:paramtype additional_properties: dict[str, any]
:keyword code: Required. One of a server-defined set of error codes. Possible values include:
"InvalidRequest", "InvalidArgument", "Unauthorized", "Forbidden", "NotFound",
"ProjectNotFound", "OperationNotFound", "AzureCognitiveSearchNotFound",
"AzureCognitiveSearchIndexNotFound", "TooManyRequests", "AzureCognitiveSearchThrottling",
"AzureCognitiveSearchIndexLimitReached", "InternalServerError", "ServiceUnavailable".
:paramtype code: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.ErrorCode
:keyword message: Required. A human-readable representation of the error.
:paramtype message: str
:keyword target: The target of the error.
:paramtype target: str
:keyword details: An array of details about specific errors that led to this reported error.
:paramtype details: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Error]
:keyword innererror: An object containing more specific information than the current object
about the error.
:paramtype innererror: ~azure.ai.textanalytics.v2022_03_01_preview.models.InnerErrorModel
"""
super(Error, self).__init__(**kwargs)
self.additional_properties = additional_properties
self.code = code
self.message = message
self.target = target
self.details = details
self.innererror = innererror
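# Because `details` and `innererror` nest recursively, clients often walk to the
# innermost error to find the most specific code. A hedged sketch over plain dicts
# shaped like the wire form of this model; the sample codes and messages are made up.

```python
def innermost_code(error: dict) -> str:
    """Follow the innererror chain and return the most specific error code."""
    while error.get("innererror"):
        error = error["innererror"]
    return error["code"]

wire_error = {
    "code": "InvalidRequest",
    "message": "The request is invalid.",
    "innererror": {"code": "InvalidDocument", "message": "Document text is empty."},
}
code = innermost_code(wire_error)
```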
class ErrorResponse(msrest.serialization.Model):
"""Error response.
All required parameters must be populated in order to send to Azure.
:ivar error: Required. The error object.
:vartype error: ~azure.ai.textanalytics.v2022_03_01_preview.models.Error
"""
_validation = {
'error': {'required': True},
}
_attribute_map = {
'error': {'key': 'error', 'type': 'Error'},
}
def __init__(
self,
*,
error: "Error",
**kwargs
):
"""
:keyword error: Required. The error object.
:paramtype error: ~azure.ai.textanalytics.v2022_03_01_preview.models.Error
"""
super(ErrorResponse, self).__init__(**kwargs)
self.error = error
class ExtractedSummaryDocumentResult(DocumentResult):
"""ExtractedSummaryDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: if showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar sentences: Required. A ranked list of sentences representing the extracted summary.
:vartype sentences:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractedSummarySentence]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'sentences': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'sentences': {'key': 'sentences', 'type': '[ExtractedSummarySentence]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
sentences: List["ExtractedSummarySentence"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: if showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword sentences: Required. A ranked list of sentences representing the extracted summary.
:paramtype sentences:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractedSummarySentence]
"""
super(ExtractedSummaryDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.sentences = sentences
class ExtractedSummarySentence(msrest.serialization.Model):
"""ExtractedSummarySentence.
All required parameters must be populated in order to send to Azure.
:ivar text: Required. The extracted sentence text.
:vartype text: str
:ivar rank_score: Required. A double value representing the relevance of the sentence within
the summary. Higher values indicate higher importance.
:vartype rank_score: float
:ivar offset: Required. The sentence offset from the start of the document, based on the value
of the parameter StringIndexType.
:vartype offset: int
:ivar length: Required. The length of the sentence.
:vartype length: int
"""
_validation = {
'text': {'required': True},
'rank_score': {'required': True},
'offset': {'required': True},
'length': {'required': True},
}
_attribute_map = {
'text': {'key': 'text', 'type': 'str'},
'rank_score': {'key': 'rankScore', 'type': 'float'},
'offset': {'key': 'offset', 'type': 'int'},
'length': {'key': 'length', 'type': 'int'},
}
def __init__(
self,
*,
text: str,
rank_score: float,
offset: int,
length: int,
**kwargs
):
"""
:keyword text: Required. The extracted sentence text.
:paramtype text: str
:keyword rank_score: Required. A double value representing the relevance of the sentence within
the summary. Higher values indicate higher importance.
:paramtype rank_score: float
:keyword offset: Required. The sentence offset from the start of the document, based on the
value of the parameter StringIndexType.
:paramtype offset: int
:keyword length: Required. The length of the sentence.
:paramtype length: int
"""
super(ExtractedSummarySentence, self).__init__(**kwargs)
self.text = text
self.rank_score = rank_score
self.offset = offset
self.length = length
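# Because `offset` and `length` are expressed in the requested StringIndexType, a
# client that asked for code-point offsets can recover a sentence by slicing the
# source document. A small sketch; the document text and score are made up.

```python
document = "Azure analyzes text. Summaries rank sentences. Offsets locate them."
sentence = {"text": "Summaries rank sentences.", "rank_score": 0.95,
            "offset": 21, "length": 25}

# Slicing by code points matches the "UnicodeCodePoint" string index type.
recovered = document[sentence["offset"]: sentence["offset"] + sentence["length"]]
```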
class ExtractiveSummarizationLROResult(AnalyzeTextLROResult):
"""ExtractiveSummarizationLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results:
~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractiveSummarizationResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'ExtractiveSummarizationResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "ExtractiveSummarizationResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results:
~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractiveSummarizationResult
"""
super(ExtractiveSummarizationLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'ExtractiveSummarizationLROResults' # type: str
self.results = results
class ExtractiveSummarizationLROTask(AnalyzeTextLROTask):
"""An object representing the task definition for an Extractive Summarization task.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for an Extractive Summarization task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractiveSummarizationTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'ExtractiveSummarizationTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["ExtractiveSummarizationTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for an Extractive Summarization task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractiveSummarizationTaskParameters
"""
super(ExtractiveSummarizationLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'ExtractiveSummarization' # type: str
self.parameters = parameters
class ExtractiveSummarizationResult(PreBuiltResult):
"""ExtractiveSummarizationResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: if showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractiveSummarizationResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[ExtractiveSummarizationResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
documents: List["ExtractiveSummarizationResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: if showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractiveSummarizationResultDocumentsItem]
"""
super(ExtractiveSummarizationResult, self).__init__(errors=errors, statistics=statistics, model_version=model_version, **kwargs)
self.documents = documents
class ExtractiveSummarizationResultDocumentsItem(ExtractedSummaryDocumentResult):
"""ExtractiveSummarizationResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: if showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar sentences: Required. A ranked list of sentences representing the extracted summary.
:vartype sentences:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractedSummarySentence]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'sentences': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'sentences': {'key': 'sentences', 'type': '[ExtractedSummarySentence]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
sentences: List["ExtractedSummarySentence"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: if showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword sentences: Required. A ranked list of sentences representing the extracted summary.
:paramtype sentences:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractedSummarySentence]
"""
super(ExtractiveSummarizationResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, sentences=sentences, **kwargs)
class ExtractiveSummarizationTaskParameters(PreBuiltTaskParameters):
"""Supported parameters for an Extractive Summarization task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
:ivar sentence_count:
:vartype sentence_count: int
:ivar sort_by: The sorting criteria to use for the results of Extractive Summarization.
Possible values include: "Offset", "Rank". Default value: "Offset".
:vartype sort_by: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractiveSummarizationSortingCriteria
:ivar string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:vartype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'sentence_count': {'key': 'sentenceCount', 'type': 'int'},
'sort_by': {'key': 'sortBy', 'type': 'str'},
'string_index_type': {'key': 'stringIndexType', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
sentence_count: Optional[int] = 3,
sort_by: Optional[Union[str, "ExtractiveSummarizationSortingCriteria"]] = "Offset",
string_index_type: Optional[Union[str, "StringIndexType"]] = "TextElements_v8",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
:keyword sentence_count:
:paramtype sentence_count: int
:keyword sort_by: The sorting criteria to use for the results of Extractive Summarization.
Possible values include: "Offset", "Rank". Default value: "Offset".
:paramtype sort_by: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.ExtractiveSummarizationSortingCriteria
:keyword string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:paramtype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
super(ExtractiveSummarizationTaskParameters, self).__init__(logging_opt_out=logging_opt_out, model_version=model_version, **kwargs)
self.sentence_count = sentence_count
self.sort_by = sort_by
self.string_index_type = string_index_type
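As an illustrative sketch (not part of the generated client), this is the REST-level JSON shape that ``ExtractiveSummarizationTaskParameters`` serializes to, following its ``_attribute_map``: each snake_case attribute maps to the camelCase wire key, with the defaults shown in ``__init__``.

```python
def extractive_summarization_params(
    logging_opt_out=False,
    model_version="latest",
    sentence_count=3,
    sort_by="Offset",
    string_index_type="TextElements_v8",
):
    """Build the wire-format dict this model would serialize to."""
    return {
        "loggingOptOut": logging_opt_out,
        "modelVersion": model_version,
        "sentenceCount": sentence_count,
        "sortBy": sort_by,
        "stringIndexType": string_index_type,
    }

# Request five sentences, ranked by importance rather than position.
payload = extractive_summarization_params(sentence_count=5, sort_by="Rank")
```

The same mapping pattern (snake_case attribute, camelCase key) holds for every model in this module.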
class HealthcareAssertion(msrest.serialization.Model):
"""HealthcareAssertion.
:ivar conditionality: Describes any conditionality on the entity. Possible values include:
"hypothetical", "conditional".
:vartype conditionality: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.Conditionality
:ivar certainty: Describes the entity's certainty and polarity. Possible values include:
"positive", "positivePossible", "neutralPossible", "negativePossible", "negative".
:vartype certainty: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.Certainty
:ivar association: Describes whether the entity is the subject of the text or whether it
describes someone else. Possible values include: "subject", "other".
:vartype association: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.Association
"""
_attribute_map = {
'conditionality': {'key': 'conditionality', 'type': 'str'},
'certainty': {'key': 'certainty', 'type': 'str'},
'association': {'key': 'association', 'type': 'str'},
}
def __init__(
self,
*,
conditionality: Optional[Union[str, "Conditionality"]] = None,
certainty: Optional[Union[str, "Certainty"]] = None,
association: Optional[Union[str, "Association"]] = None,
**kwargs
):
"""
:keyword conditionality: Describes any conditionality on the entity. Possible values include:
"hypothetical", "conditional".
:paramtype conditionality: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.Conditionality
:keyword certainty: Describes the entity's certainty and polarity. Possible values include:
"positive", "positivePossible", "neutralPossible", "negativePossible", "negative".
:paramtype certainty: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.Certainty
:keyword association: Describes whether the entity is the subject of the text or whether it
describes someone else. Possible values include: "subject", "other".
:paramtype association: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.Association
"""
super(HealthcareAssertion, self).__init__(**kwargs)
self.conditionality = conditionality
self.certainty = certainty
self.association = association
class HealthcareEntitiesDocumentResult(DocumentResult):
"""HealthcareEntitiesDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar entities: Required. Healthcare entities.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareEntity]
:ivar relations: Required. Healthcare entity relations.
:vartype relations: list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareRelation]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'entities': {'required': True},
'relations': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'entities': {'key': 'entities', 'type': '[HealthcareEntity]'},
'relations': {'key': 'relations', 'type': '[HealthcareRelation]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
entities: List["HealthcareEntity"],
relations: List["HealthcareRelation"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword entities: Required. Healthcare entities.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareEntity]
:keyword relations: Required. Healthcare entity relations.
:paramtype relations:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareRelation]
"""
super(HealthcareEntitiesDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.entities = entities
self.relations = relations
class HealthcareEntity(msrest.serialization.Model):
"""HealthcareEntity.
All required parameters must be populated in order to send to Azure.
:ivar text: Required. Entity text as it appears in the request.
:vartype text: str
:ivar category: Required. Healthcare Entity Category. Possible values include:
"BODY_STRUCTURE", "AGE", "GENDER", "EXAMINATION_NAME", "DATE", "DIRECTION", "FREQUENCY",
"MEASUREMENT_VALUE", "MEASUREMENT_UNIT", "RELATIONAL_OPERATOR", "TIME", "GENE_OR_PROTEIN",
"VARIANT", "ADMINISTRATIVE_EVENT", "CARE_ENVIRONMENT", "HEALTHCARE_PROFESSION", "DIAGNOSIS",
"SYMPTOM_OR_SIGN", "CONDITION_QUALIFIER", "MEDICATION_CLASS", "MEDICATION_NAME", "DOSAGE",
"MEDICATION_FORM", "MEDICATION_ROUTE", "FAMILY_RELATION", "TREATMENT_NAME".
:vartype category: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareEntityCategory
:ivar subcategory: (Optional) Entity subtype.
:vartype subcategory: str
:ivar offset: Required. Start position for the entity text. Use of different 'stringIndexType'
values can affect the offset returned.
:vartype offset: int
:ivar length: Required. Length for the entity text. Use of different 'stringIndexType' values
can affect the length returned.
:vartype length: int
:ivar confidence_score: Required. Confidence score between 0 and 1 of the extracted entity.
:vartype confidence_score: float
:ivar assertion:
:vartype assertion: ~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareAssertion
:ivar name: Preferred name for the entity. Example: 'histologically' would have a 'name' of
'histologic'.
:vartype name: str
:ivar links: Entity references in known data sources.
:vartype links: list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareEntityLink]
"""
_validation = {
'text': {'required': True},
'category': {'required': True},
'offset': {'required': True},
'length': {'required': True},
'confidence_score': {'required': True},
}
_attribute_map = {
'text': {'key': 'text', 'type': 'str'},
'category': {'key': 'category', 'type': 'str'},
'subcategory': {'key': 'subcategory', 'type': 'str'},
'offset': {'key': 'offset', 'type': 'int'},
'length': {'key': 'length', 'type': 'int'},
'confidence_score': {'key': 'confidenceScore', 'type': 'float'},
'assertion': {'key': 'assertion', 'type': 'HealthcareAssertion'},
'name': {'key': 'name', 'type': 'str'},
'links': {'key': 'links', 'type': '[HealthcareEntityLink]'},
}
def __init__(
self,
*,
text: str,
category: Union[str, "HealthcareEntityCategory"],
offset: int,
length: int,
confidence_score: float,
subcategory: Optional[str] = None,
assertion: Optional["HealthcareAssertion"] = None,
name: Optional[str] = None,
links: Optional[List["HealthcareEntityLink"]] = None,
**kwargs
):
"""
:keyword text: Required. Entity text as it appears in the request.
:paramtype text: str
:keyword category: Required. Healthcare Entity Category. Possible values include:
"BODY_STRUCTURE", "AGE", "GENDER", "EXAMINATION_NAME", "DATE", "DIRECTION", "FREQUENCY",
"MEASUREMENT_VALUE", "MEASUREMENT_UNIT", "RELATIONAL_OPERATOR", "TIME", "GENE_OR_PROTEIN",
"VARIANT", "ADMINISTRATIVE_EVENT", "CARE_ENVIRONMENT", "HEALTHCARE_PROFESSION", "DIAGNOSIS",
"SYMPTOM_OR_SIGN", "CONDITION_QUALIFIER", "MEDICATION_CLASS", "MEDICATION_NAME", "DOSAGE",
"MEDICATION_FORM", "MEDICATION_ROUTE", "FAMILY_RELATION", "TREATMENT_NAME".
:paramtype category: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareEntityCategory
:keyword subcategory: (Optional) Entity subtype.
:paramtype subcategory: str
:keyword offset: Required. Start position for the entity text. Use of different
'stringIndexType' values can affect the offset returned.
:paramtype offset: int
:keyword length: Required. Length for the entity text. Use of different 'stringIndexType'
values can affect the length returned.
:paramtype length: int
:keyword confidence_score: Required. Confidence score between 0 and 1 of the extracted entity.
:paramtype confidence_score: float
:keyword assertion:
:paramtype assertion: ~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareAssertion
:keyword name: Preferred name for the entity. Example: 'histologically' would have a 'name' of
'histologic'.
:paramtype name: str
:keyword links: Entity references in known data sources.
:paramtype links: list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareEntityLink]
"""
super(HealthcareEntity, self).__init__(**kwargs)
self.text = text
self.category = category
self.subcategory = subcategory
self.offset = offset
self.length = length
self.confidence_score = confidence_score
self.assertion = assertion
self.name = name
self.links = links
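The ``offset``/``length`` fields above are sensitive to the requested ``stringIndexType``. As a sketch, the entity's surface text can be recovered by slicing the source document; Python string indexing counts Unicode code points, so the alignment below assumes the request used ``stringIndexType="UnicodeCodePoint"`` (an assumption for illustration, not SDK behavior), and the entity dict is hypothetical sample data.

```python
# Hypothetical response fragment in wire format (camelCase keys).
doc = "Patient was prescribed 100mg ibuprofen."
entity = {
    "text": "ibuprofen",
    "category": "MedicationName",
    "offset": 29,            # start position, in Unicode code points here
    "length": 9,             # length, in the same units
    "confidenceScore": 0.99,
}

# Slice the document with offset/length to recover the entity text.
surface = doc[entity["offset"]:entity["offset"] + entity["length"]]
```

With ``Utf16CodeUnit`` or ``TextElements_v8`` offsets, a direct code-point slice can misalign on text containing surrogate pairs or multi-code-point graphemes.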
class HealthcareEntityLink(msrest.serialization.Model):
"""HealthcareEntityLink.
All required parameters must be populated in order to send to Azure.
:ivar data_source: Required. Entity Catalog. Examples include: UMLS, CHV, MSH, etc.
:vartype data_source: str
:ivar id: Required. Entity id in the given source catalog.
:vartype id: str
"""
_validation = {
'data_source': {'required': True},
'id': {'required': True},
}
_attribute_map = {
'data_source': {'key': 'dataSource', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
}
def __init__(
self,
*,
data_source: str,
id: str,
**kwargs
):
"""
:keyword data_source: Required. Entity Catalog. Examples include: UMLS, CHV, MSH, etc.
:paramtype data_source: str
:keyword id: Required. Entity id in the given source catalog.
:paramtype id: str
"""
super(HealthcareEntityLink, self).__init__(**kwargs)
self.data_source = data_source
self.id = id
class HealthcareLROResult(AnalyzeTextLROResult):
"""HealthcareLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'HealthcareResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "HealthcareResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareResult
"""
super(HealthcareLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'HealthcareLROResults' # type: str
self.results = results
class HealthcareLROTask(AnalyzeTextLROTask):
"""HealthcareLROTask.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for a Healthcare task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'HealthcareTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["HealthcareTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for a Healthcare task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareTaskParameters
"""
super(HealthcareLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'Healthcare' # type: str
self.parameters = parameters
class HealthcareRelation(msrest.serialization.Model):
"""Every relation is an entity graph of a certain relationType, where all entities are connected and have specific roles within the relation context.
All required parameters must be populated in order to send to Azure.
:ivar relation_type: Required. Type of relation. Examples include: ``DosageOfMedication``,
``FrequencyOfMedication``, etc. Possible values include: "Abbreviation",
"DirectionOfBodyStructure", "DirectionOfCondition", "DirectionOfExamination",
"DirectionOfTreatment", "DosageOfMedication", "FormOfMedication", "FrequencyOfMedication",
"FrequencyOfTreatment", "QualifierOfCondition", "RelationOfExamination", "RouteOfMedication",
"TimeOfCondition", "TimeOfEvent", "TimeOfExamination", "TimeOfMedication", "TimeOfTreatment",
"UnitOfCondition", "UnitOfExamination", "ValueOfCondition", "ValueOfExamination".
:vartype relation_type: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.RelationType
:ivar entities: Required. The entities in the relation.
:vartype entities:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareRelationEntity]
"""
_validation = {
'relation_type': {'required': True},
'entities': {'required': True},
}
_attribute_map = {
'relation_type': {'key': 'relationType', 'type': 'str'},
'entities': {'key': 'entities', 'type': '[HealthcareRelationEntity]'},
}
def __init__(
self,
*,
relation_type: Union[str, "RelationType"],
entities: List["HealthcareRelationEntity"],
**kwargs
):
"""
:keyword relation_type: Required. Type of relation. Examples include: ``DosageOfMedication``,
``FrequencyOfMedication``, etc. Possible values include: "Abbreviation",
"DirectionOfBodyStructure", "DirectionOfCondition", "DirectionOfExamination",
"DirectionOfTreatment", "DosageOfMedication", "FormOfMedication", "FrequencyOfMedication",
"FrequencyOfTreatment", "QualifierOfCondition", "RelationOfExamination", "RouteOfMedication",
"TimeOfCondition", "TimeOfEvent", "TimeOfExamination", "TimeOfMedication", "TimeOfTreatment",
"UnitOfCondition", "UnitOfExamination", "ValueOfCondition", "ValueOfExamination".
:paramtype relation_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.RelationType
:keyword entities: Required. The entities in the relation.
:paramtype entities:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareRelationEntity]
"""
super(HealthcareRelation, self).__init__(**kwargs)
self.relation_type = relation_type
self.entities = entities
class HealthcareRelationEntity(msrest.serialization.Model):
"""HealthcareRelationEntity.
All required parameters must be populated in order to send to Azure.
:ivar ref: Required. Reference link object, using a JSON pointer RFC 6901 (URI Fragment
Identifier Representation), pointing to the entity.
:vartype ref: str
:ivar role: Required. Role of entity in the relationship. For example: 'CD20-positive diffuse
large B-cell lymphoma' has the following entities with their roles in parentheses: CD20
(GeneOrProtein), Positive (Expression), diffuse large B-cell lymphoma (Diagnosis).
:vartype role: str
"""
_validation = {
'ref': {'required': True},
'role': {'required': True},
}
_attribute_map = {
'ref': {'key': 'ref', 'type': 'str'},
'role': {'key': 'role', 'type': 'str'},
}
def __init__(
self,
*,
ref: str,
role: str,
**kwargs
):
"""
:keyword ref: Required. Reference link object, using a JSON pointer RFC 6901 (URI Fragment
Identifier Representation), pointing to the entity.
:paramtype ref: str
:keyword role: Required. Role of entity in the relationship. For example: 'CD20-positive
diffuse large B-cell lymphoma' has the following entities with their roles in parentheses:
CD20 (GeneOrProtein), Positive (Expression), diffuse large B-cell lymphoma (Diagnosis).
:paramtype role: str
"""
super(HealthcareRelationEntity, self).__init__(**kwargs)
self.ref = ref
self.role = role
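Since ``HealthcareRelationEntity.ref`` is an RFC 6901 JSON pointer in URI fragment form, it can be resolved against the result payload to find the referenced entity. A minimal resolver sketch follows; the concrete pointer path and payload here are hypothetical sample data, and the resolver assumes only plain object keys and array indices.

```python
def resolve_pointer(document, ref):
    """Resolve an RFC 6901 JSON pointer (URI fragment form, e.g. '#/a/0/b')."""
    obj = document
    for token in ref.lstrip("#").strip("/").split("/"):
        # Undo RFC 6901 escapes: ~1 encodes '/', ~0 encodes '~'.
        token = token.replace("~1", "/").replace("~0", "~")
        obj = obj[int(token)] if isinstance(obj, list) else obj[token]
    return obj

# Hypothetical result fragment; a relation's ref points into it.
payload = {
    "results": {
        "documents": [
            {"entities": [{"text": "100mg"}, {"text": "ibuprofen"}]}
        ]
    }
}
target = resolve_pointer(payload, "#/results/documents/0/entities/1")
```

Escaped tokens (``~0``/``~1``) rarely occur in these refs, but handling them keeps the resolver conformant.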
class HealthcareResult(PreBuiltResult):
"""HealthcareResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
:ivar documents: Required.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[HealthcareResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
documents: List["HealthcareResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
:keyword documents: Required.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareResultDocumentsItem]
"""
super(HealthcareResult, self).__init__(errors=errors, statistics=statistics, model_version=model_version, **kwargs)
self.documents = documents
class HealthcareResultDocumentsItem(HealthcareEntitiesDocumentResult):
"""HealthcareResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar entities: Required. Healthcare entities.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareEntity]
:ivar relations: Required. Healthcare entity relations.
:vartype relations: list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareRelation]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'entities': {'required': True},
'relations': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'entities': {'key': 'entities', 'type': '[HealthcareEntity]'},
'relations': {'key': 'relations', 'type': '[HealthcareRelation]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
entities: List["HealthcareEntity"],
relations: List["HealthcareRelation"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword entities: Required. Healthcare entities.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareEntity]
:keyword relations: Required. Healthcare entity relations.
:paramtype relations:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.HealthcareRelation]
"""
super(HealthcareResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, entities=entities, relations=relations, **kwargs)
class HealthcareTaskParameters(PreBuiltTaskParameters):
"""Supported parameters for a Healthcare task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
:ivar string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:vartype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'string_index_type': {'key': 'stringIndexType', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
string_index_type: Optional[Union[str, "StringIndexType"]] = "TextElements_v8",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
:keyword string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:paramtype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
super(HealthcareTaskParameters, self).__init__(logging_opt_out=logging_opt_out, model_version=model_version, **kwargs)
self.string_index_type = string_index_type
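To make the ``string_index_type`` options concrete: the same string prefix measures differently under the three conventions. A sketch, using Python's built-ins; ``len()`` counts Unicode code points, and UTF-16 code units can be counted by encoding. The sample text is illustrative.

```python
text = "💊 aspirin"
prefix = "💊 "  # the part before "aspirin", i.e. the offset of "aspirin"

# UnicodeCodePoint: Python's len() counts code points directly.
code_points = len(prefix)

# Utf16CodeUnit: U+1F48A is outside the BMP, so it takes a surrogate pair
# (two UTF-16 code units); count units by encoding without a BOM.
utf16_units = len(prefix.encode("utf-16-le")) // 2

# TextElements_v8 counts grapheme clusters; for this text it happens to equal
# the code-point count, but computing it in general requires a Unicode
# segmentation library (e.g. one implementing UAX #29).
```

So "aspirin" starts at offset 2 under ``UnicodeCodePoint`` but offset 3 under ``Utf16CodeUnit``, which is why mixing conventions misaligns entity offsets.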
class InnerErrorModel(msrest.serialization.Model):
"""An object containing more specific information about the error. As per Microsoft One API guidelines - https://github.com/Microsoft/api-guidelines/blob/vNext/Guidelines.md#7102-error-condition-responses.
All required parameters must be populated in order to send to Azure.
:ivar code: Required. One of a server-defined set of error codes. Possible values include:
"InvalidRequest", "InvalidParameterValue", "KnowledgeBaseNotFound",
"AzureCognitiveSearchNotFound", "AzureCognitiveSearchThrottling", "ExtractionFailure",
"InvalidRequestBodyFormat", "EmptyRequest", "MissingInputDocuments", "InvalidDocument",
"ModelVersionIncorrect", "InvalidDocumentBatch", "UnsupportedLanguageCode",
"InvalidCountryHint".
:vartype code: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.InnerErrorCode
:ivar message: Required. Error message.
:vartype message: str
:ivar details: Error details.
:vartype details: dict[str, str]
:ivar target: Error target.
:vartype target: str
:ivar innererror: An object containing more specific information than the current object about
the error.
:vartype innererror: ~azure.ai.textanalytics.v2022_03_01_preview.models.InnerErrorModel
"""
_validation = {
'code': {'required': True},
'message': {'required': True},
}
_attribute_map = {
'code': {'key': 'code', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
'details': {'key': 'details', 'type': '{str}'},
'target': {'key': 'target', 'type': 'str'},
'innererror': {'key': 'innererror', 'type': 'InnerErrorModel'},
}
def __init__(
self,
*,
code: Union[str, "InnerErrorCode"],
message: str,
details: Optional[Dict[str, str]] = None,
target: Optional[str] = None,
innererror: Optional["InnerErrorModel"] = None,
**kwargs
):
"""
:keyword code: Required. One of a server-defined set of error codes. Possible values include:
"InvalidRequest", "InvalidParameterValue", "KnowledgeBaseNotFound",
"AzureCognitiveSearchNotFound", "AzureCognitiveSearchThrottling", "ExtractionFailure",
"InvalidRequestBodyFormat", "EmptyRequest", "MissingInputDocuments", "InvalidDocument",
"ModelVersionIncorrect", "InvalidDocumentBatch", "UnsupportedLanguageCode",
"InvalidCountryHint".
:paramtype code: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.InnerErrorCode
:keyword message: Required. Error message.
:paramtype message: str
:keyword details: Error details.
:paramtype details: dict[str, str]
:keyword target: Error target.
:paramtype target: str
:keyword innererror: An object containing more specific information than the current object
about the error.
:paramtype innererror: ~azure.ai.textanalytics.v2022_03_01_preview.models.InnerErrorModel
"""
super(InnerErrorModel, self).__init__(**kwargs)
self.code = code
self.message = message
self.details = details
self.target = target
self.innererror = innererror
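Because ``InnerErrorModel`` nests recursively through ``innererror``, the innermost entry carries the most specific code. A small sketch over the wire-format dict shape (the sample error is hypothetical):

```python
def innermost_error(err):
    """Walk the innererror chain to the most specific error object."""
    while err.get("innererror"):
        err = err["innererror"]
    return err

# Hypothetical error payload with one level of nesting.
error = {
    "code": "InvalidRequest",
    "message": "Invalid request.",
    "innererror": {
        "code": "InvalidDocument",
        "message": "Document text is empty.",
    },
}
leaf = innermost_error(error)
```

This mirrors the One API guideline linked in the class docstring: outer codes are coarse categories, inner codes pinpoint the failure.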
class JobErrors(msrest.serialization.Model):
"""JobErrors.
:ivar errors:
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Error]
"""
_attribute_map = {
'errors': {'key': 'errors', 'type': '[Error]'},
}
def __init__(
self,
*,
errors: Optional[List["Error"]] = None,
**kwargs
):
"""
:keyword errors:
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Error]
"""
super(JobErrors, self).__init__(**kwargs)
self.errors = errors
class JobMetadata(msrest.serialization.Model):
"""JobMetadata.
All required parameters must be populated in order to send to Azure.
:ivar display_name:
:vartype display_name: str
:ivar created_date_time: Required.
:vartype created_date_time: ~datetime.datetime
:ivar expiration_date_time:
:vartype expiration_date_time: ~datetime.datetime
:ivar job_id: Required.
:vartype job_id: str
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
"""
_validation = {
'created_date_time': {'required': True},
'job_id': {'required': True},
'last_update_date_time': {'required': True},
'status': {'required': True},
}
_attribute_map = {
'display_name': {'key': 'displayName', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'expiration_date_time': {'key': 'expirationDateTime', 'type': 'iso-8601'},
'job_id': {'key': 'jobId', 'type': 'str'},
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
}
def __init__(
self,
*,
created_date_time: datetime.datetime,
job_id: str,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
display_name: Optional[str] = None,
expiration_date_time: Optional[datetime.datetime] = None,
**kwargs
):
"""
:keyword display_name:
:paramtype display_name: str
:keyword created_date_time: Required.
:paramtype created_date_time: ~datetime.datetime
:keyword expiration_date_time:
:paramtype expiration_date_time: ~datetime.datetime
:keyword job_id: Required.
:paramtype job_id: str
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
"""
super(JobMetadata, self).__init__(**kwargs)
self.display_name = display_name
self.created_date_time = created_date_time
self.expiration_date_time = expiration_date_time
self.job_id = job_id
self.last_update_date_time = last_update_date_time
self.status = status
class KeyPhraseExtractionLROResult(AnalyzeTextLROResult):
"""KeyPhraseExtractionLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'KeyPhraseResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "KeyPhraseResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseResult
"""
super(KeyPhraseExtractionLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'KeyPhraseExtractionLROResults' # type: str
self.results = results
class KeyPhraseLROTask(AnalyzeTextLROTask):
"""An object representing the task definition for a Key Phrase Extraction task.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant
filled by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for a Key Phrase Extraction task.
:vartype parameters: ~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'KeyPhraseTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["KeyPhraseTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for a Key Phrase Extraction task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseTaskParameters
"""
super(KeyPhraseLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'KeyPhraseExtraction' # type: str
self.parameters = parameters
class KeyPhraseResult(PreBuiltResult):
"""KeyPhraseResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[KeyPhraseResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
documents: List["KeyPhraseResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseResultDocumentsItem]
"""
super(KeyPhraseResult, self).__init__(errors=errors, statistics=statistics, model_version=model_version, **kwargs)
self.documents = documents
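# The `_attribute_map` dictionaries above pair each snake_case Python attribute with the
# camelCase key used on the wire. The following is a simplified sketch of how such a map
# could drive serialization; `serialize` is a hypothetical stand-in, not msrest's actual
# implementation.

```python
def serialize(values, attribute_map):
    """Translate attribute values to their wire keys, skipping unset (None) fields."""
    return {
        spec['key']: values[attr]
        for attr, spec in attribute_map.items()
        if values.get(attr) is not None
    }

# Subset of KeyPhraseResult's map: snake_case attribute -> camelCase wire key.
attribute_map = {
    'model_version': {'key': 'modelVersion', 'type': 'str'},
    'key_phrases': {'key': 'keyPhrases', 'type': '[str]'},
}
payload = serialize({'model_version': 'latest', 'key_phrases': ['text analytics']}, attribute_map)
# payload == {'modelVersion': 'latest', 'keyPhrases': ['text analytics']}
```

# Optional fields left as None (such as `statistics`) are simply omitted from the payload.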
class KeyPhrasesDocumentResult(DocumentResult):
"""KeyPhrasesDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar key_phrases: Required. A list of representative words or phrases. The number of key
phrases returned is proportional to the number of words in the input document.
:vartype key_phrases: list[str]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'key_phrases': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'key_phrases': {'key': 'keyPhrases', 'type': '[str]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
key_phrases: List[str],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword key_phrases: Required. A list of representative words or phrases. The number of key
phrases returned is proportional to the number of words in the input document.
:paramtype key_phrases: list[str]
"""
super(KeyPhrasesDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.key_phrases = key_phrases
class KeyPhraseResultDocumentsItem(KeyPhrasesDocumentResult):
"""KeyPhraseResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar key_phrases: Required. A list of representative words or phrases. The number of key
phrases returned is proportional to the number of words in the input document.
:vartype key_phrases: list[str]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'key_phrases': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'key_phrases': {'key': 'keyPhrases', 'type': '[str]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
key_phrases: List[str],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword key_phrases: Required. A list of representative words or phrases. The number of key
phrases returned is proportional to the number of words in the input document.
:paramtype key_phrases: list[str]
"""
super(KeyPhraseResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, key_phrases=key_phrases, **kwargs)
class KeyPhraseTaskParameters(PreBuiltTaskParameters):
"""Supported parameters for a Key Phrase Extraction task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
"""
super(KeyPhraseTaskParameters, self).__init__(logging_opt_out=logging_opt_out, model_version=model_version, **kwargs)
class KeyPhraseTaskResult(AnalyzeTextTaskResult):
"""KeyPhraseTaskResult.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis task results. Constant filled
by server. Possible values include: "SentimentAnalysisResults", "EntityRecognitionResults",
"PiiEntityRecognitionResults", "KeyPhraseExtractionResults", "LanguageDetectionResults",
"EntityLinkingResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseResult
"""
_validation = {
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'KeyPhraseResult'},
}
def __init__(
self,
*,
results: "KeyPhraseResult",
**kwargs
):
"""
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.KeyPhraseResult
"""
super(KeyPhraseTaskResult, self).__init__(**kwargs)
self.kind = 'KeyPhraseExtractionResults' # type: str
self.results = results
class LanguageDetectionAnalysisInput(msrest.serialization.Model):
"""LanguageDetectionAnalysisInput.
:ivar documents:
:vartype documents: list[~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageInput]
"""
_attribute_map = {
'documents': {'key': 'documents', 'type': '[LanguageInput]'},
}
def __init__(
self,
*,
documents: Optional[List["LanguageInput"]] = None,
**kwargs
):
"""
:keyword documents:
:paramtype documents: list[~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageInput]
"""
super(LanguageDetectionAnalysisInput, self).__init__(**kwargs)
self.documents = documents
class LanguageDetectionDocumentResult(DocumentResult):
"""LanguageDetectionDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar detected_language: Required. Detected Language.
:vartype detected_language: ~azure.ai.textanalytics.v2022_03_01_preview.models.DetectedLanguage
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'detected_language': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'detected_language': {'key': 'detectedLanguage', 'type': 'DetectedLanguage'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
detected_language: "DetectedLanguage",
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword detected_language: Required. Detected Language.
:paramtype detected_language:
~azure.ai.textanalytics.v2022_03_01_preview.models.DetectedLanguage
"""
super(LanguageDetectionDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.detected_language = detected_language
class LanguageDetectionResult(PreBuiltResult):
"""LanguageDetectionResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageDetectionDocumentResult]
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[LanguageDetectionDocumentResult]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
documents: List["LanguageDetectionDocumentResult"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageDetectionDocumentResult]
"""
super(LanguageDetectionResult, self).__init__(errors=errors, statistics=statistics, model_version=model_version, **kwargs)
self.documents = documents
class LanguageDetectionTaskParameters(PreBuiltTaskParameters):
"""Supported parameters for a Language Detection task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
"""
super(LanguageDetectionTaskParameters, self).__init__(logging_opt_out=logging_opt_out, model_version=model_version, **kwargs)
class LanguageDetectionTaskResult(AnalyzeTextTaskResult):
"""LanguageDetectionTaskResult.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis task results. Constant filled
by server. Possible values include: "SentimentAnalysisResults", "EntityRecognitionResults",
"PiiEntityRecognitionResults", "KeyPhraseExtractionResults", "LanguageDetectionResults",
"EntityLinkingResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageDetectionResult
"""
_validation = {
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'LanguageDetectionResult'},
}
def __init__(
self,
*,
results: "LanguageDetectionResult",
**kwargs
):
"""
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.LanguageDetectionResult
"""
super(LanguageDetectionTaskResult, self).__init__(**kwargs)
self.kind = 'LanguageDetectionResults' # type: str
self.results = results
class LanguageInput(msrest.serialization.Model):
"""LanguageInput.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar text: Required.
:vartype text: str
:ivar country_hint:
:vartype country_hint: str
"""
_validation = {
'id': {'required': True},
'text': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'text': {'key': 'text', 'type': 'str'},
'country_hint': {'key': 'countryHint', 'type': 'str'},
}
def __init__(
self,
*,
id: str,
text: str,
country_hint: Optional[str] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword text: Required.
:paramtype text: str
:keyword country_hint:
:paramtype country_hint: str
"""
super(LanguageInput, self).__init__(**kwargs)
self.id = id
self.text = text
self.country_hint = country_hint
class LinkedEntity(msrest.serialization.Model):
"""LinkedEntity.
All required parameters must be populated in order to send to Azure.
:ivar name: Required. Entity Linking formal name.
:vartype name: str
:ivar matches: Required. List of instances where this entity appears in the text.
:vartype matches: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Match]
:ivar language: Required. Language used in the data source.
:vartype language: str
:ivar id: Unique identifier of the recognized entity from the data source.
:vartype id: str
:ivar url: Required. URL for the entity's page from the data source.
:vartype url: str
:ivar data_source: Required. Data source used for entity linking, such as Wiki/Bing, etc.
:vartype data_source: str
:ivar bing_id: Bing Entity Search API unique identifier of the recognized entity.
:vartype bing_id: str
"""
_validation = {
'name': {'required': True},
'matches': {'required': True},
'language': {'required': True},
'url': {'required': True},
'data_source': {'required': True},
}
_attribute_map = {
'name': {'key': 'name', 'type': 'str'},
'matches': {'key': 'matches', 'type': '[Match]'},
'language': {'key': 'language', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'url': {'key': 'url', 'type': 'str'},
'data_source': {'key': 'dataSource', 'type': 'str'},
'bing_id': {'key': 'bingId', 'type': 'str'},
}
def __init__(
self,
*,
name: str,
matches: List["Match"],
language: str,
url: str,
data_source: str,
id: Optional[str] = None,
bing_id: Optional[str] = None,
**kwargs
):
"""
:keyword name: Required. Entity Linking formal name.
:paramtype name: str
:keyword matches: Required. List of instances where this entity appears in the text.
:paramtype matches: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Match]
:keyword language: Required. Language used in the data source.
:paramtype language: str
:keyword id: Unique identifier of the recognized entity from the data source.
:paramtype id: str
:keyword url: Required. URL for the entity's page from the data source.
:paramtype url: str
:keyword data_source: Required. Data source used for entity linking, such as Wiki/Bing, etc.
:paramtype data_source: str
:keyword bing_id: Bing Entity Search API unique identifier of the recognized entity.
:paramtype bing_id: str
"""
super(LinkedEntity, self).__init__(**kwargs)
self.name = name
self.matches = matches
self.language = language
self.id = id
self.url = url
self.data_source = data_source
self.bing_id = bing_id
class Match(msrest.serialization.Model):
"""Match.
All required parameters must be populated in order to send to Azure.
:ivar confidence_score: Required. If a well-known item is recognized, a decimal number denoting
the confidence level between 0 and 1 will be returned.
:vartype confidence_score: float
:ivar text: Required. Entity text as appears in the request.
:vartype text: str
:ivar offset: Required. Start position for the entity match text.
:vartype offset: int
:ivar length: Required. Length for the entity match text.
:vartype length: int
"""
_validation = {
'confidence_score': {'required': True},
'text': {'required': True},
'offset': {'required': True},
'length': {'required': True},
}
_attribute_map = {
'confidence_score': {'key': 'confidenceScore', 'type': 'float'},
'text': {'key': 'text', 'type': 'str'},
'offset': {'key': 'offset', 'type': 'int'},
'length': {'key': 'length', 'type': 'int'},
}
def __init__(
self,
*,
confidence_score: float,
text: str,
offset: int,
length: int,
**kwargs
):
"""
:keyword confidence_score: Required. If a well-known item is recognized, a decimal number
denoting the confidence level between 0 and 1 will be returned.
:paramtype confidence_score: float
:keyword text: Required. Entity text as appears in the request.
:paramtype text: str
:keyword offset: Required. Start position for the entity match text.
:paramtype offset: int
:keyword length: Required. Length for the entity match text.
:paramtype length: int
"""
super(Match, self).__init__(**kwargs)
self.confidence_score = confidence_score
self.text = text
self.offset = offset
self.length = length
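# A Match's `offset` and `length` locate the matched entity text within the submitted
# document. A minimal sketch, assuming plain character offsets; the actual unit used by
# the service depends on the string index type requested, and the sample text and values
# below are made up for illustration.

```python
document = "Satya Nadella is the CEO of Microsoft."
offset, length = 28, 9  # values a Match might carry for "Microsoft"
span = document[offset:offset + length]
# span == "Microsoft"
```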
class MultiLanguageAnalysisInput(msrest.serialization.Model):
"""MultiLanguageAnalysisInput.
:ivar documents:
:vartype documents: list[~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageInput]
"""
_attribute_map = {
'documents': {'key': 'documents', 'type': '[MultiLanguageInput]'},
}
def __init__(
self,
*,
documents: Optional[List["MultiLanguageInput"]] = None,
**kwargs
):
"""
:keyword documents:
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.MultiLanguageInput]
"""
super(MultiLanguageAnalysisInput, self).__init__(**kwargs)
self.documents = documents
class MultiLanguageInput(msrest.serialization.Model):
"""Contains an input document to be analyzed by the service.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. A unique, non-empty document identifier.
:vartype id: str
:ivar text: Required. The input text to process.
:vartype text: str
:ivar language: (Optional) The 2-letter ISO 639-1 representation of the document's language.
For example, use "en" for English or "es" for Spanish. If not set, "en" (English) is used as
the default.
:vartype language: str
"""
_validation = {
'id': {'required': True},
'text': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'text': {'key': 'text', 'type': 'str'},
'language': {'key': 'language', 'type': 'str'},
}
def __init__(
self,
*,
id: str,
text: str,
language: Optional[str] = None,
**kwargs
):
"""
:keyword id: Required. A unique, non-empty document identifier.
:paramtype id: str
:keyword text: Required. The input text to process.
:paramtype text: str
:keyword language: (Optional) The 2-letter ISO 639-1 representation of the document's
language. For example, use "en" for English or "es" for Spanish. If not set, "en" (English)
is used as the default.
:paramtype language: str
"""
super(MultiLanguageInput, self).__init__(**kwargs)
self.id = id
self.text = text
self.language = language
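# Illustrative sketch of the JSON shape a MultiLanguageAnalysisInput serializes to, using
# the wire keys from the `_attribute_map` above. The document contents and ids here are
# hypothetical examples, not service output.

```python
payload = {
    "documents": [
        {"id": "1", "text": "Hola mundo", "language": "es"},
        {"id": "2", "text": "Hello world"},  # no language set: the service defaults to "en"
    ]
}
doc_ids = [doc["id"] for doc in payload["documents"]]
# doc_ids == ["1", "2"]
```

# Each document id must be unique within the batch; results and errors are keyed by it.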
class Pagination(msrest.serialization.Model):
"""Pagination.
:ivar next_link:
:vartype next_link: str
"""
_attribute_map = {
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
*,
next_link: Optional[str] = None,
**kwargs
):
"""
:keyword next_link:
:paramtype next_link: str
"""
super(Pagination, self).__init__(**kwargs)
self.next_link = next_link
class PiiEntitiesDocumentResult(DocumentResult):
"""PiiEntitiesDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar redacted_text: Required. The redacted text of the document.
:vartype redacted_text: str
:ivar entities: Required. Recognized entities in the document.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'redacted_text': {'required': True},
'entities': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'redacted_text': {'key': 'redactedText', 'type': 'str'},
'entities': {'key': 'entities', 'type': '[Entity]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
redacted_text: str,
entities: List["Entity"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword redacted_text: Required. The redacted text of the document.
:paramtype redacted_text: str
:keyword entities: Required. Recognized entities in the document.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
super(PiiEntitiesDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.redacted_text = redacted_text
self.entities = entities
class PiiEntityRecognitionLROResult(AnalyzeTextLROResult):
"""PiiEntityRecognitionLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiResult
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'PiiResult'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "PiiResult",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiResult
"""
super(PiiEntityRecognitionLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'PiiEntityRecognitionLROResults' # type: str
self.results = results
class PiiLROTask(AnalyzeTextLROTask):
"""An object representing the task definition for a PII Entities Recognition task.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant
filled by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for a PII Entities Recognition task.
:vartype parameters: ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'PiiTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["PiiTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for a PII Entities Recognition task.
:paramtype parameters: ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiTaskParameters
"""
super(PiiLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'PiiEntityRecognition' # type: str
self.parameters = parameters
class PiiResult(PreBuiltResult):
"""PiiResult.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
:ivar documents: Required. Response by document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.PiiResultDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[PiiResultDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
documents: List["PiiResultDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
:keyword documents: Required. Response by document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.PiiResultDocumentsItem]
"""
super(PiiResult, self).__init__(errors=errors, statistics=statistics, model_version=model_version, **kwargs)
self.documents = documents
class PiiResultDocumentsItem(PiiEntitiesDocumentResult):
"""PiiResultDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar redacted_text: Required. Returns redacted text.
:vartype redacted_text: str
:ivar entities: Required. Recognized entities in the document.
:vartype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'redacted_text': {'required': True},
'entities': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'redacted_text': {'key': 'redactedText', 'type': 'str'},
'entities': {'key': 'entities', 'type': '[Entity]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
redacted_text: str,
entities: List["Entity"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword redacted_text: Required. Returns redacted text.
:paramtype redacted_text: str
:keyword entities: Required. Recognized entities in the document.
:paramtype entities: list[~azure.ai.textanalytics.v2022_03_01_preview.models.Entity]
"""
super(PiiResultDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, redacted_text=redacted_text, entities=entities, **kwargs)
class PiiTaskParameters(PreBuiltTaskParameters):
"""Supported parameters for a PII Entities Recognition task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
:ivar domain: The PII domain used for PII Entity Recognition. Possible values include: "phi",
"none". Default value: "none".
:vartype domain: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiDomain
:ivar pii_categories: (Optional) Describes the PII categories to return.
:vartype pii_categories: list[str or
~azure.ai.textanalytics.v2022_03_01_preview.models.PiiCategory]
:ivar string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:vartype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
_validation = {
'pii_categories': {'unique': True},
}
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'domain': {'key': 'domain', 'type': 'str'},
'pii_categories': {'key': 'piiCategories', 'type': '[str]'},
'string_index_type': {'key': 'stringIndexType', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
domain: Optional[Union[str, "PiiDomain"]] = "none",
pii_categories: Optional[List[Union[str, "PiiCategory"]]] = None,
string_index_type: Optional[Union[str, "StringIndexType"]] = "TextElements_v8",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
:keyword domain: The PII domain used for PII Entity Recognition. Possible values include:
"phi", "none". Default value: "none".
:paramtype domain: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiDomain
:keyword pii_categories: (Optional) Describes the PII categories to return.
:paramtype pii_categories: list[str or
~azure.ai.textanalytics.v2022_03_01_preview.models.PiiCategory]
:keyword string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:paramtype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
super(PiiTaskParameters, self).__init__(logging_opt_out=logging_opt_out, model_version=model_version, **kwargs)
self.domain = domain
self.pii_categories = pii_categories
self.string_index_type = string_index_type
class PiiTaskResult(AnalyzeTextTaskResult):
"""PiiTaskResult.
All required parameters must be populated in order to send to Azure.
:ivar kind: Required. Enumeration of supported Text Analysis task results. Constant filled by
server. Possible values include: "SentimentAnalysisResults", "EntityRecognitionResults",
"PiiEntityRecognitionResults", "KeyPhraseExtractionResults", "LanguageDetectionResults",
"EntityLinkingResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiResult
"""
_validation = {
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'PiiResult'},
}
def __init__(
self,
*,
results: "PiiResult",
**kwargs
):
"""
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.PiiResult
"""
super(PiiTaskResult, self).__init__(**kwargs)
self.kind = 'PiiEntityRecognitionResults' # type: str
self.results = results
class RequestStatistics(msrest.serialization.Model):
"""if showStats=true was specified in the request this field will contain information about the request payload.
All required parameters must be populated in order to send to Azure.
:ivar documents_count: Required. Number of documents submitted in the request.
:vartype documents_count: int
:ivar valid_documents_count: Required. Number of valid documents. This excludes documents that
are empty, over the size limit, or in unsupported languages.
:vartype valid_documents_count: int
:ivar erroneous_documents_count: Required. Number of invalid documents. This includes documents
that are empty, over the size limit, or in unsupported languages.
:vartype erroneous_documents_count: int
:ivar transactions_count: Required. Number of transactions for the request.
:vartype transactions_count: long
"""
_validation = {
'documents_count': {'required': True},
'valid_documents_count': {'required': True},
'erroneous_documents_count': {'required': True},
'transactions_count': {'required': True},
}
_attribute_map = {
'documents_count': {'key': 'documentsCount', 'type': 'int'},
'valid_documents_count': {'key': 'validDocumentsCount', 'type': 'int'},
'erroneous_documents_count': {'key': 'erroneousDocumentsCount', 'type': 'int'},
'transactions_count': {'key': 'transactionsCount', 'type': 'long'},
}
def __init__(
self,
*,
documents_count: int,
valid_documents_count: int,
erroneous_documents_count: int,
transactions_count: int,
**kwargs
):
"""
:keyword documents_count: Required. Number of documents submitted in the request.
:paramtype documents_count: int
:keyword valid_documents_count: Required. Number of valid documents. This excludes documents
that are empty, over the size limit, or in unsupported languages.
:paramtype valid_documents_count: int
:keyword erroneous_documents_count: Required. Number of invalid documents. This includes
documents that are empty, over the size limit, or in unsupported languages.
:paramtype erroneous_documents_count: int
:keyword transactions_count: Required. Number of transactions for the request.
:paramtype transactions_count: long
"""
super(RequestStatistics, self).__init__(**kwargs)
self.documents_count = documents_count
self.valid_documents_count = valid_documents_count
self.erroneous_documents_count = erroneous_documents_count
self.transactions_count = transactions_count
class SentenceAssessment(msrest.serialization.Model):
"""SentenceAssessment.
All required parameters must be populated in order to send to Azure.
:ivar sentiment: Required. Assessment sentiment in the sentence. Possible values include:
"positive", "mixed", "negative".
:vartype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.TokenSentimentValue
:ivar confidence_scores: Required. Assessment sentiment confidence scores in the sentence.
:vartype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.TargetConfidenceScoreLabel
:ivar offset: Required. The assessment offset from the start of the sentence.
:vartype offset: int
:ivar length: Required. The length of the assessment.
:vartype length: int
:ivar text: Required. The assessment text detected.
:vartype text: str
:ivar is_negated: Required. The indicator representing whether the assessment is negated.
:vartype is_negated: bool
"""
_validation = {
'sentiment': {'required': True},
'confidence_scores': {'required': True},
'offset': {'required': True},
'length': {'required': True},
'text': {'required': True},
'is_negated': {'required': True},
}
_attribute_map = {
'sentiment': {'key': 'sentiment', 'type': 'str'},
'confidence_scores': {'key': 'confidenceScores', 'type': 'TargetConfidenceScoreLabel'},
'offset': {'key': 'offset', 'type': 'int'},
'length': {'key': 'length', 'type': 'int'},
'text': {'key': 'text', 'type': 'str'},
'is_negated': {'key': 'isNegated', 'type': 'bool'},
}
def __init__(
self,
*,
sentiment: Union[str, "TokenSentimentValue"],
confidence_scores: "TargetConfidenceScoreLabel",
offset: int,
length: int,
text: str,
is_negated: bool,
**kwargs
):
"""
:keyword sentiment: Required. Assessment sentiment in the sentence. Possible values include:
"positive", "mixed", "negative".
:paramtype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.TokenSentimentValue
:keyword confidence_scores: Required. Assessment sentiment confidence scores in the sentence.
:paramtype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.TargetConfidenceScoreLabel
:keyword offset: Required. The assessment offset from the start of the sentence.
:paramtype offset: int
:keyword length: Required. The length of the assessment.
:paramtype length: int
:keyword text: Required. The assessment text detected.
:paramtype text: str
:keyword is_negated: Required. The indicator representing whether the assessment is negated.
:paramtype is_negated: bool
"""
super(SentenceAssessment, self).__init__(**kwargs)
self.sentiment = sentiment
self.confidence_scores = confidence_scores
self.offset = offset
self.length = length
self.text = text
self.is_negated = is_negated
class SentenceSentiment(msrest.serialization.Model):
"""SentenceSentiment.
All required parameters must be populated in order to send to Azure.
:ivar text: Required. The sentence text.
:vartype text: str
:ivar sentiment: Required. The predicted sentiment for the sentence. Possible values include:
"positive", "neutral", "negative".
:vartype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceSentimentValue
:ivar confidence_scores: Required. The sentiment confidence score between 0 and 1 for the
sentence for all classes.
:vartype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentConfidenceScorePerLabel
:ivar offset: Required. The sentence offset from the start of the document.
:vartype offset: int
:ivar length: Required. The length of the sentence.
:vartype length: int
:ivar targets: The array of sentence targets for the sentence.
:vartype targets: list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceTarget]
:ivar assessments: The array of assessments for the sentence.
:vartype assessments:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceAssessment]
"""
_validation = {
'text': {'required': True},
'sentiment': {'required': True},
'confidence_scores': {'required': True},
'offset': {'required': True},
'length': {'required': True},
}
_attribute_map = {
'text': {'key': 'text', 'type': 'str'},
'sentiment': {'key': 'sentiment', 'type': 'str'},
'confidence_scores': {'key': 'confidenceScores', 'type': 'SentimentConfidenceScorePerLabel'},
'offset': {'key': 'offset', 'type': 'int'},
'length': {'key': 'length', 'type': 'int'},
'targets': {'key': 'targets', 'type': '[SentenceTarget]'},
'assessments': {'key': 'assessments', 'type': '[SentenceAssessment]'},
}
def __init__(
self,
*,
text: str,
sentiment: Union[str, "SentenceSentimentValue"],
confidence_scores: "SentimentConfidenceScorePerLabel",
offset: int,
length: int,
targets: Optional[List["SentenceTarget"]] = None,
assessments: Optional[List["SentenceAssessment"]] = None,
**kwargs
):
"""
:keyword text: Required. The sentence text.
:paramtype text: str
:keyword sentiment: Required. The predicted sentiment for the sentence. Possible values
include: "positive", "neutral", "negative".
:paramtype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceSentimentValue
:keyword confidence_scores: Required. The sentiment confidence score between 0 and 1 for the
sentence for all classes.
:paramtype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentConfidenceScorePerLabel
:keyword offset: Required. The sentence offset from the start of the document.
:paramtype offset: int
:keyword length: Required. The length of the sentence.
:paramtype length: int
:keyword targets: The array of sentence targets for the sentence.
:paramtype targets: list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceTarget]
:keyword assessments: The array of assessments for the sentence.
:paramtype assessments:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceAssessment]
"""
super(SentenceSentiment, self).__init__(**kwargs)
self.text = text
self.sentiment = sentiment
self.confidence_scores = confidence_scores
self.offset = offset
self.length = length
self.targets = targets
self.assessments = assessments
class SentenceTarget(msrest.serialization.Model):
"""SentenceTarget.
All required parameters must be populated in order to send to Azure.
:ivar sentiment: Required. Targeted sentiment in the sentence. Possible values include:
"positive", "mixed", "negative".
:vartype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.TokenSentimentValue
:ivar confidence_scores: Required. Target sentiment confidence scores for the target in the
sentence.
:vartype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.TargetConfidenceScoreLabel
:ivar offset: Required. The target offset from the start of the sentence.
:vartype offset: int
:ivar length: Required. The length of the target.
:vartype length: int
:ivar text: Required. The target text detected.
:vartype text: str
:ivar relations: Required. The array of either assessment or target objects which are related to
the target.
:vartype relations: list[~azure.ai.textanalytics.v2022_03_01_preview.models.TargetRelation]
"""
_validation = {
'sentiment': {'required': True},
'confidence_scores': {'required': True},
'offset': {'required': True},
'length': {'required': True},
'text': {'required': True},
'relations': {'required': True},
}
_attribute_map = {
'sentiment': {'key': 'sentiment', 'type': 'str'},
'confidence_scores': {'key': 'confidenceScores', 'type': 'TargetConfidenceScoreLabel'},
'offset': {'key': 'offset', 'type': 'int'},
'length': {'key': 'length', 'type': 'int'},
'text': {'key': 'text', 'type': 'str'},
'relations': {'key': 'relations', 'type': '[TargetRelation]'},
}
def __init__(
self,
*,
sentiment: Union[str, "TokenSentimentValue"],
confidence_scores: "TargetConfidenceScoreLabel",
offset: int,
length: int,
text: str,
relations: List["TargetRelation"],
**kwargs
):
"""
:keyword sentiment: Required. Targeted sentiment in the sentence. Possible values include:
"positive", "mixed", "negative".
:paramtype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.TokenSentimentValue
:keyword confidence_scores: Required. Target sentiment confidence scores for the target in the
sentence.
:paramtype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.TargetConfidenceScoreLabel
:keyword offset: Required. The target offset from the start of the sentence.
:paramtype offset: int
:keyword length: Required. The length of the target.
:paramtype length: int
:keyword text: Required. The target text detected.
:paramtype text: str
:keyword relations: Required. The array of either assessment or target objects which are related
to the target.
:paramtype relations: list[~azure.ai.textanalytics.v2022_03_01_preview.models.TargetRelation]
"""
super(SentenceTarget, self).__init__(**kwargs)
self.sentiment = sentiment
self.confidence_scores = confidence_scores
self.offset = offset
self.length = length
self.text = text
self.relations = relations
class SentimentAnalysisLROTask(AnalyzeTextLROTask):
"""An object representing the task definition for a Sentiment Analysis task.
All required parameters must be populated in order to send to Azure.
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported long-running Text Analysis tasks. Constant filled
by server. Possible values include: "SentimentAnalysis", "EntityRecognition",
"PiiEntityRecognition", "KeyPhraseExtraction", "EntityLinking", "Healthcare",
"ExtractiveSummarization", "CustomEntityRecognition", "CustomSingleLabelClassification",
"CustomMultiLabelClassification".
:vartype kind: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROTaskKind
:ivar parameters: Supported parameters for a Sentiment Analysis task.
:vartype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentAnalysisTaskParameters
"""
_validation = {
'kind': {'required': True},
}
_attribute_map = {
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'parameters': {'key': 'parameters', 'type': 'SentimentAnalysisTaskParameters'},
}
def __init__(
self,
*,
task_name: Optional[str] = None,
parameters: Optional["SentimentAnalysisTaskParameters"] = None,
**kwargs
):
"""
:keyword task_name:
:paramtype task_name: str
:keyword parameters: Supported parameters for a Sentiment Analysis task.
:paramtype parameters:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentAnalysisTaskParameters
"""
super(SentimentAnalysisLROTask, self).__init__(task_name=task_name, **kwargs)
self.kind = 'SentimentAnalysis' # type: str
self.parameters = parameters
class SentimentAnalysisTaskParameters(PreBuiltTaskParameters):
"""Supported parameters for a Sentiment Analysis task.
:ivar logging_opt_out:
:vartype logging_opt_out: bool
:ivar model_version:
:vartype model_version: str
:ivar opinion_mining:
:vartype opinion_mining: bool
:ivar string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:vartype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
_attribute_map = {
'logging_opt_out': {'key': 'loggingOptOut', 'type': 'bool'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'opinion_mining': {'key': 'opinionMining', 'type': 'bool'},
'string_index_type': {'key': 'stringIndexType', 'type': 'str'},
}
def __init__(
self,
*,
logging_opt_out: Optional[bool] = False,
model_version: Optional[str] = "latest",
opinion_mining: Optional[bool] = False,
string_index_type: Optional[Union[str, "StringIndexType"]] = "TextElements_v8",
**kwargs
):
"""
:keyword logging_opt_out:
:paramtype logging_opt_out: bool
:keyword model_version:
:paramtype model_version: str
:keyword opinion_mining:
:paramtype opinion_mining: bool
:keyword string_index_type: Specifies the method used to interpret string offsets. Defaults to
Text Elements (Graphemes) according to Unicode v8.0.0. For additional information see
https://aka.ms/text-analytics-offsets. Possible values include: "TextElements_v8",
"UnicodeCodePoint", "Utf16CodeUnit". Default value: "TextElements_v8".
:paramtype string_index_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.StringIndexType
"""
super(SentimentAnalysisTaskParameters, self).__init__(logging_opt_out=logging_opt_out, model_version=model_version, **kwargs)
self.opinion_mining = opinion_mining
self.string_index_type = string_index_type
class SentimentConfidenceScorePerLabel(msrest.serialization.Model):
"""Represents the confidence scores between 0 and 1 across all sentiment classes: positive, neutral, negative.
All required parameters must be populated in order to send to Azure.
:ivar positive: Required.
:vartype positive: float
:ivar neutral: Required.
:vartype neutral: float
:ivar negative: Required.
:vartype negative: float
"""
_validation = {
'positive': {'required': True},
'neutral': {'required': True},
'negative': {'required': True},
}
_attribute_map = {
'positive': {'key': 'positive', 'type': 'float'},
'neutral': {'key': 'neutral', 'type': 'float'},
'negative': {'key': 'negative', 'type': 'float'},
}
def __init__(
self,
*,
positive: float,
neutral: float,
negative: float,
**kwargs
):
"""
:keyword positive: Required.
:paramtype positive: float
:keyword neutral: Required.
:paramtype neutral: float
:keyword negative: Required.
:paramtype negative: float
"""
super(SentimentConfidenceScorePerLabel, self).__init__(**kwargs)
self.positive = positive
self.neutral = neutral
self.negative = negative
class SentimentDocumentResult(DocumentResult):
"""SentimentDocumentResult.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar sentiment: Required. Predicted sentiment for document (Negative, Neutral, Positive, or
Mixed). Possible values include: "positive", "neutral", "negative", "mixed".
:vartype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentSentimentValue
:ivar confidence_scores: Required. Document level sentiment confidence scores between 0 and 1
for each sentiment class.
:vartype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentConfidenceScorePerLabel
:ivar sentences: Required. Sentence level sentiment analysis.
:vartype sentences: list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceSentiment]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'sentiment': {'required': True},
'confidence_scores': {'required': True},
'sentences': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'sentiment': {'key': 'sentiment', 'type': 'str'},
'confidence_scores': {'key': 'confidenceScores', 'type': 'SentimentConfidenceScorePerLabel'},
'sentences': {'key': 'sentences', 'type': '[SentenceSentiment]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
sentiment: Union[str, "DocumentSentimentValue"],
confidence_scores: "SentimentConfidenceScorePerLabel",
sentences: List["SentenceSentiment"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword sentiment: Required. Predicted sentiment for document (Negative, Neutral, Positive, or
Mixed). Possible values include: "positive", "neutral", "negative", "mixed".
:paramtype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentSentimentValue
:keyword confidence_scores: Required. Document level sentiment confidence scores between 0 and
1 for each sentiment class.
:paramtype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentConfidenceScorePerLabel
:keyword sentences: Required. Sentence level sentiment analysis.
:paramtype sentences:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceSentiment]
"""
super(SentimentDocumentResult, self).__init__(id=id, warnings=warnings, statistics=statistics, **kwargs)
self.sentiment = sentiment
self.confidence_scores = confidence_scores
self.sentences = sentences
class SentimentLROResult(AnalyzeTextLROResult):
"""SentimentLROResult.
All required parameters must be populated in order to send to Azure.
:ivar last_update_date_time: Required.
:vartype last_update_date_time: ~datetime.datetime
:ivar status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:vartype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:ivar task_name:
:vartype task_name: str
:ivar kind: Required. Enumeration of supported Text Analysis long-running operation task
results. Constant filled by server. Possible values include: "SentimentAnalysisLROResults",
"EntityRecognitionLROResults", "PiiEntityRecognitionLROResults",
"KeyPhraseExtractionLROResults", "EntityLinkingLROResults", "HealthcareLROResults",
"ExtractiveSummarizationLROResults", "CustomEntityRecognitionLROResults",
"CustomSingleLabelClassificationLROResults", "CustomMultiLabelClassificationLROResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentResponse
"""
_validation = {
'last_update_date_time': {'required': True},
'status': {'required': True},
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'last_update_date_time': {'key': 'lastUpdateDateTime', 'type': 'iso-8601'},
'status': {'key': 'status', 'type': 'str'},
'task_name': {'key': 'taskName', 'type': 'str'},
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'SentimentResponse'},
}
def __init__(
self,
*,
last_update_date_time: datetime.datetime,
status: Union[str, "State"],
results: "SentimentResponse",
task_name: Optional[str] = None,
**kwargs
):
"""
:keyword last_update_date_time: Required.
:paramtype last_update_date_time: ~datetime.datetime
:keyword status: Required. Possible values include: "notStarted", "running", "succeeded",
"partiallySucceeded", "failed", "cancelled", "cancelling".
:paramtype status: str or ~azure.ai.textanalytics.v2022_03_01_preview.models.State
:keyword task_name:
:paramtype task_name: str
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentResponse
"""
super(SentimentLROResult, self).__init__(last_update_date_time=last_update_date_time, status=status, task_name=task_name, **kwargs)
self.kind = 'SentimentAnalysisLROResults' # type: str
self.results = results
class SentimentResponse(PreBuiltResult):
"""SentimentResponse.
All required parameters must be populated in order to send to Azure.
:ivar errors: Required. Errors by document id.
:vartype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:ivar statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:ivar model_version: Required. This field indicates which model is used for scoring.
:vartype model_version: str
:ivar documents: Required. Sentiment analysis per document.
:vartype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentResponseDocumentsItem]
"""
_validation = {
'errors': {'required': True},
'model_version': {'required': True},
'documents': {'required': True},
}
_attribute_map = {
'errors': {'key': 'errors', 'type': '[DocumentError]'},
'statistics': {'key': 'statistics', 'type': 'RequestStatistics'},
'model_version': {'key': 'modelVersion', 'type': 'str'},
'documents': {'key': 'documents', 'type': '[SentimentResponseDocumentsItem]'},
}
def __init__(
self,
*,
errors: List["DocumentError"],
model_version: str,
documents: List["SentimentResponseDocumentsItem"],
statistics: Optional["RequestStatistics"] = None,
**kwargs
):
"""
:keyword errors: Required. Errors by document id.
:paramtype errors: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentError]
:keyword statistics: If showStats=true was specified in the request, this field will contain
information about the request payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.RequestStatistics
:keyword model_version: Required. This field indicates which model is used for scoring.
:paramtype model_version: str
:keyword documents: Required. Sentiment analysis per document.
:paramtype documents:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentResponseDocumentsItem]
"""
super(SentimentResponse, self).__init__(errors=errors, statistics=statistics, model_version=model_version, **kwargs)
self.documents = documents
class SentimentResponseDocumentsItem(SentimentDocumentResult):
"""SentimentResponseDocumentsItem.
All required parameters must be populated in order to send to Azure.
:ivar id: Required. Unique, non-empty document identifier.
:vartype id: str
:ivar warnings: Required. Warnings encountered while processing document.
:vartype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:ivar statistics: if showStats=true was specified in the request this field will contain
information about the document payload.
:vartype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:ivar sentiment: Required. Predicted sentiment for document (Negative, Neutral, Positive, or
Mixed). Possible values include: "positive", "neutral", "negative", "mixed".
:vartype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentSentimentValue
:ivar confidence_scores: Required. Document level sentiment confidence scores between 0 and 1
for each sentiment class.
:vartype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentConfidenceScorePerLabel
:ivar sentences: Required. Sentence level sentiment analysis.
:vartype sentences: list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceSentiment]
"""
_validation = {
'id': {'required': True},
'warnings': {'required': True},
'sentiment': {'required': True},
'confidence_scores': {'required': True},
'sentences': {'required': True},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'warnings': {'key': 'warnings', 'type': '[DocumentWarning]'},
'statistics': {'key': 'statistics', 'type': 'DocumentStatistics'},
'sentiment': {'key': 'sentiment', 'type': 'str'},
'confidence_scores': {'key': 'confidenceScores', 'type': 'SentimentConfidenceScorePerLabel'},
'sentences': {'key': 'sentences', 'type': '[SentenceSentiment]'},
}
def __init__(
self,
*,
id: str,
warnings: List["DocumentWarning"],
sentiment: Union[str, "DocumentSentimentValue"],
confidence_scores: "SentimentConfidenceScorePerLabel",
sentences: List["SentenceSentiment"],
statistics: Optional["DocumentStatistics"] = None,
**kwargs
):
"""
:keyword id: Required. Unique, non-empty document identifier.
:paramtype id: str
:keyword warnings: Required. Warnings encountered while processing document.
:paramtype warnings: list[~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentWarning]
:keyword statistics: if showStats=true was specified in the request this field will contain
information about the document payload.
:paramtype statistics: ~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentStatistics
:keyword sentiment: Required. Predicted sentiment for document (Negative, Neutral, Positive, or
Mixed). Possible values include: "positive", "neutral", "negative", "mixed".
:paramtype sentiment: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.DocumentSentimentValue
:keyword confidence_scores: Required. Document level sentiment confidence scores between 0 and
1 for each sentiment class.
:paramtype confidence_scores:
~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentConfidenceScorePerLabel
:keyword sentences: Required. Sentence level sentiment analysis.
:paramtype sentences:
list[~azure.ai.textanalytics.v2022_03_01_preview.models.SentenceSentiment]
"""
super(SentimentResponseDocumentsItem, self).__init__(id=id, warnings=warnings, statistics=statistics, sentiment=sentiment, confidence_scores=confidence_scores, sentences=sentences, **kwargs)
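The `_attribute_map` above drives msrest serialization: each Python attribute is written to the wire under its `key` (for example `confidence_scores` becomes `confidenceScores`), and optional attributes that are `None` are omitted. A minimal stdlib sketch of that renaming step (the map is copied from the class; the helper and sample data are illustrative, not the msrest implementation):

```python
# Illustrative only: mimics how msrest's _attribute_map renames Python
# attributes to JSON wire keys during serialization and drops None values.
ATTRIBUTE_MAP = {
    'id': 'id',
    'warnings': 'warnings',
    'statistics': 'statistics',
    'sentiment': 'sentiment',
    'confidence_scores': 'confidenceScores',
    'sentences': 'sentences',
}

def to_wire(obj: dict) -> dict:
    """Rename snake_case attribute names to their wire keys, skipping None."""
    return {ATTRIBUTE_MAP[k]: v for k, v in obj.items() if v is not None}

doc = {
    'id': '1',
    'warnings': [],
    'statistics': None,  # optional field: omitted from the payload
    'sentiment': 'positive',
    'confidence_scores': {'positive': 0.99, 'neutral': 0.01, 'negative': 0.0},
    'sentences': [],
}
wire = to_wire(doc)
```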
class SentimentTaskResult(AnalyzeTextTaskResult):
"""SentimentTaskResult.
All required parameters must be populated in order to send to Azure.
    :ivar kind: Required. Enumeration of supported Text Analysis task results. Constant filled by
server. Possible values include: "SentimentAnalysisResults", "EntityRecognitionResults",
"PiiEntityRecognitionResults", "KeyPhraseExtractionResults", "LanguageDetectionResults",
"EntityLinkingResults".
:vartype kind: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextTaskResultsKind
:ivar results: Required.
:vartype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentResponse
"""
_validation = {
'kind': {'required': True},
'results': {'required': True},
}
_attribute_map = {
'kind': {'key': 'kind', 'type': 'str'},
'results': {'key': 'results', 'type': 'SentimentResponse'},
}
def __init__(
self,
*,
results: "SentimentResponse",
**kwargs
):
"""
:keyword results: Required.
:paramtype results: ~azure.ai.textanalytics.v2022_03_01_preview.models.SentimentResponse
"""
super(SentimentTaskResult, self).__init__(**kwargs)
self.kind = 'SentimentAnalysisResults' # type: str
self.results = results
class TargetConfidenceScoreLabel(msrest.serialization.Model):
    """Represents the confidence scores across the sentiment classes: positive and negative.
All required parameters must be populated in order to send to Azure.
:ivar positive: Required.
:vartype positive: float
:ivar negative: Required.
:vartype negative: float
"""
_validation = {
'positive': {'required': True},
'negative': {'required': True},
}
_attribute_map = {
'positive': {'key': 'positive', 'type': 'float'},
'negative': {'key': 'negative', 'type': 'float'},
}
def __init__(
self,
*,
positive: float,
negative: float,
**kwargs
):
"""
:keyword positive: Required.
:paramtype positive: float
:keyword negative: Required.
:paramtype negative: float
"""
super(TargetConfidenceScoreLabel, self).__init__(**kwargs)
self.positive = positive
self.negative = negative
class TargetRelation(msrest.serialization.Model):
"""TargetRelation.
All required parameters must be populated in order to send to Azure.
:ivar relation_type: Required. The type related to the target. Possible values include:
"assessment", "target".
:vartype relation_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.TargetRelationType
:ivar ref: Required. The JSON pointer indicating the linked object.
:vartype ref: str
"""
_validation = {
'relation_type': {'required': True},
'ref': {'required': True},
}
_attribute_map = {
'relation_type': {'key': 'relationType', 'type': 'str'},
'ref': {'key': 'ref', 'type': 'str'},
}
def __init__(
self,
*,
relation_type: Union[str, "TargetRelationType"],
ref: str,
**kwargs
):
"""
:keyword relation_type: Required. The type related to the target. Possible values include:
"assessment", "target".
:paramtype relation_type: str or
~azure.ai.textanalytics.v2022_03_01_preview.models.TargetRelationType
:keyword ref: Required. The JSON pointer indicating the linked object.
:paramtype ref: str
"""
super(TargetRelation, self).__init__(**kwargs)
self.relation_type = relation_type
self.ref = ref
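`ref` holds a JSON pointer indicating the linked object inside the response. A stdlib sketch of resolving such a pointer against a parsed response dict (the pointer path and sample data below are made up for illustration and are not taken from the service):

```python
# Illustrative resolver for the JSON-pointer style used by TargetRelation.ref.
# Not part of the SDK; the path shape here is an assumption.
def resolve_pointer(doc, pointer: str):
    """Walk a '#/a/0/b'-style pointer through nested dicts and lists."""
    node = doc
    for part in pointer.lstrip('#').strip('/').split('/'):
        if isinstance(node, list):
            node = node[int(part)]   # list segments are numeric indices
        else:
            node = node[part]        # dict segments are keys
    return node

response = {'documents': [{'sentences': [{'targets': [{'text': 'screen'}]}]}]}
target = resolve_pointer(response, '#/documents/0/sentences/0/targets/0')
```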
class TasksStateTasks(msrest.serialization.Model):
"""TasksStateTasks.
All required parameters must be populated in order to send to Azure.
:ivar completed: Required.
:vartype completed: int
:ivar failed: Required.
:vartype failed: int
:ivar in_progress: Required.
:vartype in_progress: int
:ivar total: Required.
:vartype total: int
:ivar items:
:vartype items: list[~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResult]
"""
_validation = {
'completed': {'required': True},
'failed': {'required': True},
'in_progress': {'required': True},
'total': {'required': True},
}
_attribute_map = {
'completed': {'key': 'completed', 'type': 'int'},
'failed': {'key': 'failed', 'type': 'int'},
'in_progress': {'key': 'inProgress', 'type': 'int'},
'total': {'key': 'total', 'type': 'int'},
'items': {'key': 'items', 'type': '[AnalyzeTextLROResult]'},
}
def __init__(
self,
*,
completed: int,
failed: int,
in_progress: int,
total: int,
items: Optional[List["AnalyzeTextLROResult"]] = None,
**kwargs
):
"""
:keyword completed: Required.
:paramtype completed: int
:keyword failed: Required.
:paramtype failed: int
:keyword in_progress: Required.
:paramtype in_progress: int
:keyword total: Required.
:paramtype total: int
:keyword items:
:paramtype items: list[~azure.ai.textanalytics.v2022_03_01_preview.models.AnalyzeTextLROResult]
"""
super(TasksStateTasks, self).__init__(**kwargs)
self.completed = completed
self.failed = failed
self.in_progress = in_progress
self.total = total
self.items = items
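The four required counters in `TasksStateTasks` describe the state of submitted tasks; assuming they partition `total` (an assumption, not stated in the docstring), an overall progress fraction can be derived from them. An illustrative sketch, not SDK code:

```python
# Illustrative: derive a progress fraction from TasksStateTasks-style counters.
# Assumes completed + failed + in_progress partitions total.
def progress(completed: int, failed: int, in_progress: int, total: int) -> float:
    """Fraction of tasks that have finished, successfully or not."""
    assert completed + failed + in_progress == total, "counters should partition total"
    return (completed + failed) / total if total else 1.0

pct = progress(completed=3, failed=1, in_progress=4, total=8)
```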
from ._tl_utils import *
from ._gt_utils import *
#from ._pl_utils import *
# Autogenerated config.py
#
# NOTE: config.py is intended for advanced users who are comfortable
# with manually migrating the config file on qutebrowser upgrades. If
# you prefer, you can also configure qutebrowser using the
# :set/:bind/:config-* commands without having to write a config.py
# file.
#
# Documentation:
# qute://help/configuring.html
# qute://help/settings.html
# Uncomment this to still load settings configured via autoconfig.yml
# config.load_autoconfig()
# Time interval (in milliseconds) between auto-saves of
# config/cookies/etc.
# Type: Int
c.auto_save.interval = 1000
# Always restore open sites when qutebrowser is reopened.
# Type: Bool
c.auto_save.session = True
# Automatically start playing `<video>` elements. Note: On Qt < 5.11,
# this option needs a restart and does not support URL patterns.
# Type: Bool
c.content.autoplay = False
# Which cookies to accept. With QtWebEngine, this setting also controls
# other features with tracking capabilities similar to those of cookies;
# including IndexedDB, DOM storage, filesystem API, service workers, and
# AppCache. Note that with QtWebKit, only `all` and `never` are
# supported as per-domain values. Setting `no-3rdparty` or `no-
# unknown-3rdparty` per-domain on QtWebKit will have the same effect as
# `all`. If this setting is used with URL patterns, the pattern gets
# applied to the origin/first party URL of the page making the request,
# not the request URL.
# Type: String
# Valid values:
# - all: Accept all cookies.
# - no-3rdparty: Accept cookies from the same origin only. This is known to break some sites, such as GMail.
# - no-unknown-3rdparty: Accept cookies from the same origin only, unless a cookie is already set for the domain. On QtWebEngine, this is the same as no-3rdparty.
# - never: Don't accept cookies at all.
config.set('content.cookies.accept', 'all', 'chrome-devtools://*')
config.set('content.cookies.accept', 'all', 'devtools://*')
# User agent to send. The following placeholders are defined: *
# `{os_info}`: Something like "X11; Linux x86_64". * `{webkit_version}`:
# The underlying WebKit version (set to a fixed value with
# QtWebEngine). * `{qt_key}`: "Qt" for QtWebKit, "QtWebEngine" for
# QtWebEngine. * `{qt_version}`: The underlying Qt version. *
# `{upstream_browser_key}`: "Version" for QtWebKit, "Chrome" for
# QtWebEngine. * `{upstream_browser_version}`: The corresponding
# Safari/Chrome version. * `{qutebrowser_version}`: The currently
# running qutebrowser version. The default value is equal to the
# unchanged user agent of QtWebKit/QtWebEngine. Note that the value
# read from JavaScript is always the global value. With QtWebEngine
# between 5.12 and 5.14 (inclusive), changing the value exposed to
# JavaScript requires a restart.
# Type: FormatString
config.set('content.headers.user_agent', 'Mozilla/5.0 ({os_info}) AppleWebKit/{webkit_version} (KHTML, like Gecko) {upstream_browser_key}/{upstream_browser_version} Safari/{webkit_version}', 'https://web.whatsapp.com/')
config.set('content.headers.user_agent', 'Mozilla/5.0 ({os_info}; rv:71.0) Gecko/20100101 Firefox/71.0', 'https://accounts.google.com/*')
config.set('content.headers.user_agent', 'Mozilla/5.0 ({os_info}) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99 Safari/537.36', 'https://*.slack.com/*')
config.set('content.headers.user_agent', 'Mozilla/5.0 ({os_info}; rv:71.0) Gecko/20100101 Firefox/71.0', 'https://docs.google.com/*')
config.set('content.headers.user_agent', 'Mozilla/5.0 ({os_info}; rv:71.0) Gecko/20100101 Firefox/71.0', 'https://drive.google.com/*')
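The `{os_info}`/`{webkit_version}`-style fields above are qutebrowser FormatString placeholders, filled in before the header is sent. A rough stdlib illustration of that substitution (the placeholder values are made up, not what qutebrowser would actually report):

```python
# Illustrative only: how qutebrowser's FormatString placeholders expand.
# Field values below are examples, not real qutebrowser output.
template = ('Mozilla/5.0 ({os_info}) AppleWebKit/{webkit_version} '
            '(KHTML, like Gecko) {upstream_browser_key}/{upstream_browser_version} '
            'Safari/{webkit_version}')
fields = {
    'os_info': 'X11; Linux x86_64',
    'webkit_version': '537.36',
    'upstream_browser_key': 'Chrome',
    'upstream_browser_version': '87.0.4280.144',
}
user_agent = template.format(**fields)
```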
# Load images automatically in web pages.
# Type: Bool
config.set('content.images', True, 'chrome-devtools://*')
config.set('content.images', True, 'devtools://*')
# Enable JavaScript.
# Type: Bool
config.set('content.javascript.enabled', True, 'file://*')
config.set('content.javascript.enabled', True, 'chrome-devtools://*')
config.set('content.javascript.enabled', True, 'devtools://*')
config.set('content.javascript.enabled', True, 'chrome://*/*')
config.set('content.javascript.enabled', True, 'qute://*/*')
# List of user stylesheet filenames to use.
# Type: List of File, or File
c.content.user_stylesheets = []
# Directory to save downloads to. If unset, a sensible OS-specific
# default is used.
# Type: Directory
c.downloads.location.directory = '~/'
# Duration (in milliseconds) to wait before removing finished downloads.
# If set to -1, downloads are never removed.
# Type: Int
c.downloads.remove_finished = 10000
# Characters used for hint strings.
# Type: UniqueCharString
c.hints.chars = 'ashtneoi'
# When to show the statusbar.
# Type: String
# Valid values:
# - always: Always show the statusbar.
# - never: Always hide the statusbar.
# - in-mode: Show the statusbar when in modes other than normal mode.
c.statusbar.show = 'in-mode'
# When to show the tab bar.
# Type: String
# Valid values:
# - always: Always show the tab bar.
# - never: Always hide the tab bar.
# - multiple: Hide the tab bar if only one tab is open.
# - switching: Show the tab bar when switching tabs.
c.tabs.show = 'always'
# Bindings for normal mode
config.bind('<Ctrl+e>', 'tab-move +')
config.bind('<Ctrl+o>', 'tab-move -')
config.bind('AA', 'spawn mpv {flolow-url}')
config.bind('E', 'tab-next')
config.bind('H', 'hint all tab')
config.bind('I', 'forward')
config.bind('L', 'set-cmd-text -s :open -t')
config.bind('N', 'back')
config.bind('O', 'tab-prev')
config.bind('P', 'open -p')
config.bind('S', 'hint all tab')
config.bind('aa', 'spawn mpv {url}')
config.bind('ad', 'spawn youtube-dl --ignore-errors --yes-playlist --extract-audio --audio-format mp3 --output "%(title)s.%(ext)s" {url}')
config.bind('as', 'spawn youtube-dl {url}')
config.unbind('co')
config.bind('e', 'scroll down')
config.bind('h', 'hint')
config.bind('i', 'scroll right')
config.bind('j', 'enter-mode insert')
config.bind('k', 'search-next')
config.bind('l', 'set-cmd-text -s :open')
config.bind('n', 'scroll left')
config.bind('o', 'scroll up')
config.bind('s', 'hint')
config.bind('tt', 'config-cycle tabs.show switching always')
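The run of `config.bind(key, command)` calls above uses the qutebrowser config API, which is only available inside the browser. The same binding table can be modeled outside qutebrowser with a plain mapping and a stub; a self-contained sketch (the `ConfigStub` class is illustrative — in a real config.py the `config` object is injected by qutebrowser):

```python
# Stub standing in for qutebrowser's injected `config` object, so the
# binding table can be exercised outside the browser.
class ConfigStub:
    def __init__(self):
        self.bindings = {}

    def bind(self, key, command):
        self.bindings[key] = command

config = ConfigStub()

# Same data as a handful of the individual config.bind(...) calls above.
bindings = {
    'e': 'scroll down',
    'o': 'scroll up',
    'n': 'scroll left',
    'i': 'scroll right',
    'E': 'tab-next',
    'O': 'tab-prev',
}
for key, command in bindings.items():
    config.bind(key, command)
```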
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = [
'ServiceACLEntriesv1EntryArgs',
'ServiceComputeBackendArgs',
'ServiceComputeBigqueryloggingArgs',
'ServiceComputeBlobstorageloggingArgs',
'ServiceComputeDictionaryArgs',
'ServiceComputeDirectorArgs',
'ServiceComputeDomainArgs',
'ServiceComputeGcsloggingArgs',
'ServiceComputeHealthcheckArgs',
'ServiceComputeHttpsloggingArgs',
'ServiceComputeLogentryArgs',
'ServiceComputeLoggingCloudfileArgs',
'ServiceComputeLoggingDatadogArgs',
'ServiceComputeLoggingDigitaloceanArgs',
'ServiceComputeLoggingElasticsearchArgs',
'ServiceComputeLoggingFtpArgs',
'ServiceComputeLoggingGooglepubsubArgs',
'ServiceComputeLoggingHerokuArgs',
'ServiceComputeLoggingHoneycombArgs',
'ServiceComputeLoggingKafkaArgs',
'ServiceComputeLoggingKineseArgs',
'ServiceComputeLoggingLogglyArgs',
'ServiceComputeLoggingLogshuttleArgs',
'ServiceComputeLoggingNewrelicArgs',
'ServiceComputeLoggingOpenstackArgs',
'ServiceComputeLoggingScalyrArgs',
'ServiceComputeLoggingSftpArgs',
'ServiceComputePackageArgs',
'ServiceComputePapertrailArgs',
'ServiceComputeS3loggingArgs',
'ServiceComputeSplunkArgs',
'ServiceComputeSumologicArgs',
'ServiceComputeSyslogArgs',
'ServiceWafConfigurationRuleArgs',
'ServiceWafConfigurationRuleExclusionArgs',
'Servicev1AclArgs',
'Servicev1BackendArgs',
'Servicev1BigqueryloggingArgs',
'Servicev1BlobstorageloggingArgs',
'Servicev1CacheSettingArgs',
'Servicev1ConditionArgs',
'Servicev1DictionaryArgs',
'Servicev1DirectorArgs',
'Servicev1DomainArgs',
'Servicev1DynamicsnippetArgs',
'Servicev1GcsloggingArgs',
'Servicev1GzipArgs',
'Servicev1HeaderArgs',
'Servicev1HealthcheckArgs',
'Servicev1HttpsloggingArgs',
'Servicev1LogentryArgs',
'Servicev1LoggingCloudfileArgs',
'Servicev1LoggingDatadogArgs',
'Servicev1LoggingDigitaloceanArgs',
'Servicev1LoggingElasticsearchArgs',
'Servicev1LoggingFtpArgs',
'Servicev1LoggingGooglepubsubArgs',
'Servicev1LoggingHerokuArgs',
'Servicev1LoggingHoneycombArgs',
'Servicev1LoggingKafkaArgs',
'Servicev1LoggingKineseArgs',
'Servicev1LoggingLogglyArgs',
'Servicev1LoggingLogshuttleArgs',
'Servicev1LoggingNewrelicArgs',
'Servicev1LoggingOpenstackArgs',
'Servicev1LoggingScalyrArgs',
'Servicev1LoggingSftpArgs',
'Servicev1PapertrailArgs',
'Servicev1RequestSettingArgs',
'Servicev1ResponseObjectArgs',
'Servicev1S3loggingArgs',
'Servicev1SnippetArgs',
'Servicev1SplunkArgs',
'Servicev1SumologicArgs',
'Servicev1SyslogArgs',
'Servicev1VclArgs',
'Servicev1WafArgs',
'TlsSubscriptionManagedDnsChallengeArgs',
'TlsSubscriptionManagedHttpChallengeArgs',
]
@pulumi.input_type
class ServiceACLEntriesv1EntryArgs:
def __init__(__self__, *,
ip: pulumi.Input[str],
comment: Optional[pulumi.Input[str]] = None,
id: Optional[pulumi.Input[str]] = None,
negated: Optional[pulumi.Input[bool]] = None,
subnet: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] ip: An IP address that is the focus for the ACL
:param pulumi.Input[str] comment: A personal freeform descriptive note
:param pulumi.Input[str] id: The unique ID of the entry
:param pulumi.Input[bool] negated: A boolean that will negate the match if true
:param pulumi.Input[str] subnet: An optional subnet mask applied to the IP address
"""
pulumi.set(__self__, "ip", ip)
if comment is not None:
pulumi.set(__self__, "comment", comment)
if id is not None:
pulumi.set(__self__, "id", id)
if negated is not None:
pulumi.set(__self__, "negated", negated)
if subnet is not None:
pulumi.set(__self__, "subnet", subnet)
@property
@pulumi.getter
def ip(self) -> pulumi.Input[str]:
"""
An IP address that is the focus for the ACL
"""
return pulumi.get(self, "ip")
@ip.setter
def ip(self, value: pulumi.Input[str]):
pulumi.set(self, "ip", value)
@property
@pulumi.getter
def comment(self) -> Optional[pulumi.Input[str]]:
"""
A personal freeform descriptive note
"""
return pulumi.get(self, "comment")
@comment.setter
def comment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "comment", value)
@property
@pulumi.getter
def id(self) -> Optional[pulumi.Input[str]]:
"""
The unique ID of the entry
"""
return pulumi.get(self, "id")
@id.setter
def id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "id", value)
@property
@pulumi.getter
def negated(self) -> Optional[pulumi.Input[bool]]:
"""
A boolean that will negate the match if true
"""
return pulumi.get(self, "negated")
@negated.setter
def negated(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "negated", value)
@property
@pulumi.getter
def subnet(self) -> Optional[pulumi.Input[str]]:
"""
An optional subnet mask applied to the IP address
"""
return pulumi.get(self, "subnet")
@subnet.setter
def subnet(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "subnet", value)
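Per the field docs above, an ACL entry is keyed on `ip`, optionally widened by a `subnet` mask, and `negated` flips the match result. A stdlib sketch of those semantics (illustrative only; Fastly evaluates ACLs server-side, and this helper is not part of the provider):

```python
import ipaddress

# Illustrative ACL-entry match: `subnet` is a prefix length applied to `ip`,
# and negated=True inverts the outcome, mirroring the field docs above.
def entry_matches(client_ip, ip, subnet=None, negated=False):
    if subnet is not None:
        network = ipaddress.ip_network(f"{ip}/{subnet}", strict=False)
        hit = ipaddress.ip_address(client_ip) in network
    else:
        hit = ipaddress.ip_address(client_ip) == ipaddress.ip_address(ip)
    return not hit if negated else hit

inside = entry_matches("10.0.0.7", ip="10.0.0.0", subnet="24")
blocked = entry_matches("10.0.0.7", ip="10.0.0.0", subnet="24", negated=True)
```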
@pulumi.input_type
class ServiceComputeBackendArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
auto_loadbalance: Optional[pulumi.Input[bool]] = None,
between_bytes_timeout: Optional[pulumi.Input[int]] = None,
connect_timeout: Optional[pulumi.Input[int]] = None,
error_threshold: Optional[pulumi.Input[int]] = None,
first_byte_timeout: Optional[pulumi.Input[int]] = None,
healthcheck: Optional[pulumi.Input[str]] = None,
max_conn: Optional[pulumi.Input[int]] = None,
max_tls_version: Optional[pulumi.Input[str]] = None,
min_tls_version: Optional[pulumi.Input[str]] = None,
override_host: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
shield: Optional[pulumi.Input[str]] = None,
ssl_ca_cert: Optional[pulumi.Input[str]] = None,
ssl_cert_hostname: Optional[pulumi.Input[str]] = None,
ssl_check_cert: Optional[pulumi.Input[bool]] = None,
ssl_ciphers: Optional[pulumi.Input[str]] = None,
ssl_client_cert: Optional[pulumi.Input[str]] = None,
ssl_client_key: Optional[pulumi.Input[str]] = None,
ssl_hostname: Optional[pulumi.Input[str]] = None,
ssl_sni_hostname: Optional[pulumi.Input[str]] = None,
use_ssl: Optional[pulumi.Input[bool]] = None,
weight: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] address: An IPv4, hostname, or IPv6 address for the Backend
:param pulumi.Input[str] name: Name for this Backend. Must be unique to this Service. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[bool] auto_loadbalance: Denotes if this Backend should be included in the pool of backends that requests are load balanced against. Default `true`
:param pulumi.Input[int] between_bytes_timeout: How long to wait between bytes in milliseconds. Default `10000`
:param pulumi.Input[int] connect_timeout: How long to wait for a timeout in milliseconds. Default `1000`
:param pulumi.Input[int] error_threshold: Number of errors to allow before the Backend is marked as down. Default `0`
:param pulumi.Input[int] first_byte_timeout: How long to wait for the first bytes in milliseconds. Default `15000`
:param pulumi.Input[str] healthcheck: Name of a defined `healthcheck` to assign to this backend
:param pulumi.Input[int] max_conn: Maximum number of connections for this Backend. Default `200`
:param pulumi.Input[str] max_tls_version: Maximum allowed TLS version on SSL connections to this backend.
:param pulumi.Input[str] min_tls_version: Minimum allowed TLS version on SSL connections to this backend.
:param pulumi.Input[str] override_host: The hostname to override the Host header
:param pulumi.Input[int] port: The port number on which the Backend responds. Default `80`
:param pulumi.Input[str] shield: The POP of the shield designated to reduce inbound load. Valid values for `shield` are included in the `GET /datacenters` API response
:param pulumi.Input[str] ssl_ca_cert: CA certificate attached to origin.
:param pulumi.Input[str] ssl_cert_hostname: Overrides ssl_hostname, but only for cert verification. Does not affect SNI at all
:param pulumi.Input[bool] ssl_check_cert: Be strict about checking SSL certs. Default `true`
:param pulumi.Input[str] ssl_ciphers: Cipher list consisting of one or more cipher strings separated by colons. Commas or spaces are also acceptable separators but colons are normally used.
:param pulumi.Input[str] ssl_client_cert: Client certificate attached to origin. Used when connecting to the backend
:param pulumi.Input[str] ssl_client_key: Client key attached to origin. Used when connecting to the backend
:param pulumi.Input[str] ssl_hostname: Used for both SNI during the TLS handshake and to validate the cert
:param pulumi.Input[str] ssl_sni_hostname: Overrides ssl_hostname, but only for SNI in the handshake. Does not affect cert validation at all
:param pulumi.Input[bool] use_ssl: Whether or not to use SSL to reach the Backend. Default `false`
:param pulumi.Input[int] weight: The [portion of traffic](https://docs.fastly.com/en/guides/load-balancing-configuration#how-weight-affects-load-balancing) to send to this Backend. Each Backend receives weight / total of the traffic. Default `100`
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
if auto_loadbalance is not None:
pulumi.set(__self__, "auto_loadbalance", auto_loadbalance)
if between_bytes_timeout is not None:
pulumi.set(__self__, "between_bytes_timeout", between_bytes_timeout)
if connect_timeout is not None:
pulumi.set(__self__, "connect_timeout", connect_timeout)
if error_threshold is not None:
pulumi.set(__self__, "error_threshold", error_threshold)
if first_byte_timeout is not None:
pulumi.set(__self__, "first_byte_timeout", first_byte_timeout)
if healthcheck is not None:
pulumi.set(__self__, "healthcheck", healthcheck)
if max_conn is not None:
pulumi.set(__self__, "max_conn", max_conn)
if max_tls_version is not None:
pulumi.set(__self__, "max_tls_version", max_tls_version)
if min_tls_version is not None:
pulumi.set(__self__, "min_tls_version", min_tls_version)
if override_host is not None:
pulumi.set(__self__, "override_host", override_host)
if port is not None:
pulumi.set(__self__, "port", port)
if shield is not None:
pulumi.set(__self__, "shield", shield)
if ssl_ca_cert is not None:
pulumi.set(__self__, "ssl_ca_cert", ssl_ca_cert)
if ssl_cert_hostname is not None:
pulumi.set(__self__, "ssl_cert_hostname", ssl_cert_hostname)
if ssl_check_cert is not None:
pulumi.set(__self__, "ssl_check_cert", ssl_check_cert)
if ssl_ciphers is not None:
pulumi.set(__self__, "ssl_ciphers", ssl_ciphers)
if ssl_client_cert is not None:
pulumi.set(__self__, "ssl_client_cert", ssl_client_cert)
if ssl_client_key is not None:
pulumi.set(__self__, "ssl_client_key", ssl_client_key)
if ssl_hostname is not None:
    warnings.warn("""Use ssl_cert_hostname and ssl_sni_hostname instead.""", DeprecationWarning)
    pulumi.log.warn("""ssl_hostname is deprecated: Use ssl_cert_hostname and ssl_sni_hostname instead.""")
    pulumi.set(__self__, "ssl_hostname", ssl_hostname)
if ssl_sni_hostname is not None:
pulumi.set(__self__, "ssl_sni_hostname", ssl_sni_hostname)
if use_ssl is not None:
pulumi.set(__self__, "use_ssl", use_ssl)
if weight is not None:
pulumi.set(__self__, "weight", weight)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
An IPv4, hostname, or IPv6 address for the Backend
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Name for this Backend. Must be unique to this Service. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="autoLoadbalance")
def auto_loadbalance(self) -> Optional[pulumi.Input[bool]]:
"""
Denotes if this Backend should be included in the pool of backends that requests are load balanced against. Default `true`
"""
return pulumi.get(self, "auto_loadbalance")
@auto_loadbalance.setter
def auto_loadbalance(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_loadbalance", value)
@property
@pulumi.getter(name="betweenBytesTimeout")
def between_bytes_timeout(self) -> Optional[pulumi.Input[int]]:
"""
How long to wait between bytes in milliseconds. Default `10000`
"""
return pulumi.get(self, "between_bytes_timeout")
@between_bytes_timeout.setter
def between_bytes_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "between_bytes_timeout", value)
@property
@pulumi.getter(name="connectTimeout")
def connect_timeout(self) -> Optional[pulumi.Input[int]]:
"""
How long to wait for a timeout in milliseconds. Default `1000`
"""
return pulumi.get(self, "connect_timeout")
@connect_timeout.setter
def connect_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "connect_timeout", value)
@property
@pulumi.getter(name="errorThreshold")
def error_threshold(self) -> Optional[pulumi.Input[int]]:
"""
Number of errors to allow before the Backend is marked as down. Default `0`
"""
return pulumi.get(self, "error_threshold")
@error_threshold.setter
def error_threshold(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "error_threshold", value)
@property
@pulumi.getter(name="firstByteTimeout")
def first_byte_timeout(self) -> Optional[pulumi.Input[int]]:
"""
How long to wait for the first bytes in milliseconds. Default `15000`
"""
return pulumi.get(self, "first_byte_timeout")
@first_byte_timeout.setter
def first_byte_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "first_byte_timeout", value)
@property
@pulumi.getter
def healthcheck(self) -> Optional[pulumi.Input[str]]:
"""
Name of a defined `healthcheck` to assign to this backend
"""
return pulumi.get(self, "healthcheck")
@healthcheck.setter
def healthcheck(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "healthcheck", value)
@property
@pulumi.getter(name="maxConn")
def max_conn(self) -> Optional[pulumi.Input[int]]:
"""
Maximum number of connections for this Backend. Default `200`
"""
return pulumi.get(self, "max_conn")
@max_conn.setter
def max_conn(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_conn", value)
@property
@pulumi.getter(name="maxTlsVersion")
def max_tls_version(self) -> Optional[pulumi.Input[str]]:
"""
Maximum allowed TLS version on SSL connections to this backend.
"""
return pulumi.get(self, "max_tls_version")
@max_tls_version.setter
def max_tls_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "max_tls_version", value)
@property
@pulumi.getter(name="minTlsVersion")
def min_tls_version(self) -> Optional[pulumi.Input[str]]:
"""
Minimum allowed TLS version on SSL connections to this backend.
"""
return pulumi.get(self, "min_tls_version")
@min_tls_version.setter
def min_tls_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "min_tls_version", value)
@property
@pulumi.getter(name="overrideHost")
def override_host(self) -> Optional[pulumi.Input[str]]:
"""
The hostname to override the Host header
"""
return pulumi.get(self, "override_host")
@override_host.setter
def override_host(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "override_host", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port number on which the Backend responds. Default `80`
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter
def shield(self) -> Optional[pulumi.Input[str]]:
"""
The POP of the shield designated to reduce inbound load. Valid values for `shield` are included in the `GET /datacenters` API response
"""
return pulumi.get(self, "shield")
@shield.setter
def shield(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "shield", value)
@property
@pulumi.getter(name="sslCaCert")
def ssl_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
CA certificate attached to origin.
"""
return pulumi.get(self, "ssl_ca_cert")
@ssl_ca_cert.setter
def ssl_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_ca_cert", value)
@property
@pulumi.getter(name="sslCertHostname")
def ssl_cert_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Overrides ssl_hostname, but only for cert verification. Does not affect SNI at all
"""
return pulumi.get(self, "ssl_cert_hostname")
@ssl_cert_hostname.setter
def ssl_cert_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_cert_hostname", value)
@property
@pulumi.getter(name="sslCheckCert")
def ssl_check_cert(self) -> Optional[pulumi.Input[bool]]:
"""
Be strict about checking SSL certs. Default `true`
"""
return pulumi.get(self, "ssl_check_cert")
@ssl_check_cert.setter
def ssl_check_cert(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "ssl_check_cert", value)
@property
@pulumi.getter(name="sslCiphers")
def ssl_ciphers(self) -> Optional[pulumi.Input[str]]:
"""
Cipher list consisting of one or more cipher strings separated by colons. Commas or spaces are also acceptable separators but colons are normally used.
"""
return pulumi.get(self, "ssl_ciphers")
@ssl_ciphers.setter
def ssl_ciphers(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_ciphers", value)
@property
@pulumi.getter(name="sslClientCert")
def ssl_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
Client certificate attached to origin. Used when connecting to the backend
"""
return pulumi.get(self, "ssl_client_cert")
@ssl_client_cert.setter
def ssl_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_client_cert", value)
@property
@pulumi.getter(name="sslClientKey")
def ssl_client_key(self) -> Optional[pulumi.Input[str]]:
"""
Client key attached to origin. Used when connecting to the backend
"""
return pulumi.get(self, "ssl_client_key")
@ssl_client_key.setter
def ssl_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_client_key", value)
@property
@pulumi.getter(name="sslHostname")
def ssl_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Used for both SNI during the TLS handshake and to validate the cert
"""
return pulumi.get(self, "ssl_hostname")
@ssl_hostname.setter
def ssl_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_hostname", value)
@property
@pulumi.getter(name="sslSniHostname")
def ssl_sni_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Overrides ssl_hostname, but only for SNI in the handshake. Does not affect cert validation at all
"""
return pulumi.get(self, "ssl_sni_hostname")
@ssl_sni_hostname.setter
def ssl_sni_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_sni_hostname", value)
@property
@pulumi.getter(name="useSsl")
def use_ssl(self) -> Optional[pulumi.Input[bool]]:
"""
Whether or not to use SSL to reach the Backend. Default `false`
"""
return pulumi.get(self, "use_ssl")
@use_ssl.setter
def use_ssl(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "use_ssl", value)
@property
@pulumi.getter
def weight(self) -> Optional[pulumi.Input[int]]:
"""
The [portion of traffic](https://docs.fastly.com/en/guides/load-balancing-configuration#how-weight-affects-load-balancing) to send to this Backend. Each Backend receives weight / total of the traffic. Default `100`
"""
return pulumi.get(self, "weight")
@weight.setter
def weight(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "weight", value)
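# Usage sketch (illustrative only; addresses and names below are assumptions,
# not values from this module): a backend is typically declared like
#     fastly.ServiceComputeBackendArgs(
#         address="origin.example.com",
#         name="primary-origin",
#         port=443,
#         use_ssl=True,
#         ssl_cert_hostname="origin.example.com",
#     )
# and passed in the `backends` list of a `ServiceCompute` resource.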
@pulumi.input_type
class ServiceComputeBigqueryloggingArgs:
def __init__(__self__, *,
dataset: pulumi.Input[str],
email: pulumi.Input[str],
name: pulumi.Input[str],
project_id: pulumi.Input[str],
secret_key: pulumi.Input[str],
table: pulumi.Input[str],
template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] dataset: The ID of your BigQuery dataset
:param pulumi.Input[str] email: The email for the service account with write access to your BigQuery dataset. If not provided, this will be pulled from a `FASTLY_BQ_EMAIL` environment variable
:param pulumi.Input[str] name: A unique name to identify this BigQuery logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] project_id: The ID of your GCP project
:param pulumi.Input[str] secret_key: The secret key associated with the service account that has write access to your BigQuery table. If not provided, this will be pulled from the `FASTLY_BQ_SECRET_KEY` environment variable. Typical format for this is a private key in a string with newlines
:param pulumi.Input[str] table: The ID of your BigQuery table
:param pulumi.Input[str] template: BigQuery table name suffix template
"""
pulumi.set(__self__, "dataset", dataset)
pulumi.set(__self__, "email", email)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "project_id", project_id)
pulumi.set(__self__, "secret_key", secret_key)
pulumi.set(__self__, "table", table)
if template is not None:
pulumi.set(__self__, "template", template)
@property
@pulumi.getter
def dataset(self) -> pulumi.Input[str]:
"""
The ID of your BigQuery dataset
"""
return pulumi.get(self, "dataset")
@dataset.setter
def dataset(self, value: pulumi.Input[str]):
pulumi.set(self, "dataset", value)
@property
@pulumi.getter
def email(self) -> pulumi.Input[str]:
"""
The email for the service account with write access to your BigQuery dataset. If not provided, this will be pulled from a `FASTLY_BQ_EMAIL` environment variable
"""
return pulumi.get(self, "email")
@email.setter
def email(self, value: pulumi.Input[str]):
pulumi.set(self, "email", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this BigQuery logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="projectId")
def project_id(self) -> pulumi.Input[str]:
"""
The ID of your GCP project
"""
return pulumi.get(self, "project_id")
@project_id.setter
def project_id(self, value: pulumi.Input[str]):
pulumi.set(self, "project_id", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> pulumi.Input[str]:
"""
The secret key associated with the service account that has write access to your BigQuery table. If not provided, this will be pulled from the `FASTLY_BQ_SECRET_KEY` environment variable. Typical format for this is a private key in a string with newlines
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: pulumi.Input[str]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter
def table(self) -> pulumi.Input[str]:
"""
The ID of your BigQuery table
"""
return pulumi.get(self, "table")
@table.setter
def table(self, value: pulumi.Input[str]):
pulumi.set(self, "table", value)
@property
@pulumi.getter
def template(self) -> Optional[pulumi.Input[str]]:
"""
BigQuery table name suffix template
"""
return pulumi.get(self, "template")
@template.setter
def template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "template", value)
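# Usage sketch (illustrative only; the project, dataset, and account names are
# assumptions): a BigQuery logging endpoint is typically declared like
#     fastly.ServiceComputeBigqueryloggingArgs(
#         name="bq-logs",
#         project_id="my-gcp-project",
#         dataset="fastly_logs",
#         table="compute_requests",
#         email="logger@my-gcp-project.iam.gserviceaccount.com",
#         secret_key=bq_private_key,  # hypothetical variable, e.g. read from config/secret store
#     )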
@pulumi.input_type
class ServiceComputeBlobstorageloggingArgs:
def __init__(__self__, *,
account_name: pulumi.Input[str],
container: pulumi.Input[str],
name: pulumi.Input[str],
sas_token: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
file_max_bytes: Optional[pulumi.Input[int]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] account_name: The unique Azure Blob Storage namespace in which your data objects are stored
:param pulumi.Input[str] container: The name of the Azure Blob Storage container in which to store logs
:param pulumi.Input[str] name: A unique name to identify the Azure Blob Storage endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] sas_token: The Azure shared access signature providing write access to the blob service objects. Be sure to update your token before it expires or the logging functionality will not work
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are zstd, snappy, and gzip. If the specified codec is "gzip", gzip_level will default to 3. To specify a different level, leave compression_codec blank and explicitly set the level using gzip_level. Specifying both compression_codec and gzip_level in the same API request will result in an error.
:param pulumi.Input[int] file_max_bytes: Maximum size of an uploaded log file, if non-zero.
:param pulumi.Input[int] gzip_level: Level of Gzip compression from `0-9`. `0` means no compression. `1` is the fastest and the least compressed version, `9` is the slowest and the most compressed version. Default `0`
:param pulumi.Input[str] message_type: How the message should be formatted. Can be either `classic`, `loggly`, `logplex` or `blank`. Default `classic`
:param pulumi.Input[str] path: The path to upload logs to. Must end with a trailing slash. If this field is left empty, the files will be saved in the container's root path
:param pulumi.Input[int] period: How frequently the logs should be transferred in seconds. Default `3600`
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] timestamp_format: `strftime` specified timestamp formatting. Default `%Y-%m-%dT%H:%M:%S.000`
"""
pulumi.set(__self__, "account_name", account_name)
pulumi.set(__self__, "container", container)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "sas_token", sas_token)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if file_max_bytes is not None:
pulumi.set(__self__, "file_max_bytes", file_max_bytes)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="accountName")
def account_name(self) -> pulumi.Input[str]:
"""
The unique Azure Blob Storage namespace in which your data objects are stored
"""
return pulumi.get(self, "account_name")
@account_name.setter
def account_name(self, value: pulumi.Input[str]):
pulumi.set(self, "account_name", value)
@property
@pulumi.getter
def container(self) -> pulumi.Input[str]:
"""
The name of the Azure Blob Storage container in which to store logs
"""
return pulumi.get(self, "container")
@container.setter
def container(self, value: pulumi.Input[str]):
pulumi.set(self, "container", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify the Azure Blob Storage endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="sasToken")
def sas_token(self) -> pulumi.Input[str]:
"""
The Azure shared access signature providing write access to the blob service objects. Be sure to update your token before it expires or the logging functionality will not work
"""
return pulumi.get(self, "sas_token")
@sas_token.setter
def sas_token(self, value: pulumi.Input[str]):
pulumi.set(self, "sas_token", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are zstd, snappy, and gzip. If the specified codec is "gzip", gzip_level will default to 3. To specify a different level, leave compression_codec blank and explicitly set the level using gzip_level. Specifying both compression_codec and gzip_level in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter(name="fileMaxBytes")
def file_max_bytes(self) -> Optional[pulumi.Input[int]]:
"""
Maximum size of an uploaded log file, if non-zero.
"""
return pulumi.get(self, "file_max_bytes")
@file_max_bytes.setter
def file_max_bytes(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "file_max_bytes", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
Level of Gzip compression from `0-9`. `0` means no compression. `1` is the fastest and the least compressed version, `9` is the slowest and the most compressed version. Default `0`
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted. Can be either `classic`, `loggly`, `logplex` or `blank`. Default `classic`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
The path to upload logs to. Must end with a trailing slash. If this field is left empty, the files will be saved in the container's root path
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred in seconds. Default `3600`
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
`strftime` specified timestamp formatting. Default `%Y-%m-%dT%H:%M:%S.000`
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
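# Usage sketch (illustrative only; the storage account, container, and token
# names are assumptions): an Azure Blob Storage logging endpoint is typically
# declared like
#     fastly.ServiceComputeBlobstorageloggingArgs(
#         name="azure-logs",
#         account_name="mystorageaccount",
#         container="fastly-logs",
#         sas_token=azure_sas_token,  # hypothetical variable, e.g. read from config/secret store
#         path="compute/",
#         period=300,
#     )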
@pulumi.input_type
class ServiceComputeDictionaryArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
dictionary_id: Optional[pulumi.Input[str]] = None,
force_destroy: Optional[pulumi.Input[bool]] = None,
write_only: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] name: A unique name to identify this dictionary. It is important to note that changing this attribute will delete and recreate the dictionary, and discard the current items in the dictionary
:param pulumi.Input[str] dictionary_id: The ID of the dictionary
:param pulumi.Input[bool] force_destroy: Allow the dictionary to be deleted, even if it contains entries. Default `false`
:param pulumi.Input[bool] write_only: If `true`, the dictionary is a [private dictionary](https://docs.fastly.com/en/guides/private-dictionaries). Default is `false`. Please note that changing this attribute will delete and recreate the dictionary, and discard the current items in the dictionary. `Servicev1` resource will only manage the dictionary object itself, and items under private dictionaries can not be managed using `ServiceDictionaryItemsv1` resource. Therefore, using a write-only/private dictionary should only be done if the items are managed outside of the provider
"""
pulumi.set(__self__, "name", name)
if dictionary_id is not None:
pulumi.set(__self__, "dictionary_id", dictionary_id)
if force_destroy is not None:
pulumi.set(__self__, "force_destroy", force_destroy)
if write_only is not None:
pulumi.set(__self__, "write_only", write_only)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this dictionary. It is important to note that changing this attribute will delete and recreate the dictionary, and discard the current items in the dictionary
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="dictionaryId")
def dictionary_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the dictionary
"""
return pulumi.get(self, "dictionary_id")
@dictionary_id.setter
def dictionary_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dictionary_id", value)
@property
@pulumi.getter(name="forceDestroy")
def force_destroy(self) -> Optional[pulumi.Input[bool]]:
"""
Allow the dictionary to be deleted, even if it contains entries. Default `false`
"""
return pulumi.get(self, "force_destroy")
@force_destroy.setter
def force_destroy(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_destroy", value)
@property
@pulumi.getter(name="writeOnly")
def write_only(self) -> Optional[pulumi.Input[bool]]:
"""
If `true`, the dictionary is a [private dictionary](https://docs.fastly.com/en/guides/private-dictionaries). Default is `false`. Please note that changing this attribute will delete and recreate the dictionary, and discard the current items in the dictionary. `Servicev1` resource will only manage the dictionary object itself, and items under private dictionaries can not be managed using `ServiceDictionaryItemsv1` resource. Therefore, using a write-only/private dictionary should only be done if the items are managed outside of the provider
"""
return pulumi.get(self, "write_only")
@write_only.setter
def write_only(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "write_only", value)
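# Usage sketch (illustrative only; the dictionary name is an assumption):
#     fastly.ServiceComputeDictionaryArgs(
#         name="geo_overrides",
#         force_destroy=True,
#     )
# Note that renaming the dictionary recreates it and discards its items.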
@pulumi.input_type
class ServiceComputeDirectorArgs:
def __init__(__self__, *,
backends: pulumi.Input[Sequence[pulumi.Input[str]]],
name: pulumi.Input[str],
capacity: Optional[pulumi.Input[int]] = None,
comment: Optional[pulumi.Input[str]] = None,
quorum: Optional[pulumi.Input[int]] = None,
retries: Optional[pulumi.Input[int]] = None,
shield: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[Sequence[pulumi.Input[str]]] backends: Names of defined backends to map the director to. Example: `[ "origin1", "origin2" ]`
:param pulumi.Input[str] name: Unique name for this Director. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[int] capacity: Load balancing weight for the backends. Default `100`
:param pulumi.Input[str] comment: An optional comment about the Director
:param pulumi.Input[int] quorum: Percentage of capacity that needs to be up for the director itself to be considered up. Default `75`
:param pulumi.Input[int] retries: How many backends to search if it fails. Default `5`
:param pulumi.Input[str] shield: Selected POP to serve as a "shield" for backends. Valid values for `shield` are included in the [`GET /datacenters`](https://developer.fastly.com/reference/api/utils/datacenter/) API response
:param pulumi.Input[int] type: Type of load balance group to use. Integer, 1 to 4. Values: `1` (random), `3` (hash), `4` (client). Default `1`
"""
pulumi.set(__self__, "backends", backends)
pulumi.set(__self__, "name", name)
if capacity is not None:
pulumi.set(__self__, "capacity", capacity)
if comment is not None:
pulumi.set(__self__, "comment", comment)
if quorum is not None:
pulumi.set(__self__, "quorum", quorum)
if retries is not None:
pulumi.set(__self__, "retries", retries)
if shield is not None:
pulumi.set(__self__, "shield", shield)
if type is not None:
pulumi.set(__self__, "type", type)
@property
@pulumi.getter
def backends(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Names of defined backends to map the director to. Example: `[ "origin1", "origin2" ]`
"""
return pulumi.get(self, "backends")
@backends.setter
def backends(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "backends", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Unique name for this Director. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def capacity(self) -> Optional[pulumi.Input[int]]:
"""
Load balancing weight for the backends. Default `100`
"""
return pulumi.get(self, "capacity")
@capacity.setter
def capacity(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "capacity", value)
@property
@pulumi.getter
def comment(self) -> Optional[pulumi.Input[str]]:
"""
An optional comment about the Director
"""
return pulumi.get(self, "comment")
@comment.setter
def comment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "comment", value)
@property
@pulumi.getter
def quorum(self) -> Optional[pulumi.Input[int]]:
"""
Percentage of capacity that needs to be up for the director itself to be considered up. Default `75`
"""
return pulumi.get(self, "quorum")
@quorum.setter
def quorum(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "quorum", value)
@property
@pulumi.getter
def retries(self) -> Optional[pulumi.Input[int]]:
"""
How many backends to search if it fails. Default `5`
"""
return pulumi.get(self, "retries")
@retries.setter
def retries(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "retries", value)
@property
@pulumi.getter
def shield(self) -> Optional[pulumi.Input[str]]:
"""
Selected POP to serve as a "shield" for backends. Valid values for `shield` are included in the [`GET /datacenters`](https://developer.fastly.com/reference/api/utils/datacenter/) API response
"""
return pulumi.get(self, "shield")
@shield.setter
def shield(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "shield", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[int]]:
"""
Type of load balance group to use. Integer, 1 to 4. Values: `1` (random), `3` (hash), `4` (client). Default `1`
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "type", value)
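# Usage sketch (illustrative only; the director and backend names are
# assumptions): a director that hashes requests across two named backends
#     fastly.ServiceComputeDirectorArgs(
#         name="eu-director",
#         backends=["origin1", "origin2"],
#         quorum=75,
#         type=3,  # hash
#     )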
@pulumi.input_type
class ServiceComputeDomainArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
comment: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The domain that this Service will respond to. It is important to note that changing this attribute will delete and recreate the resource.
:param pulumi.Input[str] comment: An optional comment about the Domain.
"""
pulumi.set(__self__, "name", name)
if comment is not None:
pulumi.set(__self__, "comment", comment)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The domain that this Service will respond to. It is important to note that changing this attribute will delete and recreate the resource.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def comment(self) -> Optional[pulumi.Input[str]]:
"""
An optional comment about the Domain.
"""
return pulumi.get(self, "comment")
@comment.setter
def comment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "comment", value)
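# Usage sketch (illustrative only; the domain is an assumption):
#     fastly.ServiceComputeDomainArgs(
#         name="demo.example.com",
#         comment="demo domain",
#     )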
@pulumi.input_type
class ServiceComputeGcsloggingArgs:
def __init__(__self__, *,
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
email: Optional[pulumi.Input[str]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
secret_key: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] bucket_name: The name of the bucket in which to store the logs
:param pulumi.Input[str] name: A unique name to identify this GCS endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are zstd, snappy, and gzip. If the specified codec is "gzip", gzip_level will default to 3. To specify a different level, leave compression_codec blank and explicitly set the level using gzip_level. Specifying both compression_codec and gzip_level in the same API request will result in an error.
:param pulumi.Input[str] email: The email address associated with the target GCS bucket on your account. You may optionally provide this secret via an environment variable, `FASTLY_GCS_EMAIL`
:param pulumi.Input[int] gzip_level: Level of Gzip compression, from `0-9`. `0` is no compression. `1` is fastest and least compressed, `9` is slowest and most compressed. Default `0`
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. [Fastly Documentation](https://developer.fastly.com/reference/api/logging/gcs/)
:param pulumi.Input[str] path: Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
:param pulumi.Input[int] period: How frequently the logs should be transferred, in seconds (Default 3600)
:param pulumi.Input[str] secret_key: The secret key associated with the target GCS bucket on your account. You may optionally provide this secret via an environment variable, `FASTLY_GCS_SECRET_KEY`. A typical format for the key is PEM format, containing actual newline characters where required
:param pulumi.Input[str] timestamp_format: The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if email is not None:
pulumi.set(__self__, "email", email)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if secret_key is not None:
pulumi.set(__self__, "secret_key", secret_key)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of the bucket in which to store the logs
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this GCS endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def email(self) -> Optional[pulumi.Input[str]]:
"""
The email address associated with the target GCS bucket on your account. You may optionally provide this secret via an environment variable, `FASTLY_GCS_EMAIL`
"""
return pulumi.get(self, "email")
@email.setter
def email(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "email", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
Level of Gzip compression, from `0-9`. `0` is no compression. `1` is fastest and least compressed, `9` is slowest and most compressed. Default `0`
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. [Fastly Documentation](https://developer.fastly.com/reference/api/logging/gcs/)
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred, in seconds (Default 3600)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> Optional[pulumi.Input[str]]:
"""
The secret key associated with the target GCS bucket on your account. You may optionally provide this secret via an environment variable, `FASTLY_GCS_SECRET_KEY`. A typical format for the key is PEM format, containing actual newline characters where required
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
@pulumi.input_type
class ServiceComputeHealthcheckArgs:
def __init__(__self__, *,
host: pulumi.Input[str],
name: pulumi.Input[str],
path: pulumi.Input[str],
check_interval: Optional[pulumi.Input[int]] = None,
expected_response: Optional[pulumi.Input[int]] = None,
http_version: Optional[pulumi.Input[str]] = None,
initial: Optional[pulumi.Input[int]] = None,
method: Optional[pulumi.Input[str]] = None,
threshold: Optional[pulumi.Input[int]] = None,
timeout: Optional[pulumi.Input[int]] = None,
window: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] host: The Host header to send for this Healthcheck
:param pulumi.Input[str] name: A unique name to identify this Healthcheck. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] path: The path to check
:param pulumi.Input[int] check_interval: How often to run the Healthcheck in milliseconds. Default `5000`
:param pulumi.Input[int] expected_response: The status code expected from the host. Default `200`
:param pulumi.Input[str] http_version: Whether to use version 1.0 or 1.1 HTTP. Default `1.1`
:param pulumi.Input[int] initial: When loading a config, the initial number of probes to be seen as OK. Default `3`
:param pulumi.Input[str] method: Which HTTP method to use. Default `HEAD`
:param pulumi.Input[int] threshold: How many Healthchecks must succeed to be considered healthy. Default `3`
:param pulumi.Input[int] timeout: Timeout in milliseconds. Default `500`
:param pulumi.Input[int] window: The number of most recent Healthcheck queries to keep for this Healthcheck. Default `5`
"""
pulumi.set(__self__, "host", host)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "path", path)
if check_interval is not None:
pulumi.set(__self__, "check_interval", check_interval)
if expected_response is not None:
pulumi.set(__self__, "expected_response", expected_response)
if http_version is not None:
pulumi.set(__self__, "http_version", http_version)
if initial is not None:
pulumi.set(__self__, "initial", initial)
if method is not None:
pulumi.set(__self__, "method", method)
if threshold is not None:
pulumi.set(__self__, "threshold", threshold)
if timeout is not None:
pulumi.set(__self__, "timeout", timeout)
if window is not None:
pulumi.set(__self__, "window", window)
@property
@pulumi.getter
def host(self) -> pulumi.Input[str]:
"""
The Host header to send for this Healthcheck
"""
return pulumi.get(self, "host")
@host.setter
def host(self, value: pulumi.Input[str]):
pulumi.set(self, "host", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Healthcheck. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def path(self) -> pulumi.Input[str]:
"""
The path to check
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: pulumi.Input[str]):
pulumi.set(self, "path", value)
@property
@pulumi.getter(name="checkInterval")
def check_interval(self) -> Optional[pulumi.Input[int]]:
"""
How often to run the Healthcheck in milliseconds. Default `5000`
"""
return pulumi.get(self, "check_interval")
@check_interval.setter
def check_interval(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "check_interval", value)
@property
@pulumi.getter(name="expectedResponse")
def expected_response(self) -> Optional[pulumi.Input[int]]:
"""
The status code expected from the host. Default `200`
"""
return pulumi.get(self, "expected_response")
@expected_response.setter
def expected_response(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "expected_response", value)
@property
@pulumi.getter(name="httpVersion")
def http_version(self) -> Optional[pulumi.Input[str]]:
"""
Whether to use version 1.0 or 1.1 HTTP. Default `1.1`
"""
return pulumi.get(self, "http_version")
@http_version.setter
def http_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "http_version", value)
@property
@pulumi.getter
def initial(self) -> Optional[pulumi.Input[int]]:
"""
When loading a config, the initial number of probes to be seen as OK. Default `3`
"""
return pulumi.get(self, "initial")
@initial.setter
def initial(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "initial", value)
@property
@pulumi.getter
def method(self) -> Optional[pulumi.Input[str]]:
"""
Which HTTP method to use. Default `HEAD`
"""
return pulumi.get(self, "method")
@method.setter
def method(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "method", value)
@property
@pulumi.getter
def threshold(self) -> Optional[pulumi.Input[int]]:
"""
How many Healthchecks must succeed to be considered healthy. Default `3`
"""
return pulumi.get(self, "threshold")
@threshold.setter
def threshold(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "threshold", value)
@property
@pulumi.getter
def timeout(self) -> Optional[pulumi.Input[int]]:
"""
Timeout in milliseconds. Default `500`
"""
return pulumi.get(self, "timeout")
@timeout.setter
def timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "timeout", value)
@property
@pulumi.getter
def window(self) -> Optional[pulumi.Input[int]]:
"""
The number of most recent Healthcheck queries to keep for this Healthcheck. Default `5`
"""
return pulumi.get(self, "window")
@window.setter
def window(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "window", value)
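# The classes in this module all follow the same generated pattern: snake_case
# Python properties backed by camelCase wire-format keys, with optional fields
# stored only when a value is provided. The sketch below (not part of the
# generated SDK; `_values` and `_CAMEL` are illustrative stand-ins for pulumi's
# internal get/set machinery) shows the shape of that pattern in plain Python.

```python
from typing import Optional


class HealthcheckArgsSketch:
    """Illustrative stand-in for a generated @pulumi.input_type args class."""

    # Maps the snake_case Python attribute to its camelCase wire-format key,
    # mirroring @pulumi.getter(name="checkInterval") in the generated code.
    _CAMEL = {"check_interval": "checkInterval"}

    def __init__(self, host: str, check_interval: Optional[int] = None):
        self._values = {}
        self._values["host"] = host
        # Optional fields are only stored when explicitly provided,
        # matching the `if check_interval is not None:` guards above.
        if check_interval is not None:
            self._values[self._CAMEL["check_interval"]] = check_interval

    @property
    def check_interval(self) -> Optional[int]:
        return self._values.get(self._CAMEL["check_interval"])

    @check_interval.setter
    def check_interval(self, value: Optional[int]) -> None:
        self._values[self._CAMEL["check_interval"]] = value
```

# Constructing `HealthcheckArgsSketch(host="example.com")` leaves
# `checkInterval` absent until the property setter is used, at which point the
# value lands under the camelCase key, just as `pulumi.set` does here.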
@pulumi.input_type
class ServiceComputeHttpsloggingArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
url: pulumi.Input[str],
content_type: Optional[pulumi.Input[str]] = None,
header_name: Optional[pulumi.Input[str]] = None,
header_value: Optional[pulumi.Input[str]] = None,
json_format: Optional[pulumi.Input[str]] = None,
message_type: Optional[pulumi.Input[str]] = None,
method: Optional[pulumi.Input[str]] = None,
request_max_bytes: Optional[pulumi.Input[int]] = None,
request_max_entries: Optional[pulumi.Input[int]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the HTTPS logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] url: URL that log data will be sent to. Must use the https protocol
:param pulumi.Input[str] content_type: Value of the `Content-Type` header sent with the request
:param pulumi.Input[str] header_name: Custom header sent with the request
:param pulumi.Input[str] header_value: Value of the custom header sent with the request
:param pulumi.Input[str] json_format: Formats log entries as JSON. Can be either disabled (`0`), array of json (`1`), or newline delimited json (`2`)
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `blank`
:param pulumi.Input[str] method: HTTP method used for request. Can be either `POST` or `PUT`. Default `POST`
:param pulumi.Input[int] request_max_bytes: The maximum number of bytes sent in one request
:param pulumi.Input[int] request_max_entries: The maximum number of logs sent in one request
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_hostname: Used during the TLS handshake to validate the certificate
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "url", url)
if content_type is not None:
pulumi.set(__self__, "content_type", content_type)
if header_name is not None:
pulumi.set(__self__, "header_name", header_name)
if header_value is not None:
pulumi.set(__self__, "header_value", header_value)
if json_format is not None:
pulumi.set(__self__, "json_format", json_format)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if method is not None:
pulumi.set(__self__, "method", method)
if request_max_bytes is not None:
pulumi.set(__self__, "request_max_bytes", request_max_bytes)
if request_max_entries is not None:
pulumi.set(__self__, "request_max_entries", request_max_entries)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the HTTPS logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
URL that log data will be sent to. Must use the https protocol
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter(name="contentType")
def content_type(self) -> Optional[pulumi.Input[str]]:
"""
Value of the `Content-Type` header sent with the request
"""
return pulumi.get(self, "content_type")
@content_type.setter
def content_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content_type", value)
@property
@pulumi.getter(name="headerName")
def header_name(self) -> Optional[pulumi.Input[str]]:
"""
Custom header sent with the request
"""
return pulumi.get(self, "header_name")
@header_name.setter
def header_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "header_name", value)
@property
@pulumi.getter(name="headerValue")
def header_value(self) -> Optional[pulumi.Input[str]]:
"""
Value of the custom header sent with the request
"""
return pulumi.get(self, "header_value")
@header_value.setter
def header_value(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "header_value", value)
@property
@pulumi.getter(name="jsonFormat")
def json_format(self) -> Optional[pulumi.Input[str]]:
"""
Formats log entries as JSON. Can be either disabled (`0`), array of json (`1`), or newline delimited json (`2`)
"""
return pulumi.get(self, "json_format")
@json_format.setter
def json_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "json_format", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `blank`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def method(self) -> Optional[pulumi.Input[str]]:
"""
HTTP method used for request. Can be either `POST` or `PUT`. Default `POST`
"""
return pulumi.get(self, "method")
@method.setter
def method(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "method", value)
@property
@pulumi.getter(name="requestMaxBytes")
def request_max_bytes(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of bytes sent in one request
"""
return pulumi.get(self, "request_max_bytes")
@request_max_bytes.setter
def request_max_bytes(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_bytes", value)
@property
@pulumi.getter(name="requestMaxEntries")
def request_max_entries(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of logs sent in one request
"""
return pulumi.get(self, "request_max_entries")
@request_max_entries.setter
def request_max_entries(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_entries", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Used during the TLS handshake to validate the certificate
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@pulumi.input_type
class ServiceComputeLogentryArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
port: Optional[pulumi.Input[int]] = None,
use_tls: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Logentries logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: Use token based authentication (https://logentries.com/doc/input-token/)
:param pulumi.Input[int] port: The port number configured in Logentries
:param pulumi.Input[bool] use_tls: Whether to use TLS for secure logging
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if port is not None:
pulumi.set(__self__, "port", port)
if use_tls is not None:
pulumi.set(__self__, "use_tls", use_tls)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Logentries logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
Use token based authentication (https://logentries.com/doc/input-token/)
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port number configured in Logentries
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="useTls")
def use_tls(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to use TLS for secure logging
"""
return pulumi.get(self, "use_tls")
@use_tls.setter
def use_tls(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "use_tls", value)
@pulumi.input_type
class ServiceComputeLoggingCloudfileArgs:
def __init__(__self__, *,
access_key: pulumi.Input[str],
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
user: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
region: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] access_key: Your Cloud File account access key
:param pulumi.Input[str] bucket_name: The name of your Cloud Files container
:param pulumi.Input[str] name: The unique name of the Rackspace Cloud Files logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] user: The username for your Cloud Files account
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[int] gzip_level: What level of GZIP encoding to have when dumping logs (default `0`, no compression)
:param pulumi.Input[str] message_type: How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
:param pulumi.Input[str] path: The path to upload logs to
:param pulumi.Input[int] period: How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
:param pulumi.Input[str] public_key: The PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] region: The region to stream logs to. One of: DFW (Dallas), ORD (Chicago), IAD (Northern Virginia), LON (London), SYD (Sydney), HKG (Hong Kong)
:param pulumi.Input[str] timestamp_format: The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "access_key", access_key)
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "user", user)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if region is not None:
pulumi.set(__self__, "region", region)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> pulumi.Input[str]:
"""
Your Cloud File account access key
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: pulumi.Input[str]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of your Cloud Files container
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Rackspace Cloud Files logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
The username for your Cloud Files account
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
What level of GZIP encoding to have when dumping logs (default `0`, no compression)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
The path to upload logs to
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
The PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The region to stream logs to. One of: DFW (Dallas), ORD (Chicago), IAD (Northern Virginia), LON (London), SYD (Sydney), HKG (Hong Kong)
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
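# The default `timestamp_format` documented above is an ordinary `strftime`
# pattern, so its effect can be previewed with the standard library alone,
# independent of this provider. Note that the trailing `.000` contains no
# format directives and is emitted verbatim, not as real sub-second precision.

```python
from datetime import datetime

# The default timestamp_format used by several Fastly logging endpoints.
fmt = "%Y-%m-%dT%H:%M:%S.000"

# Render a fixed instant with the pattern; the literal ".000" suffix
# passes through strftime unchanged.
stamp = datetime(2021, 1, 2, 3, 4, 5).strftime(fmt)
print(stamp)  # 2021-01-02T03:04:05.000
```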
@pulumi.input_type
class ServiceComputeLoggingDatadogArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
region: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Datadog logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The API key from your Datadog account
:param pulumi.Input[str] region: The region that log data will be sent to. One of `US` or `EU`. Defaults to `US` if undefined
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if region is not None:
pulumi.set(__self__, "region", region)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Datadog logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The API key from your Datadog account
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The region that log data will be sent to. One of `US` or `EU`. Defaults to `US` if undefined
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@pulumi.input_type
class ServiceComputeLoggingDigitaloceanArgs:
def __init__(__self__, *,
access_key: pulumi.Input[str],
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
secret_key: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
domain: Optional[pulumi.Input[str]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] access_key: Your DigitalOcean Spaces account access key
:param pulumi.Input[str] bucket_name: The name of the DigitalOcean Space
:param pulumi.Input[str] name: The unique name of the DigitalOcean Spaces logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] secret_key: Your DigitalOcean Spaces account secret key
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] domain: The domain of the DigitalOcean Spaces endpoint (default `nyc3.digitaloceanspaces.com`)
:param pulumi.Input[int] gzip_level: What level of Gzip encoding to have when dumping logs (default `0`, no compression)
:param pulumi.Input[str] message_type: How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
:param pulumi.Input[str] path: The path to upload logs to
:param pulumi.Input[int] period: How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] timestamp_format: `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "access_key", access_key)
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "secret_key", secret_key)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if domain is not None:
pulumi.set(__self__, "domain", domain)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> pulumi.Input[str]:
"""
Your DigitalOcean Spaces account access key
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: pulumi.Input[str]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of the DigitalOcean Space
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the DigitalOcean Spaces logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> pulumi.Input[str]:
"""
Your DigitalOcean Spaces account secret key
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: pulumi.Input[str]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def domain(self) -> Optional[pulumi.Input[str]]:
"""
The domain of the DigitalOcean Spaces endpoint (default `nyc3.digitaloceanspaces.com`)
"""
return pulumi.get(self, "domain")
@domain.setter
def domain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
What level of Gzip encoding to have when dumping logs (default `0`, no compression)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
The path to upload logs to
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
`strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
@pulumi.input_type
class ServiceComputeLoggingElasticsearchArgs:
def __init__(__self__, *,
index: pulumi.Input[str],
name: pulumi.Input[str],
url: pulumi.Input[str],
password: Optional[pulumi.Input[str]] = None,
pipeline: Optional[pulumi.Input[str]] = None,
request_max_bytes: Optional[pulumi.Input[int]] = None,
request_max_entries: Optional[pulumi.Input[int]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None,
user: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] index: The name of the Elasticsearch index to send documents (logs) to
:param pulumi.Input[str] name: The unique name of the Elasticsearch logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] url: The Elasticsearch URL to stream logs to
:param pulumi.Input[str] password: BasicAuth password for Elasticsearch
:param pulumi.Input[str] pipeline: The ID of the Elasticsearch ingest pipeline to apply pre-process transformations to before indexing
:param pulumi.Input[int] request_max_bytes: The maximum number of bytes sent in one request. Defaults to `0` for unbounded
:param pulumi.Input[int] request_max_entries: The maximum number of logs sent in one request. Defaults to `0` for unbounded
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_hostname: The hostname used to verify the server's certificate. It can either be the Common Name (CN) or a Subject Alternative Name (SAN)
:param pulumi.Input[str] user: BasicAuth username for Elasticsearch
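A rough sketch of the `request_max_bytes` semantics (a hypothetical batching helper, not part of this SDK): `0` means a single unbounded batch, otherwise a batch is flushed before adding a line would exceed the limit:

```python
def batch_by_bytes(lines, request_max_bytes):
    # Hypothetical illustration of request_max_bytes: 0 means unbounded
    # (everything in one batch); otherwise flush the current batch when the
    # next line would push it past the byte limit.
    if request_max_bytes == 0:
        return [lines]
    batches, current, size = [], [], 0
    for line in lines:
        n = len(line.encode())
        if current and size + n > request_max_bytes:
            batches.append(current)
            current, size = [], 0
        current.append(line)
        size += n
    if current:
        batches.append(current)
    return batches
```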
"""
pulumi.set(__self__, "index", index)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "url", url)
if password is not None:
pulumi.set(__self__, "password", password)
if pipeline is not None:
pulumi.set(__self__, "pipeline", pipeline)
if request_max_bytes is not None:
pulumi.set(__self__, "request_max_bytes", request_max_bytes)
if request_max_entries is not None:
pulumi.set(__self__, "request_max_entries", request_max_entries)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
if user is not None:
pulumi.set(__self__, "user", user)
@property
@pulumi.getter
def index(self) -> pulumi.Input[str]:
"""
The name of the Elasticsearch index to send documents (logs) to
"""
return pulumi.get(self, "index")
@index.setter
def index(self, value: pulumi.Input[str]):
pulumi.set(self, "index", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Elasticsearch logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The Elasticsearch URL to stream logs to
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
BasicAuth password for Elasticsearch
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def pipeline(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the Elasticsearch ingest pipeline to apply pre-process transformations to before indexing
"""
return pulumi.get(self, "pipeline")
@pipeline.setter
def pipeline(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "pipeline", value)
@property
@pulumi.getter(name="requestMaxBytes")
def request_max_bytes(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of bytes sent in one request. Defaults to `0` for unbounded
"""
return pulumi.get(self, "request_max_bytes")
@request_max_bytes.setter
def request_max_bytes(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_bytes", value)
@property
@pulumi.getter(name="requestMaxEntries")
def request_max_entries(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of logs sent in one request. Defaults to `0` for unbounded
"""
return pulumi.get(self, "request_max_entries")
@request_max_entries.setter
def request_max_entries(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_entries", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
The hostname used to verify the server's certificate. It can either be the Common Name (CN) or a Subject Alternative Name (SAN)
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@property
@pulumi.getter
def user(self) -> Optional[pulumi.Input[str]]:
"""
BasicAuth username for Elasticsearch
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user", value)
@pulumi.input_type
class ServiceComputeLoggingFtpArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
password: pulumi.Input[str],
path: pulumi.Input[str],
user: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
port: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] address: The FTP address to stream logs to
:param pulumi.Input[str] name: The unique name of the FTP logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] password: The password for the server (for anonymous use an email address)
:param pulumi.Input[str] path: The path to upload log files to. If the path ends in `/` then it is treated as a directory
:param pulumi.Input[str] user: The username for the server (can be `anonymous`)
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[int] gzip_level: Gzip Compression level. Default `0`
:param pulumi.Input[str] message_type: How the message should be formatted (default: `classic`)
:param pulumi.Input[int] period: How frequently the logs should be transferred, in seconds (Default `3600`)
:param pulumi.Input[int] port: The port number. Default: `21`
:param pulumi.Input[str] public_key: The PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] timestamp_format: `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
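The `compression_codec`/`gzip_level` exclusivity rule above can be mirrored client-side; a hypothetical guard (not part of this SDK) might look like:

```python
def check_compression(compression_codec=None, gzip_level=None):
    # Mirrors the documented API rule: sending both compression_codec and
    # gzip_level in the same request is an error, so reject that early.
    if compression_codec is not None and gzip_level is not None:
        raise ValueError("set either compression_codec or gzip_level, not both")
    return compression_codec, gzip_level
```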
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "password", password)
pulumi.set(__self__, "path", path)
pulumi.set(__self__, "user", user)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if period is not None:
pulumi.set(__self__, "period", period)
if port is not None:
pulumi.set(__self__, "port", port)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
The FTP address to stream logs to
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the FTP logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def password(self) -> pulumi.Input[str]:
"""
The password for the server (for anonymous use an email address)
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: pulumi.Input[str]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def path(self) -> pulumi.Input[str]:
"""
The path to upload log files to. If the path ends in `/` then it is treated as a directory
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: pulumi.Input[str]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
The username for the server (can be `anonymous`)
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
Gzip Compression level. Default `0`
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted (default: `classic`)
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred, in seconds (Default `3600`)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port number. Default: `21`
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
The PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
`strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
@pulumi.input_type
class ServiceComputeLoggingGooglepubsubArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
project_id: pulumi.Input[str],
secret_key: pulumi.Input[str],
topic: pulumi.Input[str],
user: pulumi.Input[str]):
"""
:param pulumi.Input[str] name: The unique name of the Google Cloud Pub/Sub logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] project_id: The ID of your Google Cloud Platform project
:param pulumi.Input[str] secret_key: Your Google Cloud Platform account secret key. The `private_key` field in your service account authentication JSON. You may optionally provide this secret via an environment variable, `FASTLY_GOOGLE_PUBSUB_SECRET_KEY`.
:param pulumi.Input[str] topic: The Google Cloud Pub/Sub topic to which logs will be published
:param pulumi.Input[str] user: Your Google Cloud Platform service account email address. The `client_email` field in your service account authentication JSON. You may optionally provide this via an environment variable, `FASTLY_GOOGLE_PUBSUB_EMAIL`.
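Since both credentials can be supplied through the documented environment variables, a sketch of reading them (rather than hard-coding secrets):

```python
import os

# Fall back to the environment variables named above instead of embedding
# the service-account credentials in source; empty string if unset.
secret_key = os.environ.get("FASTLY_GOOGLE_PUBSUB_SECRET_KEY", "")
user = os.environ.get("FASTLY_GOOGLE_PUBSUB_EMAIL", "")
```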
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "project_id", project_id)
pulumi.set(__self__, "secret_key", secret_key)
pulumi.set(__self__, "topic", topic)
pulumi.set(__self__, "user", user)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Google Cloud Pub/Sub logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="projectId")
def project_id(self) -> pulumi.Input[str]:
"""
The ID of your Google Cloud Platform project
"""
return pulumi.get(self, "project_id")
@project_id.setter
def project_id(self, value: pulumi.Input[str]):
pulumi.set(self, "project_id", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> pulumi.Input[str]:
"""
Your Google Cloud Platform account secret key. The `private_key` field in your service account authentication JSON. You may optionally provide this secret via an environment variable, `FASTLY_GOOGLE_PUBSUB_SECRET_KEY`.
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: pulumi.Input[str]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter
def topic(self) -> pulumi.Input[str]:
"""
The Google Cloud Pub/Sub topic to which logs will be published
"""
return pulumi.get(self, "topic")
@topic.setter
def topic(self, value: pulumi.Input[str]):
pulumi.set(self, "topic", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
Your Google Cloud Platform service account email address. The `client_email` field in your service account authentication JSON. You may optionally provide this via an environment variable, `FASTLY_GOOGLE_PUBSUB_EMAIL`.
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@pulumi.input_type
class ServiceComputeLoggingHerokuArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
url: pulumi.Input[str]):
"""
:param pulumi.Input[str] name: The unique name of the Heroku logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The token to use for authentication (https://www.heroku.com/docs/customer-token-authentication-token/)
:param pulumi.Input[str] url: The URL to stream logs to
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
pulumi.set(__self__, "url", url)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Heroku logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The token to use for authentication (https://www.heroku.com/docs/customer-token-authentication-token/)
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The URL to stream logs to
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@pulumi.input_type
class ServiceComputeLoggingHoneycombArgs:
def __init__(__self__, *,
dataset: pulumi.Input[str],
name: pulumi.Input[str],
token: pulumi.Input[str]):
"""
:param pulumi.Input[str] dataset: The Honeycomb Dataset you want to log to
:param pulumi.Input[str] name: The unique name of the Honeycomb logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The Write Key from the Account page of your Honeycomb account
"""
pulumi.set(__self__, "dataset", dataset)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
@property
@pulumi.getter
def dataset(self) -> pulumi.Input[str]:
"""
The Honeycomb Dataset you want to log to
"""
return pulumi.get(self, "dataset")
@dataset.setter
def dataset(self, value: pulumi.Input[str]):
pulumi.set(self, "dataset", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Honeycomb logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The Write Key from the Account page of your Honeycomb account
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@pulumi.input_type
class ServiceComputeLoggingKafkaArgs:
def __init__(__self__, *,
brokers: pulumi.Input[str],
name: pulumi.Input[str],
topic: pulumi.Input[str],
auth_method: Optional[pulumi.Input[str]] = None,
compression_codec: Optional[pulumi.Input[str]] = None,
parse_log_keyvals: Optional[pulumi.Input[bool]] = None,
password: Optional[pulumi.Input[str]] = None,
request_max_bytes: Optional[pulumi.Input[int]] = None,
required_acks: Optional[pulumi.Input[str]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None,
use_tls: Optional[pulumi.Input[bool]] = None,
user: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] brokers: A comma-separated list of IP addresses or hostnames of Kafka brokers
:param pulumi.Input[str] name: The unique name of the Kafka logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] topic: The Kafka topic to send logs to
:param pulumi.Input[str] auth_method: SASL authentication method. One of: `plain`, `scram-sha-256`, `scram-sha-512`
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. One of: `gzip`, `snappy`, `lz4`
:param pulumi.Input[bool] parse_log_keyvals: Enables parsing of key=value tuples from the beginning of a logline, turning them into record headers
:param pulumi.Input[str] password: SASL Pass
:param pulumi.Input[int] request_max_bytes: Maximum size of log batch, if non-zero. Defaults to 0 for unbounded
:param pulumi.Input[str] required_acks: The number of acknowledgements a leader must receive before a write is considered successful. One of: `1` (default) One server needs to respond. `0` No servers need to respond. `-1` Wait for all in-sync replicas to respond
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_hostname: The hostname used to verify the server's certificate. It can either be the Common Name or a Subject Alternative Name (SAN)
:param pulumi.Input[bool] use_tls: Whether to use TLS for secure logging. Can be either `true` or `false`
:param pulumi.Input[str] user: SASL User
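`brokers` is a single comma-separated string, so a list of endpoints (hostnames here are hypothetical) is typically joined before being passed in:

```python
# Join individual broker endpoints into the comma-separated form the
# `brokers` field expects.
broker_list = ["kafka-1.example.com:9092", "kafka-2.example.com:9092"]
brokers = ",".join(broker_list)
# -> "kafka-1.example.com:9092,kafka-2.example.com:9092"
```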
"""
pulumi.set(__self__, "brokers", brokers)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "topic", topic)
if auth_method is not None:
pulumi.set(__self__, "auth_method", auth_method)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if parse_log_keyvals is not None:
pulumi.set(__self__, "parse_log_keyvals", parse_log_keyvals)
if password is not None:
pulumi.set(__self__, "password", password)
if request_max_bytes is not None:
pulumi.set(__self__, "request_max_bytes", request_max_bytes)
if required_acks is not None:
pulumi.set(__self__, "required_acks", required_acks)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
if use_tls is not None:
pulumi.set(__self__, "use_tls", use_tls)
if user is not None:
pulumi.set(__self__, "user", user)
@property
@pulumi.getter
def brokers(self) -> pulumi.Input[str]:
"""
A comma-separated list of IP addresses or hostnames of Kafka brokers
"""
return pulumi.get(self, "brokers")
@brokers.setter
def brokers(self, value: pulumi.Input[str]):
pulumi.set(self, "brokers", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Kafka logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def topic(self) -> pulumi.Input[str]:
"""
The Kafka topic to send logs to
"""
return pulumi.get(self, "topic")
@topic.setter
def topic(self, value: pulumi.Input[str]):
pulumi.set(self, "topic", value)
@property
@pulumi.getter(name="authMethod")
def auth_method(self) -> Optional[pulumi.Input[str]]:
"""
SASL authentication method. One of: `plain`, `scram-sha-256`, `scram-sha-512`
"""
return pulumi.get(self, "auth_method")
@auth_method.setter
def auth_method(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "auth_method", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. One of: `gzip`, `snappy`, `lz4`
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter(name="parseLogKeyvals")
def parse_log_keyvals(self) -> Optional[pulumi.Input[bool]]:
"""
Enables parsing of key=value tuples from the beginning of a logline, turning them into record headers
"""
return pulumi.get(self, "parse_log_keyvals")
@parse_log_keyvals.setter
def parse_log_keyvals(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "parse_log_keyvals", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
SASL Pass
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter(name="requestMaxBytes")
def request_max_bytes(self) -> Optional[pulumi.Input[int]]:
"""
Maximum size of log batch, if non-zero. Defaults to 0 for unbounded
"""
return pulumi.get(self, "request_max_bytes")
@request_max_bytes.setter
def request_max_bytes(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_bytes", value)
@property
@pulumi.getter(name="requiredAcks")
def required_acks(self) -> Optional[pulumi.Input[str]]:
"""
The number of acknowledgements a leader must receive before a write is considered successful. One of: `1` (default) One server needs to respond. `0` No servers need to respond. `-1` Wait for all in-sync replicas to respond
"""
return pulumi.get(self, "required_acks")
@required_acks.setter
def required_acks(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "required_acks", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
The hostname used to verify the server's certificate. It can either be the Common Name or a Subject Alternative Name (SAN)
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@property
@pulumi.getter(name="useTls")
def use_tls(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to use TLS for secure logging. Can be either `true` or `false`
"""
return pulumi.get(self, "use_tls")
@use_tls.setter
def use_tls(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "use_tls", value)
@property
@pulumi.getter
def user(self) -> Optional[pulumi.Input[str]]:
"""
SASL User
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user", value)
@pulumi.input_type
class ServiceComputeLoggingKineseArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
topic: pulumi.Input[str],
access_key: Optional[pulumi.Input[str]] = None,
iam_role: Optional[pulumi.Input[str]] = None,
region: Optional[pulumi.Input[str]] = None,
secret_key: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Kinesis logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] topic: The Kinesis stream name
:param pulumi.Input[str] access_key: The AWS access key to be used to write to the stream
:param pulumi.Input[str] iam_role: The Amazon Resource Name (ARN) for the IAM role granting Fastly access to Kinesis. Not required if `access_key` and `secret_key` are provided.
:param pulumi.Input[str] region: The AWS region the stream resides in. (Default: `us-east-1`)
:param pulumi.Input[str] secret_key: The AWS secret access key to authenticate with
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "topic", topic)
if access_key is not None:
pulumi.set(__self__, "access_key", access_key)
if iam_role is not None:
pulumi.set(__self__, "iam_role", iam_role)
if region is not None:
pulumi.set(__self__, "region", region)
if secret_key is not None:
pulumi.set(__self__, "secret_key", secret_key)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Kinesis logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def topic(self) -> pulumi.Input[str]:
"""
The Kinesis stream name
"""
return pulumi.get(self, "topic")
@topic.setter
def topic(self, value: pulumi.Input[str]):
pulumi.set(self, "topic", value)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> Optional[pulumi.Input[str]]:
"""
The AWS access key to be used to write to the stream
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter(name="iamRole")
def iam_role(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon Resource Name (ARN) for the IAM role granting Fastly access to Kinesis. Not required if `access_key` and `secret_key` are provided.
"""
return pulumi.get(self, "iam_role")
@iam_role.setter
def iam_role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "iam_role", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The AWS region the stream resides in. (Default: `us-east-1`)
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> Optional[pulumi.Input[str]]:
"""
The AWS secret access key to authenticate with
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secret_key", value)
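The Kinesis docstrings above describe two credential styles: an `iam_role` ARN, or an `access_key` plus `secret_key` pair (the role is "not required if `access_key` and `secret_key` are provided"). A minimal sketch of that rule, assuming the documented behaviour; the helper name is illustrative and not part of this SDK:

```python
from typing import Optional


def kinesis_credentials_complete(access_key: Optional[str],
                                 secret_key: Optional[str],
                                 iam_role: Optional[str]) -> bool:
    """Return True when at least one complete credential style is supplied."""
    has_key_pair = access_key is not None and secret_key is not None
    has_role = iam_role is not None
    # Either a full key pair or a role ARN satisfies the documented requirement.
    return has_key_pair or has_role
```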
@pulumi.input_type
class ServiceComputeLoggingLogglyArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str]):
"""
:param pulumi.Input[str] name: The unique name of the Loggly logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The token to use for authentication (https://www.loggly.com/docs/customer-token-authentication-token/).
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Loggly logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The token to use for authentication (https://www.loggly.com/docs/customer-token-authentication-token/).
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@pulumi.input_type
class ServiceComputeLoggingLogshuttleArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
url: pulumi.Input[str]):
"""
:param pulumi.Input[str] name: The unique name of the Log Shuttle logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The data authentication token associated with this endpoint
:param pulumi.Input[str] url: Your Log Shuttle endpoint URL
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
pulumi.set(__self__, "url", url)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Log Shuttle logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The data authentication token associated with this endpoint
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
Your Log Shuttle endpoint URL
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@pulumi.input_type
class ServiceComputeLoggingNewrelicArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
region: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the New Relic logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The Insert API key from the Account page of your New Relic account
:param pulumi.Input[str] region: The region that log data will be sent to. Default: `US`
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if region is not None:
pulumi.set(__self__, "region", region)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the New Relic logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The Insert API key from the Account page of your New Relic account
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The region that log data will be sent to. Default: `US`
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@pulumi.input_type
class ServiceComputeLoggingOpenstackArgs:
def __init__(__self__, *,
access_key: pulumi.Input[str],
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
url: pulumi.Input[str],
user: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] access_key: Your OpenStack account access key
:param pulumi.Input[str] bucket_name: The name of your OpenStack container
:param pulumi.Input[str] name: The unique name of the OpenStack logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] url: Your OpenStack auth URL
:param pulumi.Input[str] user: The username for your OpenStack account
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[int] gzip_level: What level of Gzip encoding to have when dumping logs (default `0`, no compression)
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. [Fastly Documentation](https://developer.fastly.com/reference/api/logging/gcs/)
:param pulumi.Input[str] path: Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
:param pulumi.Input[int] period: How frequently the logs should be transferred, in seconds. Default `3600`
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] timestamp_format: The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "access_key", access_key)
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "url", url)
pulumi.set(__self__, "user", user)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> pulumi.Input[str]:
"""
Your OpenStack account access key
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: pulumi.Input[str]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of your OpenStack container
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the OpenStack logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
Your OpenStack auth URL
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
The username for your OpenStack account
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
What level of Gzip encoding to have when dumping logs (default `0`, no compression)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. [Fastly Documentation](https://developer.fastly.com/reference/api/logging/gcs/)
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred, in seconds. Default `3600`
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
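The OpenStack endpoint (like the SFTP and S3 ones below) documents a mutual exclusion: pass either `compression_codec` (`zstd`, `snappy`, or `gzip`) or an explicit `gzip_level`, never both in one request. A small sketch of that check, assuming the documented behaviour; the helper is illustrative only:

```python
from typing import Optional

VALID_CODECS = ("zstd", "snappy", "gzip")


def compression_args_valid(compression_codec: Optional[str],
                           gzip_level: Optional[int]) -> bool:
    """Mirror the documented rule: a known codec, or a level, but not both."""
    if compression_codec is not None and compression_codec not in VALID_CODECS:
        return False
    # Specifying both in the same API request results in an error.
    return compression_codec is None or gzip_level is None
```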
@pulumi.input_type
class ServiceComputeLoggingScalyrArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
region: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Scalyr logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The token to use for authentication (https://www.scalyr.com/keys)
:param pulumi.Input[str] region: The region that log data will be sent to. One of `US` or `EU`. Defaults to `US` if undefined
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if region is not None:
pulumi.set(__self__, "region", region)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Scalyr logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The token to use for authentication (https://www.scalyr.com/keys)
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The region that log data will be sent to. One of `US` or `EU`. Defaults to `US` if undefined
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@pulumi.input_type
class ServiceComputeLoggingSftpArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
path: pulumi.Input[str],
ssh_known_hosts: pulumi.Input[str],
user: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
port: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
secret_key: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] address: The SFTP address to stream logs to
:param pulumi.Input[str] name: The unique name of the SFTP logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] path: The path to upload log files to. If the path ends in `/` then it is treated as a directory
:param pulumi.Input[str] ssh_known_hosts: A list of host keys for all hosts we can connect to over SFTP
:param pulumi.Input[str] user: The username for the server
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[int] gzip_level: What level of Gzip encoding to have when dumping logs (default `0`, no compression)
:param pulumi.Input[str] message_type: How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
:param pulumi.Input[str] password: The password for the server. If both `password` and `secret_key` are passed, `secret_key` will be preferred
:param pulumi.Input[int] period: How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
:param pulumi.Input[int] port: The port the SFTP service listens on. (Default: `22`)
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] secret_key: The SSH private key for the server. If both `password` and `secret_key` are passed, `secret_key` will be preferred
:param pulumi.Input[str] timestamp_format: The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "path", path)
pulumi.set(__self__, "ssh_known_hosts", ssh_known_hosts)
pulumi.set(__self__, "user", user)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if password is not None:
pulumi.set(__self__, "password", password)
if period is not None:
pulumi.set(__self__, "period", period)
if port is not None:
pulumi.set(__self__, "port", port)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if secret_key is not None:
pulumi.set(__self__, "secret_key", secret_key)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
The SFTP address to stream logs to
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the SFTP logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def path(self) -> pulumi.Input[str]:
"""
The path to upload log files to. If the path ends in `/` then it is treated as a directory
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: pulumi.Input[str]):
pulumi.set(self, "path", value)
@property
@pulumi.getter(name="sshKnownHosts")
def ssh_known_hosts(self) -> pulumi.Input[str]:
"""
A list of host keys for all hosts we can connect to over SFTP
"""
return pulumi.get(self, "ssh_known_hosts")
@ssh_known_hosts.setter
def ssh_known_hosts(self, value: pulumi.Input[str]):
pulumi.set(self, "ssh_known_hosts", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
The username for the server
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
What level of Gzip encoding to have when dumping logs (default `0`, no compression)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
The password for the server. If both `password` and `secret_key` are passed, `secret_key` will be preferred
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port the SFTP service listens on. (Default: `22`)
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> Optional[pulumi.Input[str]]:
"""
The SSH private key for the server. If both `password` and `secret_key` are passed, `secret_key` will be preferred
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
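The default `timestamp_format` shown above is a `strftime` pattern in which the trailing `.000` is literal text rather than a sub-second directive, so every rendered timestamp ends in `.000`. A quick standard-library check:

```python
from datetime import datetime

# Render the documented default pattern; ".000" passes through unchanged.
stamp = datetime(2023, 1, 2, 3, 4, 5).strftime("%Y-%m-%dT%H:%M:%S.000")
# stamp == "2023-01-02T03:04:05.000"
```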
@pulumi.input_type
class ServiceComputePackageArgs:
def __init__(__self__, *,
filename: pulumi.Input[str],
source_code_hash: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] filename: The path to the Wasm deployment package within your local filesystem
:param pulumi.Input[str] source_code_hash: Used to trigger updates. Must be set to a SHA512 hash of the package file specified with the filename.
"""
pulumi.set(__self__, "filename", filename)
if source_code_hash is not None:
pulumi.set(__self__, "source_code_hash", source_code_hash)
@property
@pulumi.getter
def filename(self) -> pulumi.Input[str]:
"""
The path to the Wasm deployment package within your local filesystem
"""
return pulumi.get(self, "filename")
@filename.setter
def filename(self, value: pulumi.Input[str]):
pulumi.set(self, "filename", value)
@property
@pulumi.getter(name="sourceCodeHash")
def source_code_hash(self) -> Optional[pulumi.Input[str]]:
"""
Used to trigger updates. Must be set to a SHA512 hash of the package file specified with the filename.
"""
return pulumi.get(self, "source_code_hash")
@source_code_hash.setter
def source_code_hash(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_code_hash", value)
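`source_code_hash` must be a SHA512 hash of the package named by `filename`. A sketch of computing one, assuming a hex digest is expected (mirroring Terraform's `filesha512` convention); the streaming read keeps memory flat for large Wasm packages:

```python
import hashlib


def package_sha512(path: str) -> str:
    """Return the hex SHA512 digest of the package file at `path`."""
    digest = hashlib.sha512()
    with open(path, "rb") as fh:
        # Read in 64 KiB chunks so large packages are not loaded whole.
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()
```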
@pulumi.input_type
class ServiceComputePapertrailArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
port: pulumi.Input[int]):
"""
:param pulumi.Input[str] address: The address of the Papertrail endpoint
:param pulumi.Input[str] name: A unique name to identify this Papertrail endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[int] port: The port associated with the address where the Papertrail endpoint can be accessed
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "port", port)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
The address of the Papertrail endpoint
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Papertrail endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def port(self) -> pulumi.Input[int]:
"""
The port associated with the address where the Papertrail endpoint can be accessed
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: pulumi.Input[int]):
pulumi.set(self, "port", value)
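Papertrail destinations are typically handed out as a single `host:port` string, while the class above takes `address` and `port` separately. A trivial split (the hostname and port here are made-up placeholders):

```python
# Split a "host:port" destination into the two arguments used above.
destination = "logs2.papertrailapp.com:12345"  # hypothetical destination
address, _, port_str = destination.rpartition(":")
port = int(port_str)
```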
@pulumi.input_type
class ServiceComputeS3loggingArgs:
def __init__(__self__, *,
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
acl: Optional[pulumi.Input[str]] = None,
compression_codec: Optional[pulumi.Input[str]] = None,
domain: Optional[pulumi.Input[str]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
redundancy: Optional[pulumi.Input[str]] = None,
s3_access_key: Optional[pulumi.Input[str]] = None,
s3_iam_role: Optional[pulumi.Input[str]] = None,
s3_secret_key: Optional[pulumi.Input[str]] = None,
server_side_encryption: Optional[pulumi.Input[str]] = None,
server_side_encryption_kms_key_id: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] bucket_name: The name of the bucket in which to store the logs
:param pulumi.Input[str] name: The unique name of the S3 logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] acl: The AWS [Canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl) to use for objects uploaded to the S3 bucket. Options are: `private`, `public-read`, `public-read-write`, `aws-exec-read`, `authenticated-read`, `bucket-owner-read`, `bucket-owner-full-control`
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] domain: If you created the S3 bucket outside of `us-east-1`, then specify the corresponding bucket endpoint. Example: `s3-us-west-2.amazonaws.com`
:param pulumi.Input[int] gzip_level: Level of Gzip compression, from `0-9`. `0` is no compression. `1` is fastest and least compressed, `9` is slowest and most compressed. Default `0`
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`
:param pulumi.Input[str] path: Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
:param pulumi.Input[int] period: How frequently the logs should be transferred, in seconds. Default `3600`
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] redundancy: The S3 storage class (redundancy level). Should be one of: `standard`, `reduced_redundancy`, `standard_ia`, or `onezone_ia`
:param pulumi.Input[str] s3_access_key: AWS Access Key of an account with the required permissions to post logs. It is **strongly** recommended you create a separate IAM user with permissions to only operate on this Bucket. This key will not be encrypted. Not required if `iam_role` is provided. You can provide this key via an environment variable, `FASTLY_S3_ACCESS_KEY`
:param pulumi.Input[str] s3_iam_role: The Amazon Resource Name (ARN) for the IAM role granting Fastly access to S3. Not required if `access_key` and `secret_key` are provided. You can provide this value via an environment variable, `FASTLY_S3_IAM_ROLE`
:param pulumi.Input[str] s3_secret_key: AWS Secret Key of an account with the required permissions to post logs. It is **strongly** recommended you create a separate IAM user with permissions to only operate on this Bucket. This secret will not be encrypted. Not required if `iam_role` is provided. You can provide this secret via an environment variable, `FASTLY_S3_SECRET_KEY`
:param pulumi.Input[str] server_side_encryption: Specify what type of server side encryption should be used. Can be either `AES256` or `aws:kms`
:param pulumi.Input[str] server_side_encryption_kms_key_id: Optional server-side KMS Key Id. Must be set if `server_side_encryption` is set to `aws:kms`
:param pulumi.Input[str] timestamp_format: `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
if acl is not None:
pulumi.set(__self__, "acl", acl)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if domain is not None:
pulumi.set(__self__, "domain", domain)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if redundancy is not None:
pulumi.set(__self__, "redundancy", redundancy)
if s3_access_key is not None:
pulumi.set(__self__, "s3_access_key", s3_access_key)
if s3_iam_role is not None:
pulumi.set(__self__, "s3_iam_role", s3_iam_role)
if s3_secret_key is not None:
pulumi.set(__self__, "s3_secret_key", s3_secret_key)
if server_side_encryption is not None:
pulumi.set(__self__, "server_side_encryption", server_side_encryption)
if server_side_encryption_kms_key_id is not None:
pulumi.set(__self__, "server_side_encryption_kms_key_id", server_side_encryption_kms_key_id)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of the bucket in which to store the logs
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the S3 logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def acl(self) -> Optional[pulumi.Input[str]]:
"""
The AWS [Canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl) to use for objects uploaded to the S3 bucket. Options are: `private`, `public-read`, `public-read-write`, `aws-exec-read`, `authenticated-read`, `bucket-owner-read`, `bucket-owner-full-control`
"""
return pulumi.get(self, "acl")
@acl.setter
def acl(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "acl", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def domain(self) -> Optional[pulumi.Input[str]]:
"""
If you created the S3 bucket outside of `us-east-1`, then specify the corresponding bucket endpoint. Example: `s3-us-west-2.amazonaws.com`
"""
return pulumi.get(self, "domain")
@domain.setter
def domain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
Level of Gzip compression, from `0-9`. `0` is no compression. `1` is fastest and least compressed, `9` is slowest and most compressed. Default `0`
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred, in seconds. Default `3600`
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter
def redundancy(self) -> Optional[pulumi.Input[str]]:
"""
The S3 storage class (redundancy level). Should be one of: `standard`, `reduced_redundancy`, `standard_ia`, or `onezone_ia`
"""
return pulumi.get(self, "redundancy")
@redundancy.setter
def redundancy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "redundancy", value)
@property
@pulumi.getter(name="s3AccessKey")
def s3_access_key(self) -> Optional[pulumi.Input[str]]:
"""
AWS Access Key of an account with the required permissions to post logs. It is **strongly** recommended you create a separate IAM user with permissions to only operate on this Bucket. This key will not be encrypted. Not required if `iam_role` is provided. You can provide this key via an environment variable, `FASTLY_S3_ACCESS_KEY`
"""
return pulumi.get(self, "s3_access_key")
@s3_access_key.setter
def s3_access_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "s3_access_key", value)
@property
@pulumi.getter(name="s3IamRole")
def s3_iam_role(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon Resource Name (ARN) for the IAM role granting Fastly access to S3. Not required if `access_key` and `secret_key` are provided. You can provide this value via an environment variable, `FASTLY_S3_IAM_ROLE`
"""
return pulumi.get(self, "s3_iam_role")
@s3_iam_role.setter
def s3_iam_role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "s3_iam_role", value)
@property
@pulumi.getter(name="s3SecretKey")
def s3_secret_key(self) -> Optional[pulumi.Input[str]]:
"""
AWS Secret Key of an account with the required permissions to post logs. It is **strongly** recommended you create a separate IAM user with permissions to only operate on this Bucket. This secret will not be encrypted. Not required if `iam_role` is provided. You can provide this secret via an environment variable, `FASTLY_S3_SECRET_KEY`
"""
return pulumi.get(self, "s3_secret_key")
@s3_secret_key.setter
def s3_secret_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "s3_secret_key", value)
@property
@pulumi.getter(name="serverSideEncryption")
def server_side_encryption(self) -> Optional[pulumi.Input[str]]:
"""
Specify what type of server side encryption should be used. Can be either `AES256` or `aws:kms`
"""
return pulumi.get(self, "server_side_encryption")
@server_side_encryption.setter
def server_side_encryption(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "server_side_encryption", value)
@property
@pulumi.getter(name="serverSideEncryptionKmsKeyId")
def server_side_encryption_kms_key_id(self) -> Optional[pulumi.Input[str]]:
"""
Optional server-side KMS Key Id. Must be set if `server_side_encryption` is set to `aws:kms`
"""
return pulumi.get(self, "server_side_encryption_kms_key_id")
@server_side_encryption_kms_key_id.setter
def server_side_encryption_kms_key_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "server_side_encryption_kms_key_id", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
`strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
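# Illustrative sketch (not part of the generated SDK): the default
# `timestamp_format` is a `strftime` pattern, which can be previewed with
# Python's standard library. The function name below is hypothetical.

```python
from datetime import datetime

# Fastly's documented default; the trailing ".000" is a literal
# millisecond suffix, not a strftime directive.
DEFAULT_TIMESTAMP_FORMAT = "%Y-%m-%dT%H:%M:%S.000"

def format_log_timestamp(ts: datetime, fmt: str = DEFAULT_TIMESTAMP_FORMAT) -> str:
    """Render a datetime using the endpoint's timestamp_format pattern."""
    return ts.strftime(fmt)

print(format_log_timestamp(datetime(2023, 5, 1, 12, 30, 45)))
# 2023-05-01T12:30:45.000
```

Previewing the pattern locally like this is a quick way to sanity-check a custom `timestamp_format` before applying it to the logging endpoint.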
@pulumi.input_type
class ServiceComputeSplunkArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
url: pulumi.Input[str],
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: A unique name to identify the Splunk endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The Splunk token to be used for authentication
:param pulumi.Input[str] url: The Splunk URL to stream logs to
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SPLUNK_CA_CERT`
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format.
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format.
:param pulumi.Input[str] tls_hostname: The hostname used to verify the server's certificate. It can either be the Common Name or a Subject Alternative Name (SAN)
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
pulumi.set(__self__, "url", url)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify the Splunk endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The Splunk token to be used for authentication
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The Splunk URL to stream logs to
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SPLUNK_CA_CERT`
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format.
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format.
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
The hostname used to verify the server's certificate. It can either be the Common Name or a Subject Alternative Name (SAN)
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@pulumi.input_type
class ServiceComputeSumologicArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
url: pulumi.Input[str],
message_type: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: A unique name to identify this Sumologic endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] url: The URL to Sumologic collector endpoint
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. See [Fastly's Documentation on Sumologic](https://developer.fastly.com/reference/api/logging/sumologic/)
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "url", url)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Sumologic endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The URL to Sumologic collector endpoint
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. See [Fastly's Documentation on Sumologic](https://developer.fastly.com/reference/api/logging/sumologic/)
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@pulumi.input_type
class ServiceComputeSyslogArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
message_type: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None,
token: Optional[pulumi.Input[str]] = None,
use_tls: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] address: A hostname or IPv4 address of the Syslog endpoint
:param pulumi.Input[str] name: A unique name to identify this Syslog endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`
:param pulumi.Input[int] port: The port associated with the address where the Syslog endpoint can be accessed. Default `514`
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SYSLOG_CA_CERT`
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SYSLOG_CLIENT_CERT`
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format. You can provide this key via an environment variable, `FASTLY_SYSLOG_CLIENT_KEY`
:param pulumi.Input[str] tls_hostname: Used during the TLS handshake to validate the certificate
:param pulumi.Input[str] token: A token to prepend to each message, if set
:param pulumi.Input[bool] use_tls: Whether to use TLS for secure logging. Default `false`
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if port is not None:
pulumi.set(__self__, "port", port)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
if token is not None:
pulumi.set(__self__, "token", token)
if use_tls is not None:
pulumi.set(__self__, "use_tls", use_tls)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
A hostname or IPv4 address of the Syslog endpoint
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Syslog endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port associated with the address where the Syslog endpoint can be accessed. Default `514`
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SYSLOG_CA_CERT`
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SYSLOG_CLIENT_CERT`
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format. You can provide this key via an environment variable, `FASTLY_SYSLOG_CLIENT_KEY`
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Used during the TLS handshake to validate the certificate
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@property
@pulumi.getter
def token(self) -> Optional[pulumi.Input[str]]:
"""
A token to prepend to each message, if set
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "token", value)
@property
@pulumi.getter(name="useTls")
def use_tls(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to use TLS for secure logging. Default `false`
"""
return pulumi.get(self, "use_tls")
@use_tls.setter
def use_tls(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "use_tls", value)
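# Illustrative sketch (not part of the generated SDK): the `token` argument,
# when set, is prepended to each log message. The helper below is
# hypothetical and only demonstrates that effect.

```python
from typing import Optional

def render_syslog_line(message: str, token: Optional[str] = None) -> str:
    """Prepend the optional token to a log line, mirroring `token`'s effect."""
    return f"{token} {message}" if token else message

print(render_syslog_line("GET / 200", token="abc123"))
# abc123 GET / 200
```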
@pulumi.input_type
class ServiceWafConfigurationRuleArgs:
def __init__(__self__, *,
modsec_rule_id: pulumi.Input[int],
status: pulumi.Input[str],
revision: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[int] modsec_rule_id: The Web Application Firewall rule's modsecurity ID
:param pulumi.Input[str] status: The Web Application Firewall rule's status. Allowed values are `log`, `block`, and `score`
:param pulumi.Input[int] revision: The Web Application Firewall rule's revision. The latest revision will be used if this is not provided
"""
pulumi.set(__self__, "modsec_rule_id", modsec_rule_id)
pulumi.set(__self__, "status", status)
if revision is not None:
pulumi.set(__self__, "revision", revision)
@property
@pulumi.getter(name="modsecRuleId")
def modsec_rule_id(self) -> pulumi.Input[int]:
"""
The Web Application Firewall rule's modsecurity ID
"""
return pulumi.get(self, "modsec_rule_id")
@modsec_rule_id.setter
def modsec_rule_id(self, value: pulumi.Input[int]):
pulumi.set(self, "modsec_rule_id", value)
@property
@pulumi.getter
def status(self) -> pulumi.Input[str]:
"""
The Web Application Firewall rule's status. Allowed values are `log`, `block`, and `score`
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: pulumi.Input[str]):
pulumi.set(self, "status", value)
@property
@pulumi.getter
def revision(self) -> Optional[pulumi.Input[int]]:
"""
The Web Application Firewall rule's revision. The latest revision will be used if this is not provided
"""
return pulumi.get(self, "revision")
@revision.setter
def revision(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "revision", value)
@pulumi.input_type
class ServiceWafConfigurationRuleExclusionArgs:
def __init__(__self__, *,
condition: pulumi.Input[str],
exclusion_type: pulumi.Input[str],
name: pulumi.Input[str],
modsec_rule_ids: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]] = None,
number: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] condition: A conditional expression in VCL used to determine if the condition is met
:param pulumi.Input[str] exclusion_type: The type of rule exclusion. Values are `rule` to exclude the specified rule(s), or `waf` to disable the Web Application Firewall
:param pulumi.Input[str] name: The name of the rule exclusion
:param pulumi.Input[Sequence[pulumi.Input[int]]] modsec_rule_ids: Set of modsecurity IDs to be excluded. No rules should be provided when `exclusion_type` is `waf`. The rules need to be configured on the Web Application Firewall to be excluded
:param pulumi.Input[int] number: The numeric ID assigned to the WAF Rule Exclusion
"""
pulumi.set(__self__, "condition", condition)
pulumi.set(__self__, "exclusion_type", exclusion_type)
pulumi.set(__self__, "name", name)
if modsec_rule_ids is not None:
pulumi.set(__self__, "modsec_rule_ids", modsec_rule_ids)
if number is not None:
pulumi.set(__self__, "number", number)
@property
@pulumi.getter
def condition(self) -> pulumi.Input[str]:
"""
A conditional expression in VCL used to determine if the condition is met
"""
return pulumi.get(self, "condition")
@condition.setter
def condition(self, value: pulumi.Input[str]):
pulumi.set(self, "condition", value)
@property
@pulumi.getter(name="exclusionType")
def exclusion_type(self) -> pulumi.Input[str]:
"""
The type of rule exclusion. Values are `rule` to exclude the specified rule(s), or `waf` to disable the Web Application Firewall
"""
return pulumi.get(self, "exclusion_type")
@exclusion_type.setter
def exclusion_type(self, value: pulumi.Input[str]):
pulumi.set(self, "exclusion_type", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The name of the rule exclusion
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="modsecRuleIds")
def modsec_rule_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]:
"""
Set of modsecurity IDs to be excluded. No rules should be provided when `exclusion_type` is `waf`. The rules need to be configured on the Web Application Firewall to be excluded
"""
return pulumi.get(self, "modsec_rule_ids")
@modsec_rule_ids.setter
def modsec_rule_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[int]]]]):
pulumi.set(self, "modsec_rule_ids", value)
@property
@pulumi.getter
def number(self) -> Optional[pulumi.Input[int]]:
"""
The numeric ID assigned to the WAF Rule Exclusion
"""
return pulumi.get(self, "number")
@number.setter
def number(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "number", value)
@pulumi.input_type
class Servicev1AclArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
acl_id: Optional[pulumi.Input[str]] = None,
force_destroy: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] name: A unique name to identify this ACL. It is important to note that changing this attribute will delete and recreate the ACL, and discard the current items in the ACL
:param pulumi.Input[str] acl_id: The ID of the ACL
:param pulumi.Input[bool] force_destroy: Allow the ACL to be deleted, even if it contains entries. Defaults to false.
"""
pulumi.set(__self__, "name", name)
if acl_id is not None:
pulumi.set(__self__, "acl_id", acl_id)
if force_destroy is not None:
pulumi.set(__self__, "force_destroy", force_destroy)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this ACL. It is important to note that changing this attribute will delete and recreate the ACL, and discard the current items in the ACL
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="aclId")
def acl_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the ACL
"""
return pulumi.get(self, "acl_id")
@acl_id.setter
def acl_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "acl_id", value)
@property
@pulumi.getter(name="forceDestroy")
def force_destroy(self) -> Optional[pulumi.Input[bool]]:
"""
Allow the ACL to be deleted, even if it contains entries. Defaults to false.
"""
return pulumi.get(self, "force_destroy")
@force_destroy.setter
def force_destroy(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_destroy", value)
@pulumi.input_type
class Servicev1BackendArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
auto_loadbalance: Optional[pulumi.Input[bool]] = None,
between_bytes_timeout: Optional[pulumi.Input[int]] = None,
connect_timeout: Optional[pulumi.Input[int]] = None,
error_threshold: Optional[pulumi.Input[int]] = None,
first_byte_timeout: Optional[pulumi.Input[int]] = None,
healthcheck: Optional[pulumi.Input[str]] = None,
max_conn: Optional[pulumi.Input[int]] = None,
max_tls_version: Optional[pulumi.Input[str]] = None,
min_tls_version: Optional[pulumi.Input[str]] = None,
override_host: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
request_condition: Optional[pulumi.Input[str]] = None,
shield: Optional[pulumi.Input[str]] = None,
ssl_ca_cert: Optional[pulumi.Input[str]] = None,
ssl_cert_hostname: Optional[pulumi.Input[str]] = None,
ssl_check_cert: Optional[pulumi.Input[bool]] = None,
ssl_ciphers: Optional[pulumi.Input[str]] = None,
ssl_client_cert: Optional[pulumi.Input[str]] = None,
ssl_client_key: Optional[pulumi.Input[str]] = None,
ssl_hostname: Optional[pulumi.Input[str]] = None,
ssl_sni_hostname: Optional[pulumi.Input[str]] = None,
use_ssl: Optional[pulumi.Input[bool]] = None,
weight: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] address: An IPv4 address, an IPv6 address, or a hostname for the Backend
:param pulumi.Input[str] name: Name for this Backend. Must be unique to this Service. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[bool] auto_loadbalance: Denotes if this Backend should be included in the pool of backends that requests are load balanced against. Default `true`
:param pulumi.Input[int] between_bytes_timeout: How long to wait between bytes in milliseconds. Default `10000`
:param pulumi.Input[int] connect_timeout: How long to wait for a timeout in milliseconds. Default `1000`
:param pulumi.Input[int] error_threshold: Number of errors to allow before the Backend is marked as down. Default `0`
:param pulumi.Input[int] first_byte_timeout: How long to wait for the first bytes in milliseconds. Default `15000`
:param pulumi.Input[str] healthcheck: Name of a defined `healthcheck` to assign to this backend
:param pulumi.Input[int] max_conn: Maximum number of connections for this Backend. Default `200`
:param pulumi.Input[str] max_tls_version: Maximum allowed TLS version on SSL connections to this backend.
:param pulumi.Input[str] min_tls_version: Minimum allowed TLS version on SSL connections to this backend.
:param pulumi.Input[str] override_host: The hostname to override the Host header
:param pulumi.Input[int] port: The port number on which the Backend responds. Default `80`
:param pulumi.Input[str] request_condition: Name of a condition, which if met, will select this backend during a request.
:param pulumi.Input[str] shield: The POP of the shield designated to reduce inbound load. Valid values for `shield` are included in the `GET /datacenters` API response
:param pulumi.Input[str] ssl_ca_cert: CA certificate attached to origin.
:param pulumi.Input[str] ssl_cert_hostname: Overrides ssl_hostname, but only for cert verification. Does not affect SNI at all
:param pulumi.Input[bool] ssl_check_cert: Be strict about checking SSL certs. Default `true`
:param pulumi.Input[str] ssl_ciphers: Cipher list consisting of one or more cipher strings separated by colons. Commas or spaces are also acceptable separators but colons are normally used.
:param pulumi.Input[str] ssl_client_cert: Client certificate attached to origin. Used when connecting to the backend
:param pulumi.Input[str] ssl_client_key: Client key attached to origin. Used when connecting to the backend
:param pulumi.Input[str] ssl_hostname: Used for both SNI during the TLS handshake and to validate the cert
:param pulumi.Input[str] ssl_sni_hostname: Overrides ssl_hostname, but only for SNI in the handshake. Does not affect cert validation at all
:param pulumi.Input[bool] use_ssl: Whether or not to use SSL to reach the Backend. Default `false`
:param pulumi.Input[int] weight: The [portion of traffic](https://docs.fastly.com/en/guides/load-balancing-configuration#how-weight-affects-load-balancing) to send to this Backend. Each Backend receives weight / total of the traffic. Default `100`
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
if auto_loadbalance is not None:
pulumi.set(__self__, "auto_loadbalance", auto_loadbalance)
if between_bytes_timeout is not None:
pulumi.set(__self__, "between_bytes_timeout", between_bytes_timeout)
if connect_timeout is not None:
pulumi.set(__self__, "connect_timeout", connect_timeout)
if error_threshold is not None:
pulumi.set(__self__, "error_threshold", error_threshold)
if first_byte_timeout is not None:
pulumi.set(__self__, "first_byte_timeout", first_byte_timeout)
if healthcheck is not None:
pulumi.set(__self__, "healthcheck", healthcheck)
if max_conn is not None:
pulumi.set(__self__, "max_conn", max_conn)
if max_tls_version is not None:
pulumi.set(__self__, "max_tls_version", max_tls_version)
if min_tls_version is not None:
pulumi.set(__self__, "min_tls_version", min_tls_version)
if override_host is not None:
pulumi.set(__self__, "override_host", override_host)
if port is not None:
pulumi.set(__self__, "port", port)
if request_condition is not None:
pulumi.set(__self__, "request_condition", request_condition)
if shield is not None:
pulumi.set(__self__, "shield", shield)
if ssl_ca_cert is not None:
pulumi.set(__self__, "ssl_ca_cert", ssl_ca_cert)
if ssl_cert_hostname is not None:
pulumi.set(__self__, "ssl_cert_hostname", ssl_cert_hostname)
if ssl_check_cert is not None:
pulumi.set(__self__, "ssl_check_cert", ssl_check_cert)
if ssl_ciphers is not None:
pulumi.set(__self__, "ssl_ciphers", ssl_ciphers)
if ssl_client_cert is not None:
pulumi.set(__self__, "ssl_client_cert", ssl_client_cert)
if ssl_client_key is not None:
pulumi.set(__self__, "ssl_client_key", ssl_client_key)
        if ssl_hostname is not None:
            warnings.warn("""Use ssl_cert_hostname and ssl_sni_hostname instead.""", DeprecationWarning)
            pulumi.log.warn("""ssl_hostname is deprecated: Use ssl_cert_hostname and ssl_sni_hostname instead.""")
            pulumi.set(__self__, "ssl_hostname", ssl_hostname)
if ssl_sni_hostname is not None:
pulumi.set(__self__, "ssl_sni_hostname", ssl_sni_hostname)
if use_ssl is not None:
pulumi.set(__self__, "use_ssl", use_ssl)
if weight is not None:
pulumi.set(__self__, "weight", weight)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
An IPv4, hostname, or IPv6 address for the Backend
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Name for this Backend. Must be unique to this Service. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="autoLoadbalance")
def auto_loadbalance(self) -> Optional[pulumi.Input[bool]]:
"""
Denotes if this Backend should be included in the pool of backends that requests are load balanced against. Default `true`
"""
return pulumi.get(self, "auto_loadbalance")
@auto_loadbalance.setter
def auto_loadbalance(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_loadbalance", value)
@property
@pulumi.getter(name="betweenBytesTimeout")
def between_bytes_timeout(self) -> Optional[pulumi.Input[int]]:
"""
How long to wait between bytes in milliseconds. Default `10000`
"""
return pulumi.get(self, "between_bytes_timeout")
@between_bytes_timeout.setter
def between_bytes_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "between_bytes_timeout", value)
@property
@pulumi.getter(name="connectTimeout")
def connect_timeout(self) -> Optional[pulumi.Input[int]]:
"""
How long to wait for a timeout in milliseconds. Default `1000`
"""
return pulumi.get(self, "connect_timeout")
@connect_timeout.setter
def connect_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "connect_timeout", value)
@property
@pulumi.getter(name="errorThreshold")
def error_threshold(self) -> Optional[pulumi.Input[int]]:
"""
Number of errors to allow before the Backend is marked as down. Default `0`
"""
return pulumi.get(self, "error_threshold")
@error_threshold.setter
def error_threshold(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "error_threshold", value)
@property
@pulumi.getter(name="firstByteTimeout")
def first_byte_timeout(self) -> Optional[pulumi.Input[int]]:
"""
How long to wait for the first bytes in milliseconds. Default `15000`
"""
return pulumi.get(self, "first_byte_timeout")
@first_byte_timeout.setter
def first_byte_timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "first_byte_timeout", value)
@property
@pulumi.getter
def healthcheck(self) -> Optional[pulumi.Input[str]]:
"""
Name of a defined `healthcheck` to assign to this backend
"""
return pulumi.get(self, "healthcheck")
@healthcheck.setter
def healthcheck(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "healthcheck", value)
@property
@pulumi.getter(name="maxConn")
def max_conn(self) -> Optional[pulumi.Input[int]]:
"""
Maximum number of connections for this Backend. Default `200`
"""
return pulumi.get(self, "max_conn")
@max_conn.setter
def max_conn(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_conn", value)
@property
@pulumi.getter(name="maxTlsVersion")
def max_tls_version(self) -> Optional[pulumi.Input[str]]:
"""
Maximum allowed TLS version on SSL connections to this backend.
"""
return pulumi.get(self, "max_tls_version")
@max_tls_version.setter
def max_tls_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "max_tls_version", value)
@property
@pulumi.getter(name="minTlsVersion")
def min_tls_version(self) -> Optional[pulumi.Input[str]]:
"""
Minimum allowed TLS version on SSL connections to this backend.
"""
return pulumi.get(self, "min_tls_version")
@min_tls_version.setter
def min_tls_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "min_tls_version", value)
@property
@pulumi.getter(name="overrideHost")
def override_host(self) -> Optional[pulumi.Input[str]]:
"""
The hostname to override the Host header
"""
return pulumi.get(self, "override_host")
@override_host.setter
def override_host(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "override_host", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port number on which the Backend responds. Default `80`
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="requestCondition")
def request_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of a condition, which if met, will select this backend during a request.
"""
return pulumi.get(self, "request_condition")
@request_condition.setter
def request_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "request_condition", value)
@property
@pulumi.getter
def shield(self) -> Optional[pulumi.Input[str]]:
"""
The POP of the shield designated to reduce inbound load. Valid values for `shield` are included in the `GET /datacenters` API response
"""
return pulumi.get(self, "shield")
@shield.setter
def shield(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "shield", value)
@property
@pulumi.getter(name="sslCaCert")
def ssl_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
CA certificate attached to origin.
"""
return pulumi.get(self, "ssl_ca_cert")
@ssl_ca_cert.setter
def ssl_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_ca_cert", value)
@property
@pulumi.getter(name="sslCertHostname")
def ssl_cert_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Overrides ssl_hostname, but only for cert verification. Does not affect SNI at all
"""
return pulumi.get(self, "ssl_cert_hostname")
@ssl_cert_hostname.setter
def ssl_cert_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_cert_hostname", value)
@property
@pulumi.getter(name="sslCheckCert")
def ssl_check_cert(self) -> Optional[pulumi.Input[bool]]:
"""
Be strict about checking SSL certs. Default `true`
"""
return pulumi.get(self, "ssl_check_cert")
@ssl_check_cert.setter
def ssl_check_cert(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "ssl_check_cert", value)
@property
@pulumi.getter(name="sslCiphers")
def ssl_ciphers(self) -> Optional[pulumi.Input[str]]:
"""
Cipher list consisting of one or more cipher strings separated by colons. Commas or spaces are also acceptable separators but colons are normally used.
"""
return pulumi.get(self, "ssl_ciphers")
@ssl_ciphers.setter
def ssl_ciphers(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_ciphers", value)
@property
@pulumi.getter(name="sslClientCert")
def ssl_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
Client certificate attached to origin. Used when connecting to the backend
"""
return pulumi.get(self, "ssl_client_cert")
@ssl_client_cert.setter
def ssl_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_client_cert", value)
@property
@pulumi.getter(name="sslClientKey")
def ssl_client_key(self) -> Optional[pulumi.Input[str]]:
"""
Client key attached to origin. Used when connecting to the backend
"""
return pulumi.get(self, "ssl_client_key")
@ssl_client_key.setter
def ssl_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_client_key", value)
@property
@pulumi.getter(name="sslHostname")
def ssl_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Used for both SNI during the TLS handshake and to validate the cert
"""
return pulumi.get(self, "ssl_hostname")
@ssl_hostname.setter
def ssl_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_hostname", value)
@property
@pulumi.getter(name="sslSniHostname")
def ssl_sni_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Overrides ssl_hostname, but only for SNI in the handshake. Does not affect cert validation at all
"""
return pulumi.get(self, "ssl_sni_hostname")
@ssl_sni_hostname.setter
def ssl_sni_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ssl_sni_hostname", value)
@property
@pulumi.getter(name="useSsl")
def use_ssl(self) -> Optional[pulumi.Input[bool]]:
"""
Whether or not to use SSL to reach the Backend. Default `false`
"""
return pulumi.get(self, "use_ssl")
@use_ssl.setter
def use_ssl(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "use_ssl", value)
@property
@pulumi.getter
def weight(self) -> Optional[pulumi.Input[int]]:
"""
The [portion of traffic](https://docs.fastly.com/en/guides/load-balancing-configuration#how-weight-affects-load-balancing) to send to this Backend. Each Backend receives weight / total of the traffic. Default `100`
"""
return pulumi.get(self, "weight")
@weight.setter
def weight(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "weight", value)
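# Illustrative sketch (not part of the generated SDK): the keyword arguments
# one might pass when constructing Servicev1BackendArgs for an HTTPS origin.
# Only `address` and `name` are required; the remaining keys override the
# documented defaults. All hostnames and values below are hypothetical.

```python
example_backend_kwargs = {
    "address": "origin.example.com",            # required: IPv4, hostname, or IPv6
    "name": "primary-origin",                   # required: unique within the service
    "port": 443,                                # default is 80
    "use_ssl": True,                            # default is False
    "ssl_cert_hostname": "origin.example.com",  # used for cert verification only
    "ssl_sni_hostname": "origin.example.com",   # used for SNI only
    "weight": 100,                              # default traffic weight
}
```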
@pulumi.input_type
class Servicev1BigqueryloggingArgs:
def __init__(__self__, *,
dataset: pulumi.Input[str],
email: pulumi.Input[str],
name: pulumi.Input[str],
project_id: pulumi.Input[str],
secret_key: pulumi.Input[str],
table: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
template: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] dataset: The ID of your BigQuery dataset
:param pulumi.Input[str] email: The email for the service account with write access to your BigQuery dataset. If not provided, this will be pulled from a `FASTLY_BQ_EMAIL` environment variable
:param pulumi.Input[str] name: A unique name to identify this BigQuery logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] project_id: The ID of your GCP project
:param pulumi.Input[str] secret_key: The secret key associated with the service account that has write access to your BigQuery table. If not provided, this will be pulled from the `FASTLY_BQ_SECRET_KEY` environment variable. Typical format for this is a private key in a string with newlines
:param pulumi.Input[str] table: The ID of your BigQuery table
:param pulumi.Input[str] format: The logging format desired.
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[str] response_condition: Name of a condition to apply this logging.
:param pulumi.Input[str] template: BigQuery table name suffix template
"""
pulumi.set(__self__, "dataset", dataset)
pulumi.set(__self__, "email", email)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "project_id", project_id)
pulumi.set(__self__, "secret_key", secret_key)
pulumi.set(__self__, "table", table)
if format is not None:
pulumi.set(__self__, "format", format)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if template is not None:
pulumi.set(__self__, "template", template)
@property
@pulumi.getter
def dataset(self) -> pulumi.Input[str]:
"""
The ID of your BigQuery dataset
"""
return pulumi.get(self, "dataset")
@dataset.setter
def dataset(self, value: pulumi.Input[str]):
pulumi.set(self, "dataset", value)
@property
@pulumi.getter
def email(self) -> pulumi.Input[str]:
"""
The email for the service account with write access to your BigQuery dataset. If not provided, this will be pulled from a `FASTLY_BQ_EMAIL` environment variable
"""
return pulumi.get(self, "email")
@email.setter
def email(self, value: pulumi.Input[str]):
pulumi.set(self, "email", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this BigQuery logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="projectId")
def project_id(self) -> pulumi.Input[str]:
"""
The ID of your GCP project
"""
return pulumi.get(self, "project_id")
@project_id.setter
def project_id(self, value: pulumi.Input[str]):
pulumi.set(self, "project_id", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> pulumi.Input[str]:
"""
The secret key associated with the service account that has write access to your BigQuery table. If not provided, this will be pulled from the `FASTLY_BQ_SECRET_KEY` environment variable. Typical format for this is a private key in a string with newlines
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: pulumi.Input[str]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter
def table(self) -> pulumi.Input[str]:
"""
The ID of your BigQuery table
"""
return pulumi.get(self, "table")
@table.setter
def table(self, value: pulumi.Input[str]):
pulumi.set(self, "table", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
The logging format desired.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of a condition to apply this logging.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter
def template(self) -> Optional[pulumi.Input[str]]:
"""
BigQuery table name suffix template
"""
return pulumi.get(self, "template")
@template.setter
def template(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "template", value)
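# Hedged sketch (not generated code): assembling Servicev1BigqueryloggingArgs
# keyword arguments, reading credentials from the FASTLY_BQ_EMAIL and
# FASTLY_BQ_SECRET_KEY environment variables named in the docstrings above.
# The dataset, table, and project names are illustrative.

```python
import os

bigquery_logging_kwargs = {
    "name": "bq-logs",                # illustrative endpoint name
    "dataset": "fastly_logs",         # illustrative BigQuery dataset ID
    "table": "requests",              # illustrative BigQuery table ID
    "project_id": "my-gcp-project",   # illustrative GCP project ID
    # Empty-string fallbacks keep the sketch self-contained when the
    # variables are unset; the provider itself reads these automatically.
    "email": os.environ.get("FASTLY_BQ_EMAIL", ""),
    "secret_key": os.environ.get("FASTLY_BQ_SECRET_KEY", ""),
}
```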
@pulumi.input_type
class Servicev1BlobstorageloggingArgs:
def __init__(__self__, *,
account_name: pulumi.Input[str],
container: pulumi.Input[str],
name: pulumi.Input[str],
sas_token: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
file_max_bytes: Optional[pulumi.Input[int]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
public_key: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] account_name: The unique Azure Blob Storage namespace in which your data objects are stored
:param pulumi.Input[str] container: The name of the Azure Blob Storage container in which to store logs
:param pulumi.Input[str] name: A unique name to identify the Azure Blob Storage endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] sas_token: The Azure shared access signature providing write access to the blob service objects. Be sure to update your token before it expires or the logging functionality will not work
        :param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[int] file_max_bytes: Maximum size of an uploaded log file, if non-zero.
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting (default: `%h %l %u %t "%r" %>s %b`)
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (default: 2)
:param pulumi.Input[int] gzip_level: Level of Gzip compression from `0-9`. `0` means no compression. `1` is the fastest and the least compressed version, `9` is the slowest and the most compressed version. Default `0`
:param pulumi.Input[str] message_type: How the message should be formatted. Can be either `classic`, `loggly`, `logplex` or `blank`. Default `classic`
:param pulumi.Input[str] path: The path to upload logs to. Must end with a trailing slash. If this field is left empty, the files will be saved in the container's root path
:param pulumi.Input[int] period: How frequently the logs should be transferred in seconds. Default `3600`
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] response_condition: The name of the condition to apply
:param pulumi.Input[str] timestamp_format: `strftime` specified timestamp formatting. Default `%Y-%m-%dT%H:%M:%S.000`
"""
pulumi.set(__self__, "account_name", account_name)
pulumi.set(__self__, "container", container)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "sas_token", sas_token)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if file_max_bytes is not None:
pulumi.set(__self__, "file_max_bytes", file_max_bytes)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="accountName")
def account_name(self) -> pulumi.Input[str]:
"""
The unique Azure Blob Storage namespace in which your data objects are stored
"""
return pulumi.get(self, "account_name")
@account_name.setter
def account_name(self, value: pulumi.Input[str]):
pulumi.set(self, "account_name", value)
@property
@pulumi.getter
def container(self) -> pulumi.Input[str]:
"""
The name of the Azure Blob Storage container in which to store logs
"""
return pulumi.get(self, "container")
@container.setter
def container(self, value: pulumi.Input[str]):
pulumi.set(self, "container", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify the Azure Blob Storage endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="sasToken")
def sas_token(self) -> pulumi.Input[str]:
"""
The Azure shared access signature providing write access to the blob service objects. Be sure to update your token before it expires or the logging functionality will not work
"""
return pulumi.get(self, "sas_token")
@sas_token.setter
def sas_token(self, value: pulumi.Input[str]):
pulumi.set(self, "sas_token", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
        The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter(name="fileMaxBytes")
def file_max_bytes(self) -> Optional[pulumi.Input[int]]:
"""
Maximum size of an uploaded log file, if non-zero.
"""
return pulumi.get(self, "file_max_bytes")
@file_max_bytes.setter
def file_max_bytes(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "file_max_bytes", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting (default: `%h %l %u %t "%r" %>s %b`)
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (default: 2)
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
Level of Gzip compression from `0-9`. `0` means no compression. `1` is the fastest and the least compressed version, `9` is the slowest and the most compressed version. Default `0`
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted. Can be either `classic`, `loggly`, `logplex` or `blank`. Default `classic`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
The path to upload logs to. Must end with a trailing slash. If this field is left empty, the files will be saved in the container's root path
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred in seconds. Default `3600`
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of the condition to apply
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
`strftime` specified timestamp formatting. Default `%Y-%m-%dT%H:%M:%S.000`
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
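# Hedged helper (hypothetical, not part of the SDK): a small check that
# mirrors the documented API constraint for Servicev1BlobstorageloggingArgs,
# namely that compression_codec and gzip_level must not both be set in the
# same request.

```python
def validate_compression(compression_codec=None, gzip_level=None):
    # Specifying both fields in the same API request results in an error,
    # so reject that combination up front.
    if compression_codec is not None and gzip_level is not None:
        raise ValueError("specify either compression_codec or gzip_level, not both")
```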
@pulumi.input_type
class Servicev1CacheSettingArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
action: Optional[pulumi.Input[str]] = None,
cache_condition: Optional[pulumi.Input[str]] = None,
stale_ttl: Optional[pulumi.Input[int]] = None,
ttl: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] name: Unique name for this Cache Setting. It is important to note that changing this attribute will delete and recreate the resource
        :param pulumi.Input[str] action: One of `cache`, `pass`, or `restart`, as defined in Fastly's documentation under "[Caching action descriptions](https://docs.fastly.com/en/guides/controlling-caching#caching-action-descriptions)"
:param pulumi.Input[str] cache_condition: Name of already defined `condition` used to test whether this settings object should be used. This `condition` must be of type `CACHE`
:param pulumi.Input[int] stale_ttl: Max "Time To Live" for stale (unreachable) objects
:param pulumi.Input[int] ttl: The Time-To-Live (TTL) for the object
"""
pulumi.set(__self__, "name", name)
if action is not None:
pulumi.set(__self__, "action", action)
if cache_condition is not None:
pulumi.set(__self__, "cache_condition", cache_condition)
if stale_ttl is not None:
pulumi.set(__self__, "stale_ttl", stale_ttl)
if ttl is not None:
pulumi.set(__self__, "ttl", ttl)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Unique name for this Cache Setting. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def action(self) -> Optional[pulumi.Input[str]]:
"""
        One of `cache`, `pass`, or `restart`, as defined in Fastly's documentation under "[Caching action descriptions](https://docs.fastly.com/en/guides/controlling-caching#caching-action-descriptions)"
"""
return pulumi.get(self, "action")
@action.setter
def action(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "action", value)
@property
@pulumi.getter(name="cacheCondition")
def cache_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of already defined `condition` used to test whether this settings object should be used. This `condition` must be of type `CACHE`
"""
return pulumi.get(self, "cache_condition")
@cache_condition.setter
def cache_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cache_condition", value)
@property
@pulumi.getter(name="staleTtl")
def stale_ttl(self) -> Optional[pulumi.Input[int]]:
"""
Max "Time To Live" for stale (unreachable) objects
"""
return pulumi.get(self, "stale_ttl")
@stale_ttl.setter
def stale_ttl(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "stale_ttl", value)
@property
@pulumi.getter
def ttl(self) -> Optional[pulumi.Input[int]]:
"""
The Time-To-Live (TTL) for the object
"""
return pulumi.get(self, "ttl")
@ttl.setter
def ttl(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "ttl", value)
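# Illustrative sketch (not generated code): Servicev1CacheSettingArgs keyword
# arguments that force a pass for requests matching a hypothetical CACHE-type
# condition named "bypass-api". Names and TTL values are illustrative.

```python
cache_setting_kwargs = {
    "name": "api-bypass",             # illustrative cache-setting name
    "action": "pass",                 # one of cache, pass, or restart
    "cache_condition": "bypass-api",  # must name an existing CACHE condition
    "ttl": 0,                         # do not cache matching objects
    "stale_ttl": 0,                   # no stale serving for matching objects
}
```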
@pulumi.input_type
class Servicev1ConditionArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
statement: pulumi.Input[str],
type: pulumi.Input[str],
priority: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] name: The unique name for the condition. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] statement: The statement used to determine if the condition is met
:param pulumi.Input[str] type: Type of condition, either `REQUEST` (req), `RESPONSE` (req, resp), or `CACHE` (req, beresp)
:param pulumi.Input[int] priority: A number used to determine the order in which multiple conditions execute. Lower numbers execute first. Default `10`
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "statement", statement)
pulumi.set(__self__, "type", type)
if priority is not None:
pulumi.set(__self__, "priority", priority)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name for the condition. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def statement(self) -> pulumi.Input[str]:
"""
The statement used to determine if the condition is met
"""
return pulumi.get(self, "statement")
@statement.setter
def statement(self, value: pulumi.Input[str]):
pulumi.set(self, "statement", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[str]:
"""
Type of condition, either `REQUEST` (req), `RESPONSE` (req, resp), or `CACHE` (req, beresp)
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[str]):
pulumi.set(self, "type", value)
@property
@pulumi.getter
def priority(self) -> Optional[pulumi.Input[int]]:
"""
A number used to determine the order in which multiple conditions execute. Lower numbers execute first. Default `10`
"""
return pulumi.get(self, "priority")
@priority.setter
def priority(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "priority", value)
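# Hedged sketch of how condition priority ordering works: lower numbers
# execute first, with 10 used as the default when priority is omitted. The
# condition names and statements below are hypothetical.

```python
conditions = [
    {"name": "late", "type": "REQUEST", "statement": 'req.url ~ "^/blog"', "priority": 20},
    {"name": "early", "type": "REQUEST", "statement": 'req.url ~ "^/api"', "priority": 5},
    {"name": "default", "type": "REQUEST", "statement": 'req.url ~ "^/static"'},
]
# Sort by priority, falling back to the documented default of 10.
execution_order = sorted(conditions, key=lambda c: c.get("priority", 10))
```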
@pulumi.input_type
class Servicev1DictionaryArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
dictionary_id: Optional[pulumi.Input[str]] = None,
force_destroy: Optional[pulumi.Input[bool]] = None,
write_only: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] name: A unique name to identify this dictionary. It is important to note that changing this attribute will delete and recreate the dictionary, and discard the current items in the dictionary
:param pulumi.Input[str] dictionary_id: The ID of the dictionary
:param pulumi.Input[bool] force_destroy: Allow the dictionary to be deleted, even if it contains entries. Defaults to false.
        :param pulumi.Input[bool] write_only: If `true`, the dictionary is a [private dictionary](https://docs.fastly.com/en/guides/private-dictionaries). Default is `false`. Please note that changing this attribute will delete and recreate the dictionary, and discard the current items in the dictionary. The `Servicev1` resource will only manage the dictionary object itself; items under private dictionaries cannot be managed using the `ServiceDictionaryItemsv1` resource. Therefore, a write-only/private dictionary should only be used if its items are managed outside of this provider.
"""
pulumi.set(__self__, "name", name)
if dictionary_id is not None:
pulumi.set(__self__, "dictionary_id", dictionary_id)
if force_destroy is not None:
pulumi.set(__self__, "force_destroy", force_destroy)
if write_only is not None:
pulumi.set(__self__, "write_only", write_only)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this dictionary. It is important to note that changing this attribute will delete and recreate the dictionary, and discard the current items in the dictionary
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="dictionaryId")
def dictionary_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the dictionary
"""
return pulumi.get(self, "dictionary_id")
@dictionary_id.setter
def dictionary_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dictionary_id", value)
@property
@pulumi.getter(name="forceDestroy")
def force_destroy(self) -> Optional[pulumi.Input[bool]]:
"""
Allow the dictionary to be deleted, even if it contains entries. Defaults to false.
"""
return pulumi.get(self, "force_destroy")
@force_destroy.setter
def force_destroy(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_destroy", value)
@property
@pulumi.getter(name="writeOnly")
def write_only(self) -> Optional[pulumi.Input[bool]]:
"""
If `true`, the dictionary is a [private dictionary](https://docs.fastly.com/en/guides/private-dictionaries). Default is `false`. Please note that changing this attribute will delete and recreate the dictionary, and discard the current items in the dictionary. The `Servicev1` resource will only manage the dictionary object itself; items under private dictionaries cannot be managed using the `ServiceDictionaryItemsv1` resource. Therefore, a write-only/private dictionary should only be used if the items are managed outside of this provider.
"""
return pulumi.get(self, "write_only")
@write_only.setter
def write_only(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "write_only", value)
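The `__init__` methods above follow a consistent convention: required fields are always recorded, while optional fields are stored only when a non-`None` value is supplied. A minimal stdlib-only sketch of that convention (a plain dict stands in for `pulumi.set`; the function name and values are hypothetical):

```python
# Mirror of the "set only when provided" pattern used by these Args classes.
def build_dictionary_args(name, dictionary_id=None, force_destroy=None, write_only=None):
    args = {"name": name}  # required argument is always present
    optional = {
        "dictionary_id": dictionary_id,
        "force_destroy": force_destroy,
        "write_only": write_only,
    }
    for key, value in optional.items():
        if value is not None:  # unset optionals are omitted entirely
            args[key] = value
    return args

print(build_dictionary_args("session_store", force_destroy=True))
```

Note that an explicit `False` is still recorded, since the check is `is not None` rather than truthiness.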
@pulumi.input_type
class Servicev1DirectorArgs:
def __init__(__self__, *,
backends: pulumi.Input[Sequence[pulumi.Input[str]]],
name: pulumi.Input[str],
capacity: Optional[pulumi.Input[int]] = None,
comment: Optional[pulumi.Input[str]] = None,
quorum: Optional[pulumi.Input[int]] = None,
retries: Optional[pulumi.Input[int]] = None,
shield: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[Sequence[pulumi.Input[str]]] backends: Names of defined backends to map the director to. Example: `[ "origin1", "origin2" ]`
:param pulumi.Input[str] name: Unique name for this Director. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[int] capacity: Load balancing weight for the backends. Default `100`
:param pulumi.Input[str] comment: An optional comment about the Director
:param pulumi.Input[int] quorum: Percentage of capacity that needs to be up for the director itself to be considered up. Default `75`
:param pulumi.Input[int] retries: How many backends to search if a backend fails. Default `5`
:param pulumi.Input[str] shield: Selected POP to serve as a "shield" for backends. Valid values for `shield` are included in the [`GET /datacenters`](https://developer.fastly.com/reference/api/utils/datacenter/) API response
:param pulumi.Input[int] type: Type of load balance group to use. Integer, 1 to 4. Values: `1` (random), `3` (hash), `4` (client). Default `1`
"""
pulumi.set(__self__, "backends", backends)
pulumi.set(__self__, "name", name)
if capacity is not None:
pulumi.set(__self__, "capacity", capacity)
if comment is not None:
pulumi.set(__self__, "comment", comment)
if quorum is not None:
pulumi.set(__self__, "quorum", quorum)
if retries is not None:
pulumi.set(__self__, "retries", retries)
if shield is not None:
pulumi.set(__self__, "shield", shield)
if type is not None:
pulumi.set(__self__, "type", type)
@property
@pulumi.getter
def backends(self) -> pulumi.Input[Sequence[pulumi.Input[str]]]:
"""
Names of defined backends to map the director to. Example: `[ "origin1", "origin2" ]`
"""
return pulumi.get(self, "backends")
@backends.setter
def backends(self, value: pulumi.Input[Sequence[pulumi.Input[str]]]):
pulumi.set(self, "backends", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Unique name for this Director. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def capacity(self) -> Optional[pulumi.Input[int]]:
"""
Load balancing weight for the backends. Default `100`
"""
return pulumi.get(self, "capacity")
@capacity.setter
def capacity(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "capacity", value)
@property
@pulumi.getter
def comment(self) -> Optional[pulumi.Input[str]]:
"""
An optional comment about the Director
"""
return pulumi.get(self, "comment")
@comment.setter
def comment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "comment", value)
@property
@pulumi.getter
def quorum(self) -> Optional[pulumi.Input[int]]:
"""
Percentage of capacity that needs to be up for the director itself to be considered up. Default `75`
"""
return pulumi.get(self, "quorum")
@quorum.setter
def quorum(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "quorum", value)
@property
@pulumi.getter
def retries(self) -> Optional[pulumi.Input[int]]:
"""
How many backends to search if a backend fails. Default `5`
"""
return pulumi.get(self, "retries")
@retries.setter
def retries(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "retries", value)
@property
@pulumi.getter
def shield(self) -> Optional[pulumi.Input[str]]:
"""
Selected POP to serve as a "shield" for backends. Valid values for `shield` are included in the [`GET /datacenters`](https://developer.fastly.com/reference/api/utils/datacenter/) API response
"""
return pulumi.get(self, "shield")
@shield.setter
def shield(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "shield", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[int]]:
"""
Type of load balance group to use. Integer, 1 to 4. Values: `1` (random), `3` (hash), `4` (client). Default `1`
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "type", value)
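The `type` docstring above enumerates integer codes for load-balance groups; a hedged helper mapping them to their documented meanings (the lookup function is illustrative, not part of this SDK):

```python
# Documented director type codes: 1 = random, 3 = hash, 4 = client.
DIRECTOR_TYPES = {1: "random", 3: "hash", 4: "client"}

def director_type_name(code: int) -> str:
    """Translate a director `type` code to its documented meaning."""
    try:
        return DIRECTOR_TYPES[code]
    except KeyError:
        raise ValueError(f"unsupported director type: {code}") from None

print(director_type_name(3))
```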
@pulumi.input_type
class Servicev1DomainArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
comment: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The domain that this Service will respond to. It is important to note that changing this attribute will delete and recreate the resource.
:param pulumi.Input[str] comment: An optional comment about the Domain.
"""
pulumi.set(__self__, "name", name)
if comment is not None:
pulumi.set(__self__, "comment", comment)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The domain that this Service will respond to. It is important to note that changing this attribute will delete and recreate the resource.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def comment(self) -> Optional[pulumi.Input[str]]:
"""
An optional comment about the Domain.
"""
return pulumi.get(self, "comment")
@comment.setter
def comment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "comment", value)
@pulumi.input_type
class Servicev1DynamicsnippetArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
type: pulumi.Input[str],
priority: Optional[pulumi.Input[int]] = None,
snippet_id: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: A name that is unique across "regular" and "dynamic" VCL Snippet configuration blocks. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] type: The location in generated VCL where the snippet should be placed (can be one of `init`, `recv`, `hit`, `miss`, `pass`, `fetch`, `error`, `deliver`, `log` or `none`)
:param pulumi.Input[int] priority: Priority determines the ordering for multiple snippets. Lower numbers execute first. Defaults to `100`
:param pulumi.Input[str] snippet_id: The ID of the dynamic snippet
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "type", type)
if priority is not None:
pulumi.set(__self__, "priority", priority)
if snippet_id is not None:
pulumi.set(__self__, "snippet_id", snippet_id)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A name that is unique across "regular" and "dynamic" VCL Snippet configuration blocks. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[str]:
"""
The location in generated VCL where the snippet should be placed (can be one of `init`, `recv`, `hit`, `miss`, `pass`, `fetch`, `error`, `deliver`, `log` or `none`)
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[str]):
pulumi.set(self, "type", value)
@property
@pulumi.getter
def priority(self) -> Optional[pulumi.Input[int]]:
"""
Priority determines the ordering for multiple snippets. Lower numbers execute first. Defaults to `100`
"""
return pulumi.get(self, "priority")
@priority.setter
def priority(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "priority", value)
@property
@pulumi.getter(name="snippetId")
def snippet_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the dynamic snippet
"""
return pulumi.get(self, "snippet_id")
@snippet_id.setter
def snippet_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "snippet_id", value)
@pulumi.input_type
class Servicev1GcsloggingArgs:
def __init__(__self__, *,
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
email: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
secret_key: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] bucket_name: The name of the bucket in which to store the logs
:param pulumi.Input[str] name: A unique name to identify this GCS endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] email: The email address associated with the target GCS bucket on your account. You may optionally provide this secret via an environment variable, `FASTLY_GCS_EMAIL`
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting
:param pulumi.Input[int] gzip_level: Level of Gzip compression, from `0-9`. `0` is no compression. `1` is fastest and least compressed, `9` is slowest and most compressed. Default `0`
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. [Fastly Documentation](https://developer.fastly.com/reference/api/logging/gcs/)
:param pulumi.Input[str] path: Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
:param pulumi.Input[int] period: How frequently the logs should be transferred, in seconds (Default 3600)
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[str] response_condition: Name of a condition to apply this logging.
:param pulumi.Input[str] secret_key: The secret key associated with the target gcs bucket on your account. You may optionally provide this secret via an environment variable, `FASTLY_GCS_SECRET_KEY`. A typical format for the key is PEM format, containing actual newline characters where required
:param pulumi.Input[str] timestamp_format: The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if email is not None:
pulumi.set(__self__, "email", email)
if format is not None:
pulumi.set(__self__, "format", format)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if secret_key is not None:
pulumi.set(__self__, "secret_key", secret_key)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of the bucket in which to store the logs
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this GCS endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def email(self) -> Optional[pulumi.Input[str]]:
"""
The email address associated with the target GCS bucket on your account. You may optionally provide this secret via an environment variable, `FASTLY_GCS_EMAIL`
"""
return pulumi.get(self, "email")
@email.setter
def email(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "email", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
Level of Gzip compression, from `0-9`. `0` is no compression. `1` is fastest and least compressed, `9` is slowest and most compressed. Default `0`
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. [Fastly Documentation](https://developer.fastly.com/reference/api/logging/gcs/)
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred, in seconds (Default 3600)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of a condition to apply this logging.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> Optional[pulumi.Input[str]]:
"""
The secret key associated with the target gcs bucket on your account. You may optionally provide this secret via an environment variable, `FASTLY_GCS_SECRET_KEY`. A typical format for the key is PEM format, containing actual newline characters where required
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
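The `timestamp_format` default documented above is a `strftime`-style pattern; a quick stdlib check of what it renders, independent of pulumi (the example datetime is arbitrary):

```python
from datetime import datetime

# Render the documented default pattern; the ".000" is a literal suffix,
# not a strftime directive.
DEFAULT_PATTERN = "%Y-%m-%dT%H:%M:%S.000"
stamp = datetime(2024, 1, 2, 3, 4, 5).strftime(DEFAULT_PATTERN)
print(stamp)
```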
@pulumi.input_type
class Servicev1GzipArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
cache_condition: Optional[pulumi.Input[str]] = None,
content_types: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
extensions: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
:param pulumi.Input[str] name: A name to refer to this gzip condition. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] cache_condition: Name of already defined `condition` controlling when this gzip configuration applies. This `condition` must be of type `CACHE`. For detailed information about Conditionals, see [Fastly's Documentation on Conditionals](https://docs.fastly.com/en/guides/using-conditions)
:param pulumi.Input[Sequence[pulumi.Input[str]]] content_types: The content-type for each type of content you wish to have dynamically gzipped. Example: `["text/html", "text/css"]`
:param pulumi.Input[Sequence[pulumi.Input[str]]] extensions: File extensions for each file type to dynamically gzip. Example: `["css", "js"]`
"""
pulumi.set(__self__, "name", name)
if cache_condition is not None:
pulumi.set(__self__, "cache_condition", cache_condition)
if content_types is not None:
pulumi.set(__self__, "content_types", content_types)
if extensions is not None:
pulumi.set(__self__, "extensions", extensions)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A name to refer to this gzip condition. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="cacheCondition")
def cache_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of already defined `condition` controlling when this gzip configuration applies. This `condition` must be of type `CACHE`. For detailed information about Conditionals, see [Fastly's Documentation on Conditionals](https://docs.fastly.com/en/guides/using-conditions)
"""
return pulumi.get(self, "cache_condition")
@cache_condition.setter
def cache_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cache_condition", value)
@property
@pulumi.getter(name="contentTypes")
def content_types(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The content-type for each type of content you wish to have dynamically gzipped. Example: `["text/html", "text/css"]`
"""
return pulumi.get(self, "content_types")
@content_types.setter
def content_types(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "content_types", value)
@property
@pulumi.getter
def extensions(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
File extensions for each file type to dynamically gzip. Example: `["css", "js"]`
"""
return pulumi.get(self, "extensions")
@extensions.setter
def extensions(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "extensions", value)
@pulumi.input_type
class Servicev1HeaderArgs:
def __init__(__self__, *,
action: pulumi.Input[str],
destination: pulumi.Input[str],
name: pulumi.Input[str],
type: pulumi.Input[str],
cache_condition: Optional[pulumi.Input[str]] = None,
ignore_if_set: Optional[pulumi.Input[bool]] = None,
priority: Optional[pulumi.Input[int]] = None,
regex: Optional[pulumi.Input[str]] = None,
request_condition: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
source: Optional[pulumi.Input[str]] = None,
substitution: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] action: The Header manipulation action to take; must be one of `set`, `append`, `delete`, `regex`, or `regex_repeat`
:param pulumi.Input[str] destination: The name of the header that is going to be affected by the Action
:param pulumi.Input[str] name: Unique name for this header attribute. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] type: The Request type on which to apply the selected Action; must be one of `request`, `fetch`, `cache` or `response`
:param pulumi.Input[str] cache_condition: Name of already defined `condition` to apply. This `condition` must be of type `CACHE`
:param pulumi.Input[bool] ignore_if_set: Don't add the header if it is already set. (Only applies to the `set` action.) Default `false`
:param pulumi.Input[int] priority: Lower priorities execute first. Default: `100`
:param pulumi.Input[str] regex: Regular expression to use (Only applies to `regex` and `regex_repeat` actions.)
:param pulumi.Input[str] request_condition: Name of already defined `condition` to apply. This `condition` must be of type `REQUEST`
:param pulumi.Input[str] response_condition: Name of already defined `condition` to apply. This `condition` must be of type `RESPONSE`. For detailed information about Conditionals, see [Fastly's Documentation on Conditionals](https://docs.fastly.com/en/guides/using-conditions)
:param pulumi.Input[str] source: Variable to be used as a source for the header content (Does not apply to `delete` action.)
:param pulumi.Input[str] substitution: Value to substitute in place of regular expression. (Only applies to `regex` and `regex_repeat`.)
"""
pulumi.set(__self__, "action", action)
pulumi.set(__self__, "destination", destination)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "type", type)
if cache_condition is not None:
pulumi.set(__self__, "cache_condition", cache_condition)
if ignore_if_set is not None:
pulumi.set(__self__, "ignore_if_set", ignore_if_set)
if priority is not None:
pulumi.set(__self__, "priority", priority)
if regex is not None:
pulumi.set(__self__, "regex", regex)
if request_condition is not None:
pulumi.set(__self__, "request_condition", request_condition)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if source is not None:
pulumi.set(__self__, "source", source)
if substitution is not None:
pulumi.set(__self__, "substitution", substitution)
@property
@pulumi.getter
def action(self) -> pulumi.Input[str]:
"""
The Header manipulation action to take; must be one of `set`, `append`, `delete`, `regex`, or `regex_repeat`
"""
return pulumi.get(self, "action")
@action.setter
def action(self, value: pulumi.Input[str]):
pulumi.set(self, "action", value)
@property
@pulumi.getter
def destination(self) -> pulumi.Input[str]:
"""
The name of the header that is going to be affected by the Action
"""
return pulumi.get(self, "destination")
@destination.setter
def destination(self, value: pulumi.Input[str]):
pulumi.set(self, "destination", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Unique name for this header attribute. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[str]:
"""
The Request type on which to apply the selected Action; must be one of `request`, `fetch`, `cache` or `response`
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[str]):
pulumi.set(self, "type", value)
@property
@pulumi.getter(name="cacheCondition")
def cache_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of already defined `condition` to apply. This `condition` must be of type `CACHE`
"""
return pulumi.get(self, "cache_condition")
@cache_condition.setter
def cache_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cache_condition", value)
@property
@pulumi.getter(name="ignoreIfSet")
def ignore_if_set(self) -> Optional[pulumi.Input[bool]]:
"""
Don't add the header if it is already set. (Only applies to the `set` action.) Default `false`
"""
return pulumi.get(self, "ignore_if_set")
@ignore_if_set.setter
def ignore_if_set(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "ignore_if_set", value)
@property
@pulumi.getter
def priority(self) -> Optional[pulumi.Input[int]]:
"""
Lower priorities execute first. Default: `100`
"""
return pulumi.get(self, "priority")
@priority.setter
def priority(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "priority", value)
@property
@pulumi.getter
def regex(self) -> Optional[pulumi.Input[str]]:
"""
Regular expression to use (Only applies to `regex` and `regex_repeat` actions.)
"""
return pulumi.get(self, "regex")
@regex.setter
def regex(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "regex", value)
@property
@pulumi.getter(name="requestCondition")
def request_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of already defined `condition` to apply. This `condition` must be of type `REQUEST`
"""
return pulumi.get(self, "request_condition")
@request_condition.setter
def request_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "request_condition", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of already defined `condition` to apply. This `condition` must be of type `RESPONSE`. For detailed information about Conditionals, see [Fastly's Documentation on Conditionals](https://docs.fastly.com/en/guides/using-conditions)
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter
def source(self) -> Optional[pulumi.Input[str]]:
"""
Variable to be used as a source for the header content (Does not apply to `delete` action.)
"""
return pulumi.get(self, "source")
@source.setter
def source(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source", value)
@property
@pulumi.getter
def substitution(self) -> Optional[pulumi.Input[str]]:
"""
Value to substitute in place of regular expression. (Only applies to `regex` and `regex_repeat`.)
"""
return pulumi.get(self, "substitution")
@substitution.setter
def substitution(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "substitution", value)
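Several of these blocks (`priority` on conditions, snippets, and headers) share the documented rule that lower numbers execute first; a sketch of that ordering with hypothetical header entries:

```python
# Sort header manipulations the way the docs describe: lower priority first.
headers = [
    {"name": "strip-cookie", "priority": 100},  # the documented default
    {"name": "set-host", "priority": 10},
    {"name": "rewrite-path", "priority": 50},
]
ordered = sorted(headers, key=lambda h: h["priority"])
print([h["name"] for h in ordered])
```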
@pulumi.input_type
class Servicev1HealthcheckArgs:
def __init__(__self__, *,
host: pulumi.Input[str],
name: pulumi.Input[str],
path: pulumi.Input[str],
check_interval: Optional[pulumi.Input[int]] = None,
expected_response: Optional[pulumi.Input[int]] = None,
http_version: Optional[pulumi.Input[str]] = None,
initial: Optional[pulumi.Input[int]] = None,
method: Optional[pulumi.Input[str]] = None,
threshold: Optional[pulumi.Input[int]] = None,
timeout: Optional[pulumi.Input[int]] = None,
window: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] host: The Host header to send for this Healthcheck
:param pulumi.Input[str] name: A unique name to identify this Healthcheck. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] path: The path to check
:param pulumi.Input[int] check_interval: How often to run the Healthcheck in milliseconds. Default `5000`
:param pulumi.Input[int] expected_response: The status code expected from the host. Default `200`
:param pulumi.Input[str] http_version: Whether to use HTTP version 1.0 or 1.1. Default `1.1`
:param pulumi.Input[int] initial: When loading a config, the initial number of probes to be seen as OK. Default `3`
:param pulumi.Input[str] method: Which HTTP method to use. Default `HEAD`
:param pulumi.Input[int] threshold: How many Healthchecks must succeed to be considered healthy. Default `3`
:param pulumi.Input[int] timeout: Timeout in milliseconds. Default `500`
:param pulumi.Input[int] window: The number of most recent Healthcheck queries to keep for this Healthcheck. Default `5`
"""
pulumi.set(__self__, "host", host)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "path", path)
if check_interval is not None:
pulumi.set(__self__, "check_interval", check_interval)
if expected_response is not None:
pulumi.set(__self__, "expected_response", expected_response)
if http_version is not None:
pulumi.set(__self__, "http_version", http_version)
if initial is not None:
pulumi.set(__self__, "initial", initial)
if method is not None:
pulumi.set(__self__, "method", method)
if threshold is not None:
pulumi.set(__self__, "threshold", threshold)
if timeout is not None:
pulumi.set(__self__, "timeout", timeout)
if window is not None:
pulumi.set(__self__, "window", window)
@property
@pulumi.getter
def host(self) -> pulumi.Input[str]:
"""
The Host header to send for this Healthcheck
"""
return pulumi.get(self, "host")
@host.setter
def host(self, value: pulumi.Input[str]):
pulumi.set(self, "host", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Healthcheck. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def path(self) -> pulumi.Input[str]:
"""
The path to check
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: pulumi.Input[str]):
pulumi.set(self, "path", value)
@property
@pulumi.getter(name="checkInterval")
def check_interval(self) -> Optional[pulumi.Input[int]]:
"""
How often to run the Healthcheck in milliseconds. Default `5000`
"""
return pulumi.get(self, "check_interval")
@check_interval.setter
def check_interval(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "check_interval", value)
@property
@pulumi.getter(name="expectedResponse")
def expected_response(self) -> Optional[pulumi.Input[int]]:
"""
The status code expected from the host. Default `200`
"""
return pulumi.get(self, "expected_response")
@expected_response.setter
def expected_response(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "expected_response", value)
@property
@pulumi.getter(name="httpVersion")
def http_version(self) -> Optional[pulumi.Input[str]]:
"""
Whether to use version 1.0 or 1.1 HTTP. Default `1.1`
"""
return pulumi.get(self, "http_version")
@http_version.setter
def http_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "http_version", value)
@property
@pulumi.getter
def initial(self) -> Optional[pulumi.Input[int]]:
"""
When loading a config, the initial number of probes to be seen as OK. Default `3`
"""
return pulumi.get(self, "initial")
@initial.setter
def initial(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "initial", value)
@property
@pulumi.getter
def method(self) -> Optional[pulumi.Input[str]]:
"""
Which HTTP method to use. Default `HEAD`
"""
return pulumi.get(self, "method")
@method.setter
def method(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "method", value)
@property
@pulumi.getter
def threshold(self) -> Optional[pulumi.Input[int]]:
"""
How many Healthchecks must succeed to be considered healthy. Default `3`
"""
return pulumi.get(self, "threshold")
@threshold.setter
def threshold(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "threshold", value)
@property
@pulumi.getter
def timeout(self) -> Optional[pulumi.Input[int]]:
"""
Timeout in milliseconds. Default `500`
"""
return pulumi.get(self, "timeout")
@timeout.setter
def timeout(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "timeout", value)
@property
@pulumi.getter
def window(self) -> Optional[pulumi.Input[int]]:
"""
The number of most recent Healthcheck queries to keep for this Healthcheck. Default `5`
"""
return pulumi.get(self, "window")
@window.setter
def window(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "window", value)
@pulumi.input_type
class Servicev1HttpsloggingArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
url: pulumi.Input[str],
content_type: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
header_name: Optional[pulumi.Input[str]] = None,
header_value: Optional[pulumi.Input[str]] = None,
json_format: Optional[pulumi.Input[str]] = None,
message_type: Optional[pulumi.Input[str]] = None,
method: Optional[pulumi.Input[str]] = None,
placement: Optional[pulumi.Input[str]] = None,
request_max_bytes: Optional[pulumi.Input[int]] = None,
request_max_entries: Optional[pulumi.Input[int]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the HTTPS logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] url: URL that log data will be sent to. Must use the https protocol
:param pulumi.Input[str] content_type: Value of the `Content-Type` header sent with the request
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (default: 2)
:param pulumi.Input[str] header_name: Custom header sent with the request
:param pulumi.Input[str] header_value: Value of the custom header sent with the request
:param pulumi.Input[str] json_format: Formats log entries as JSON. Can be either disabled (`0`), array of JSON (`1`), or newline-delimited JSON (`2`)
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `blank`
:param pulumi.Input[str] method: HTTP method used for request. Can be either `POST` or `PUT`. Default `POST`
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed
:param pulumi.Input[int] request_max_bytes: The maximum number of bytes sent in one request
:param pulumi.Input[int] request_max_entries: The maximum number of logs sent in one request
:param pulumi.Input[str] response_condition: The name of the condition to apply
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_hostname: Used during the TLS handshake to validate the certificate
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "url", url)
if content_type is not None:
pulumi.set(__self__, "content_type", content_type)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if header_name is not None:
pulumi.set(__self__, "header_name", header_name)
if header_value is not None:
pulumi.set(__self__, "header_value", header_value)
if json_format is not None:
pulumi.set(__self__, "json_format", json_format)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if method is not None:
pulumi.set(__self__, "method", method)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if request_max_bytes is not None:
pulumi.set(__self__, "request_max_bytes", request_max_bytes)
if request_max_entries is not None:
pulumi.set(__self__, "request_max_entries", request_max_entries)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the HTTPS logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
URL that log data will be sent to. Must use the https protocol
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter(name="contentType")
def content_type(self) -> Optional[pulumi.Input[str]]:
"""
Value of the `Content-Type` header sent with the request
"""
return pulumi.get(self, "content_type")
@content_type.setter
def content_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content_type", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (default: 2)
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="headerName")
def header_name(self) -> Optional[pulumi.Input[str]]:
"""
Custom header sent with the request
"""
return pulumi.get(self, "header_name")
@header_name.setter
def header_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "header_name", value)
@property
@pulumi.getter(name="headerValue")
def header_value(self) -> Optional[pulumi.Input[str]]:
"""
Value of the custom header sent with the request
"""
return pulumi.get(self, "header_value")
@header_value.setter
def header_value(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "header_value", value)
@property
@pulumi.getter(name="jsonFormat")
def json_format(self) -> Optional[pulumi.Input[str]]:
"""
Formats log entries as JSON. Can be either disabled (`0`), array of JSON (`1`), or newline-delimited JSON (`2`)
"""
return pulumi.get(self, "json_format")
@json_format.setter
def json_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "json_format", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `blank`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def method(self) -> Optional[pulumi.Input[str]]:
"""
HTTP method used for request. Can be either `POST` or `PUT`. Default `POST`
"""
return pulumi.get(self, "method")
@method.setter
def method(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "method", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="requestMaxBytes")
def request_max_bytes(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of bytes sent in one request
"""
return pulumi.get(self, "request_max_bytes")
@request_max_bytes.setter
def request_max_bytes(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_bytes", value)
@property
@pulumi.getter(name="requestMaxEntries")
def request_max_entries(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of logs sent in one request
"""
return pulumi.get(self, "request_max_entries")
@request_max_entries.setter
def request_max_entries(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_entries", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of the condition to apply
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Used during the TLS handshake to validate the certificate
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@pulumi.input_type
class Servicev1LogentryArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
use_tls: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Logentries logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: Use token-based authentication (https://logentries.com/doc/input-token/)
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (Default: 1)
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[int] port: The port number configured in Logentries
:param pulumi.Input[str] response_condition: The name of an existing condition to apply this logging.
:param pulumi.Input[bool] use_tls: Whether to use TLS for secure logging
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if port is not None:
pulumi.set(__self__, "port", port)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if use_tls is not None:
pulumi.set(__self__, "use_tls", use_tls)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Logentries logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
Use token-based authentication (https://logentries.com/doc/input-token/)
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (Default: 1)
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port number configured in Logentries
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition to apply this logging.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="useTls")
def use_tls(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to use TLS for secure logging
"""
return pulumi.get(self, "use_tls")
@use_tls.setter
def use_tls(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "use_tls", value)
@pulumi.input_type
class Servicev1LoggingCloudfileArgs:
def __init__(__self__, *,
access_key: pulumi.Input[str],
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
user: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
public_key: Optional[pulumi.Input[str]] = None,
region: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] access_key: Your Cloud File account access key
:param pulumi.Input[str] bucket_name: The name of your Cloud Files container
:param pulumi.Input[str] name: The unique name of the Rackspace Cloud Files logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] user: The username for your Cloud Files account
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] format: Apache-style log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[int] gzip_level: What level of GZIP encoding to have when dumping logs (default `0`, no compression)
:param pulumi.Input[str] message_type: How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
:param pulumi.Input[str] path: The path to upload logs to
:param pulumi.Input[int] period: How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] public_key: The PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] region: The region to stream logs to. One of: DFW (Dallas), ORD (Chicago), IAD (Northern Virginia), LON (London), SYD (Sydney), HKG (Hong Kong)
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
:param pulumi.Input[str] timestamp_format: The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "access_key", access_key)
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "user", user)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if region is not None:
pulumi.set(__self__, "region", region)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> pulumi.Input[str]:
"""
Your Cloud File account access key
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: pulumi.Input[str]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of your Cloud Files container
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Rackspace Cloud Files logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
The username for your Cloud Files account
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
What level of GZIP encoding to have when dumping logs (default `0`, no compression)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
The path to upload logs to
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
The PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The region to stream logs to. One of: DFW (Dallas), ORD (Chicago), IAD (Northern Virginia), LON (London), SYD (Sydney), HKG (Hong Kong)
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
@pulumi.input_type
class Servicev1LoggingDatadogArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
region: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Datadog logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The API key from your Datadog account
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[str] region: The region that log data will be sent to. One of `US` or `EU`. Defaults to `US` if undefined
:param pulumi.Input[str] response_condition: The name of the condition to apply.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if region is not None:
pulumi.set(__self__, "region", region)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Datadog logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The API key from your Datadog account
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The region that log data will be sent to. One of `US` or `EU`. Defaults to `US` if undefined
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of the condition to apply.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@pulumi.input_type
class Servicev1LoggingDigitaloceanArgs:
def __init__(__self__, *,
access_key: pulumi.Input[str],
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
secret_key: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
domain: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
public_key: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] access_key: Your DigitalOcean Spaces account access key
:param pulumi.Input[str] bucket_name: The name of the DigitalOcean Space
:param pulumi.Input[str] name: The unique name of the DigitalOcean Spaces logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] secret_key: Your DigitalOcean Spaces account secret key
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] domain: The domain of the DigitalOcean Spaces endpoint (default `nyc3.digitaloceanspaces.com`)
:param pulumi.Input[str] format: Apache-style log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
:param pulumi.Input[int] gzip_level: What level of Gzip encoding to have when dumping logs (default `0`, no compression)
:param pulumi.Input[str] message_type: How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
:param pulumi.Input[str] path: The path to upload logs to
:param pulumi.Input[int] period: How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
:param pulumi.Input[str] timestamp_format: `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "access_key", access_key)
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "secret_key", secret_key)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if domain is not None:
pulumi.set(__self__, "domain", domain)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> pulumi.Input[str]:
"""
Your DigitalOcean Spaces account access key
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: pulumi.Input[str]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of the DigitalOcean Space
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the DigitalOcean Spaces logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> pulumi.Input[str]:
"""
Your DigitalOcean Spaces account secret key
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: pulumi.Input[str]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def domain(self) -> Optional[pulumi.Input[str]]:
"""
The domain of the DigitalOcean Spaces endpoint (default `nyc3.digitaloceanspaces.com`)
"""
return pulumi.get(self, "domain")
@domain.setter
def domain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
What level of Gzip encoding to have when dumping logs (default `0`, no compression)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
The path to upload logs to
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
`strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
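# Illustrative sketch (not part of the generated SDK): constructing the args
# class above with its four required DigitalOcean Spaces fields plus an
# optional rotation period. All values below are placeholders.
#
#     do_logging = Servicev1LoggingDigitaloceanArgs(
#         access_key="SPACES_ACCESS_KEY",
#         bucket_name="my-fastly-logs",
#         name="do-spaces-endpoint",
#         secret_key="SPACES_SECRET_KEY",
#         period=300,  # finalize log files every 5 minutes (default 3600)
#     )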
@pulumi.input_type
class Servicev1LoggingElasticsearchArgs:
def __init__(__self__, *,
index: pulumi.Input[str],
name: pulumi.Input[str],
url: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
password: Optional[pulumi.Input[str]] = None,
pipeline: Optional[pulumi.Input[str]] = None,
placement: Optional[pulumi.Input[str]] = None,
request_max_bytes: Optional[pulumi.Input[int]] = None,
request_max_entries: Optional[pulumi.Input[int]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None,
user: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] index: The name of the Elasticsearch index to send documents (logs) to
:param pulumi.Input[str] name: The unique name of the Elasticsearch logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] url: The Elasticsearch URL to stream logs to
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
:param pulumi.Input[str] password: BasicAuth password for Elasticsearch
:param pulumi.Input[str] pipeline: The ID of the Elasticsearch ingest pipeline to apply pre-process transformations to before indexing
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[int] request_max_bytes: The maximum number of bytes sent in one request. Defaults to `0` for unbounded
:param pulumi.Input[int] request_max_entries: The maximum number of logs sent in one request. Defaults to `0` for unbounded
:param pulumi.Input[str] response_condition: The name of the condition to apply
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_hostname: The hostname used to verify the server's certificate. It can either be the Common Name (CN) or a Subject Alternative Name (SAN)
:param pulumi.Input[str] user: BasicAuth username for Elasticsearch
"""
pulumi.set(__self__, "index", index)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "url", url)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if password is not None:
pulumi.set(__self__, "password", password)
if pipeline is not None:
pulumi.set(__self__, "pipeline", pipeline)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if request_max_bytes is not None:
pulumi.set(__self__, "request_max_bytes", request_max_bytes)
if request_max_entries is not None:
pulumi.set(__self__, "request_max_entries", request_max_entries)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
if user is not None:
pulumi.set(__self__, "user", user)
@property
@pulumi.getter
def index(self) -> pulumi.Input[str]:
"""
The name of the Elasticsearch index to send documents (logs) to
"""
return pulumi.get(self, "index")
@index.setter
def index(self, value: pulumi.Input[str]):
pulumi.set(self, "index", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Elasticsearch logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The Elasticsearch URL to stream logs to
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
BasicAuth password for Elasticsearch
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def pipeline(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the Elasticsearch ingest pipeline to apply pre-process transformations to before indexing
"""
return pulumi.get(self, "pipeline")
@pipeline.setter
def pipeline(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "pipeline", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="requestMaxBytes")
def request_max_bytes(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of bytes sent in one request. Defaults to `0` for unbounded
"""
return pulumi.get(self, "request_max_bytes")
@request_max_bytes.setter
def request_max_bytes(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_bytes", value)
@property
@pulumi.getter(name="requestMaxEntries")
def request_max_entries(self) -> Optional[pulumi.Input[int]]:
"""
The maximum number of logs sent in one request. Defaults to `0` for unbounded
"""
return pulumi.get(self, "request_max_entries")
@request_max_entries.setter
def request_max_entries(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_entries", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of the condition to apply
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
The hostname used to verify the server's certificate. It can either be the Common Name (CN) or a Subject Alternative Name (SAN)
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@property
@pulumi.getter
def user(self) -> Optional[pulumi.Input[str]]:
"""
BasicAuth username for Elasticsearch
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user", value)
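# Illustrative sketch (not part of the generated SDK): an Elasticsearch
# endpoint with BasicAuth and a request-size cap. All values are placeholders.
#
#     es_logging = Servicev1LoggingElasticsearchArgs(
#         index="fastly-logs",
#         name="es-endpoint",
#         url="https://es.example.com:9243",
#         user="fastly",
#         password="CHANGE_ME",
#         request_max_bytes=10_000_000,  # cap each request at ~10 MB (0 = unbounded)
#     )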
@pulumi.input_type
class Servicev1LoggingFtpArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
password: pulumi.Input[str],
path: pulumi.Input[str],
user: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] address: The FTP address to stream logs to
:param pulumi.Input[str] name: The unique name of the FTP logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] password: The password for the server (for anonymous use an email address)
:param pulumi.Input[str] path: The path to upload log files to. If the path ends in `/` then it is treated as a directory
:param pulumi.Input[str] user: The username for the server (can be `anonymous`)
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
:param pulumi.Input[int] gzip_level: Gzip compression level (default `0`)
:param pulumi.Input[str] message_type: How the message should be formatted (default: `classic`)
:param pulumi.Input[int] period: How frequently the logs should be transferred, in seconds (Default `3600`)
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[int] port: The port number. Default: `21`
:param pulumi.Input[str] public_key: The PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] response_condition: The name of the condition to apply.
:param pulumi.Input[str] timestamp_format: `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "password", password)
pulumi.set(__self__, "path", path)
pulumi.set(__self__, "user", user)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if period is not None:
pulumi.set(__self__, "period", period)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if port is not None:
pulumi.set(__self__, "port", port)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
The FTP address to stream logs to
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the FTP logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def password(self) -> pulumi.Input[str]:
"""
The password for the server (for anonymous use an email address)
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: pulumi.Input[str]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def path(self) -> pulumi.Input[str]:
"""
The path to upload log files to. If the path ends in `/` then it is treated as a directory
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: pulumi.Input[str]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
The username for the server (can be `anonymous`)
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
Gzip compression level (default `0`)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted (default: `classic`)
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred, in seconds (Default `3600`)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port number. Default: `21`
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
The PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of the condition to apply.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
`strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
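# Illustrative sketch (not part of the generated SDK): an anonymous FTP
# endpoint. All values are placeholders.
#
#     ftp_logging = Servicev1LoggingFtpArgs(
#         address="ftp.example.com",
#         name="ftp-endpoint",
#         user="anonymous",
#         password="ops@example.com",  # anonymous logins use an email address
#         path="/fastly/logs/",        # trailing slash: treated as a directory
#         port=21,
#     )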
@pulumi.input_type
class Servicev1LoggingGooglepubsubArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
project_id: pulumi.Input[str],
secret_key: pulumi.Input[str],
topic: pulumi.Input[str],
user: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Google Cloud Pub/Sub logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] project_id: The ID of your Google Cloud Platform project
:param pulumi.Input[str] secret_key: Your Google Cloud Platform account secret key. The `private_key` field in your service account authentication JSON. You may optionally provide this secret via an environment variable, `FASTLY_GOOGLE_PUBSUB_SECRET_KEY`.
:param pulumi.Input[str] topic: The Google Cloud Pub/Sub topic to which logs will be published
:param pulumi.Input[str] user: Your Google Cloud Platform service account email address. The `client_email` field in your service account authentication JSON. You may optionally provide this via an environment variable, `FASTLY_GOOGLE_PUBSUB_EMAIL`.
:param pulumi.Input[str] format: Apache-style log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "project_id", project_id)
pulumi.set(__self__, "secret_key", secret_key)
pulumi.set(__self__, "topic", topic)
pulumi.set(__self__, "user", user)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Google Cloud Pub/Sub logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="projectId")
def project_id(self) -> pulumi.Input[str]:
"""
The ID of your Google Cloud Platform project
"""
return pulumi.get(self, "project_id")
@project_id.setter
def project_id(self, value: pulumi.Input[str]):
pulumi.set(self, "project_id", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> pulumi.Input[str]:
"""
Your Google Cloud Platform account secret key. The `private_key` field in your service account authentication JSON. You may optionally provide this secret via an environment variable, `FASTLY_GOOGLE_PUBSUB_SECRET_KEY`.
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: pulumi.Input[str]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter
def topic(self) -> pulumi.Input[str]:
"""
The Google Cloud Pub/Sub topic to which logs will be published
"""
return pulumi.get(self, "topic")
@topic.setter
def topic(self, value: pulumi.Input[str]):
pulumi.set(self, "topic", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
Your Google Cloud Platform service account email address. The `client_email` field in your service account authentication JSON. You may optionally provide this via an environment variable, `FASTLY_GOOGLE_PUBSUB_EMAIL`.
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
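# Illustrative sketch (not part of the generated SDK): a Google Cloud Pub/Sub
# endpoint fed from a service account's authentication JSON. All values are
# placeholders; GCP_PRIVATE_KEY is a hypothetical variable holding the key.
#
#     pubsub_logging = Servicev1LoggingGooglepubsubArgs(
#         name="pubsub-endpoint",
#         project_id="my-gcp-project",
#         topic="fastly-log-topic",
#         user="logger@my-gcp-project.iam.gserviceaccount.com",  # client_email field
#         secret_key=GCP_PRIVATE_KEY,  # private_key field from the same JSON
#     )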
@pulumi.input_type
class Servicev1LoggingHerokuArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
url: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Heroku logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The token to use for authentication (https://www.heroku.com/docs/customer-token-authentication-token/)
:param pulumi.Input[str] url: The URL to stream logs to
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2` (default: `2`).
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
pulumi.set(__self__, "url", url)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Heroku logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The token to use for authentication (https://devcenter.heroku.com/articles/add-on-partner-log-integration)
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The URL to stream logs to
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
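Every input type in this generated module follows the same storage pattern: `pulumi.set`/`pulumi.get` back each `@property` getter/setter pair, and optional arguments are only stored when non-`None`. A minimal standalone sketch of that pattern, using a plain dict in place of the pulumi runtime (the class and attribute names here are illustrative, not part of the SDK):

```python
from typing import Optional

class LoggingArgsSketch:
    """Dict-backed stand-in for an input type like Servicev1LoggingHerokuArgs."""
    def __init__(self, name: str, token: str, url: str,
                 format_version: Optional[int] = None):
        self._values = {}
        self._values["name"] = name
        self._values["token"] = token
        self._values["url"] = url
        if format_version is not None:  # optional args stored only when set
            self._values["format_version"] = format_version

    @property
    def format_version(self) -> Optional[int]:
        # Mirrors: return pulumi.get(self, "format_version")
        return self._values.get("format_version")

    @format_version.setter
    def format_version(self, value: Optional[int]) -> None:
        # Mirrors: pulumi.set(self, "format_version", value)
        self._values["format_version"] = value
```

In the real SDK, `pulumi.get`/`pulumi.set` additionally handle Input/Output wrapping and name translation; the dict here only models the name-to-value mapping.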
@pulumi.input_type
class Servicev1LoggingHoneycombArgs:
def __init__(__self__, *,
dataset: pulumi.Input[str],
name: pulumi.Input[str],
token: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] dataset: The Honeycomb Dataset you want to log to
:param pulumi.Input[str] name: The unique name of the Honeycomb logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The Write Key from the Account page of your Honeycomb account
:param pulumi.Input[str] format: Apache style log formatting. Your log must produce valid JSON that Honeycomb can ingest.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
pulumi.set(__self__, "dataset", dataset)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def dataset(self) -> pulumi.Input[str]:
"""
The Honeycomb Dataset you want to log to
"""
return pulumi.get(self, "dataset")
@dataset.setter
def dataset(self, value: pulumi.Input[str]):
pulumi.set(self, "dataset", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Honeycomb logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The Write Key from the Account page of your Honeycomb account
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache style log formatting. Your log must produce valid JSON that Honeycomb can ingest.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@pulumi.input_type
class Servicev1LoggingKafkaArgs:
def __init__(__self__, *,
brokers: pulumi.Input[str],
name: pulumi.Input[str],
topic: pulumi.Input[str],
auth_method: Optional[pulumi.Input[str]] = None,
compression_codec: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
parse_log_keyvals: Optional[pulumi.Input[bool]] = None,
password: Optional[pulumi.Input[str]] = None,
placement: Optional[pulumi.Input[str]] = None,
request_max_bytes: Optional[pulumi.Input[int]] = None,
required_acks: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None,
use_tls: Optional[pulumi.Input[bool]] = None,
user: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] brokers: A comma-separated list of IP addresses or hostnames of Kafka brokers
:param pulumi.Input[str] name: The unique name of the Kafka logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] topic: The Kafka topic to send logs to
:param pulumi.Input[str] auth_method: SASL authentication method. One of: `plain`, `scram-sha-256`, `scram-sha-512`
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. One of: `gzip`, `snappy`, `lz4`
:param pulumi.Input[str] format: Apache style log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[bool] parse_log_keyvals: Enables parsing of key=value tuples from the beginning of a logline, turning them into record headers
:param pulumi.Input[str] password: SASL Pass
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[int] request_max_bytes: Maximum size of log batch, if non-zero. Defaults to 0 for unbounded
:param pulumi.Input[str] required_acks: The number of acknowledgements a leader must receive before a write is considered successful. One of: `1` (default) One server needs to respond. `0` No servers need to respond. `-1` Wait for all in-sync replicas to respond
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format
:param pulumi.Input[str] tls_hostname: The hostname used to verify the server's certificate. It can either be the Common Name or a Subject Alternative Name (SAN)
:param pulumi.Input[bool] use_tls: Whether to use TLS for secure logging. Can be either `true` or `false`
:param pulumi.Input[str] user: SASL User
"""
pulumi.set(__self__, "brokers", brokers)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "topic", topic)
if auth_method is not None:
pulumi.set(__self__, "auth_method", auth_method)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if parse_log_keyvals is not None:
pulumi.set(__self__, "parse_log_keyvals", parse_log_keyvals)
if password is not None:
pulumi.set(__self__, "password", password)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if request_max_bytes is not None:
pulumi.set(__self__, "request_max_bytes", request_max_bytes)
if required_acks is not None:
pulumi.set(__self__, "required_acks", required_acks)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
if use_tls is not None:
pulumi.set(__self__, "use_tls", use_tls)
if user is not None:
pulumi.set(__self__, "user", user)
@property
@pulumi.getter
def brokers(self) -> pulumi.Input[str]:
"""
A comma-separated list of IP addresses or hostnames of Kafka brokers
"""
return pulumi.get(self, "brokers")
@brokers.setter
def brokers(self, value: pulumi.Input[str]):
pulumi.set(self, "brokers", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Kafka logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def topic(self) -> pulumi.Input[str]:
"""
The Kafka topic to send logs to
"""
return pulumi.get(self, "topic")
@topic.setter
def topic(self, value: pulumi.Input[str]):
pulumi.set(self, "topic", value)
@property
@pulumi.getter(name="authMethod")
def auth_method(self) -> Optional[pulumi.Input[str]]:
"""
SASL authentication method. One of: `plain`, `scram-sha-256`, `scram-sha-512`
"""
return pulumi.get(self, "auth_method")
@auth_method.setter
def auth_method(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "auth_method", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. One of: `gzip`, `snappy`, `lz4`
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache style log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="parseLogKeyvals")
def parse_log_keyvals(self) -> Optional[pulumi.Input[bool]]:
"""
Enables parsing of key=value tuples from the beginning of a logline, turning them into record headers
"""
return pulumi.get(self, "parse_log_keyvals")
@parse_log_keyvals.setter
def parse_log_keyvals(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "parse_log_keyvals", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
SASL Pass
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="requestMaxBytes")
def request_max_bytes(self) -> Optional[pulumi.Input[int]]:
"""
Maximum size of log batch, if non-zero. Defaults to 0 for unbounded
"""
return pulumi.get(self, "request_max_bytes")
@request_max_bytes.setter
def request_max_bytes(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "request_max_bytes", value)
@property
@pulumi.getter(name="requiredAcks")
def required_acks(self) -> Optional[pulumi.Input[str]]:
"""
The number of acknowledgements a leader must receive before a write is considered successful. One of: `1` (default) One server needs to respond. `0` No servers need to respond. `-1` Wait for all in-sync replicas to respond
"""
return pulumi.get(self, "required_acks")
@required_acks.setter
def required_acks(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "required_acks", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
The hostname used to verify the server's certificate. It can either be the Common Name or a Subject Alternative Name (SAN)
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@property
@pulumi.getter(name="useTls")
def use_tls(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to use TLS for secure logging. Can be either `true` or `false`
"""
return pulumi.get(self, "use_tls")
@use_tls.setter
def use_tls(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "use_tls", value)
@property
@pulumi.getter
def user(self) -> Optional[pulumi.Input[str]]:
"""
SASL User
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user", value)
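The Kafka docstrings above enumerate several closed value sets (`auth_method`, `compression_codec`, `required_acks`, `format_version`). The provider validates these server-side at deploy time; a client-side pre-check along these lines can catch typos earlier (the helper name and structure are illustrative, not part of the SDK):

```python
# Allowed values taken from the Servicev1LoggingKafkaArgs docstrings above.
_KAFKA_CHOICES = {
    "auth_method": {"plain", "scram-sha-256", "scram-sha-512"},
    "compression_codec": {"gzip", "snappy", "lz4"},
    "required_acks": {"1", "0", "-1"},
    "format_version": {1, 2},
}

def check_kafka_args(**kwargs) -> None:
    """Raise ValueError for any argument outside its documented value set."""
    for key, value in kwargs.items():
        allowed = _KAFKA_CHOICES.get(key)
        if allowed is not None and value is not None and value not in allowed:
            raise ValueError(
                f"{key}={value!r} not in {sorted(map(str, allowed))}")
```

A check like this only covers plain literal values; `pulumi.Input` values that are still unresolved Outputs would have to be validated inside an `apply`.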
@pulumi.input_type
class Servicev1LoggingKineseArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
topic: pulumi.Input[str],
access_key: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
iam_role: Optional[pulumi.Input[str]] = None,
placement: Optional[pulumi.Input[str]] = None,
region: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
secret_key: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Kinesis logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] topic: The Kinesis stream name
:param pulumi.Input[str] access_key: The AWS access key to be used to write to the stream
:param pulumi.Input[str] format: Apache style log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[str] iam_role: The Amazon Resource Name (ARN) for the IAM role granting Fastly access to Kinesis. Not required if `access_key` and `secret_key` are provided.
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] region: The AWS region the stream resides in. (Default: `us-east-1`)
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
:param pulumi.Input[str] secret_key: The AWS secret access key to authenticate with
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "topic", topic)
if access_key is not None:
pulumi.set(__self__, "access_key", access_key)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if iam_role is not None:
pulumi.set(__self__, "iam_role", iam_role)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if region is not None:
pulumi.set(__self__, "region", region)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if secret_key is not None:
pulumi.set(__self__, "secret_key", secret_key)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Kinesis logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def topic(self) -> pulumi.Input[str]:
"""
The Kinesis stream name
"""
return pulumi.get(self, "topic")
@topic.setter
def topic(self, value: pulumi.Input[str]):
pulumi.set(self, "topic", value)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> Optional[pulumi.Input[str]]:
"""
The AWS access key to be used to write to the stream
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache style log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="iamRole")
def iam_role(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon Resource Name (ARN) for the IAM role granting Fastly access to Kinesis. Not required if `access_key` and `secret_key` are provided.
"""
return pulumi.get(self, "iam_role")
@iam_role.setter
def iam_role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "iam_role", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The AWS region the stream resides in. (Default: `us-east-1`)
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> Optional[pulumi.Input[str]]:
"""
The AWS secret access key to authenticate with
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secret_key", value)
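The Kinesis docstring above notes that `iam_role` is not required when `access_key` and `secret_key` are both provided, which implies at least one complete credential mechanism must be present. A small pre-check for that condition (the helper is illustrative, not part of the SDK):

```python
from typing import Optional

def kinesis_auth_ok(iam_role: Optional[str],
                    access_key: Optional[str],
                    secret_key: Optional[str]) -> bool:
    """True when at least one complete credential mechanism is supplied:
    either an IAM role ARN, or a full access_key/secret_key pair."""
    has_role = iam_role is not None
    has_keys = access_key is not None and secret_key is not None
    return has_role or has_keys
```

A lone `access_key` without its `secret_key` (or vice versa) does not count as a complete pair, so the check fails in that case.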
@pulumi.input_type
class Servicev1LoggingLogglyArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Loggly logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The token to use for authentication (https://www.loggly.com/docs/customer-token-authentication-token/).
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Loggly logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The token to use for authentication (https://www.loggly.com/docs/customer-token-authentication-token/).
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@pulumi.input_type
class Servicev1LoggingLogshuttleArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
url: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Log Shuttle logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The data authentication token associated with this endpoint
:param pulumi.Input[str] url: Your Log Shuttle endpoint URL
:param pulumi.Input[str] format: Apache style log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
pulumi.set(__self__, "url", url)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Log Shuttle logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The data authentication token associated with this endpoint
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
Your Log Shuttle endpoint URL
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache style log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@pulumi.input_type
class Servicev1LoggingNewrelicArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
region: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the New Relic logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The Insert API key from the Account page of your New Relic account
:param pulumi.Input[str] format: Apache style log formatting. Your log must produce valid JSON that New Relic Logs can ingest.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] region: The region that log data will be sent to. Default: `US`
:param pulumi.Input[str] response_condition: The name of the condition to apply.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if region is not None:
pulumi.set(__self__, "region", region)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the New Relic logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The Insert API key from the Account page of your New Relic account
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache style log formatting. Your log must produce valid JSON that New Relic Logs can ingest.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The region that log data will be sent to. Default: `US`
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of the condition to apply.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
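# The constructor above records optional fields only when a value is supplied, so
# provider-side defaults (e.g. `region` = `US`, `format_version` = `2`) still apply.
# A minimal stand-in sketch of that pattern using a plain dict; the helper name is
# hypothetical and carries no pulumi dependency:

```python
# Hypothetical stand-in for Servicev1LoggingNewrelicArgs.__init__: required
# fields are always stored, optional fields only when explicitly given.
def build_newrelic_args(name, token, region=None, format_version=None):
    args = {"name": name, "token": token}   # required inputs
    if region is not None:                  # omitted -> provider default `US`
        args["region"] = region
    if format_version is not None:          # omitted -> provider default `2`
        args["format_version"] = format_version
    return args

# Only explicitly-set optionals appear in the result:
print(build_newrelic_args("newrelic-logs", "NR_INSERT_KEY", region="EU"))
```

# Leaving an optional argument at `None` lets the Fastly API apply its own default,
# which is why the generated __init__ guards every optional with `is not None`.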
@pulumi.input_type
class Servicev1LoggingOpenstackArgs:
def __init__(__self__, *,
access_key: pulumi.Input[str],
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
url: pulumi.Input[str],
user: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
public_key: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] access_key: Your OpenStack account access key
:param pulumi.Input[str] bucket_name: The name of your OpenStack container
:param pulumi.Input[str] name: The unique name of the OpenStack logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] url: Your OpenStack auth URL
:param pulumi.Input[str] user: The username for your OpenStack account
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] format: Apache style log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[int] gzip_level: What level of Gzip encoding to have when dumping logs (default `0`, no compression)
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. [Fastly Documentation](https://developer.fastly.com/reference/api/logging/gcs/)
:param pulumi.Input[str] path: Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
:param pulumi.Input[int] period: How frequently the logs should be transferred, in seconds. Default `3600`
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
:param pulumi.Input[str] timestamp_format: The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "access_key", access_key)
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "url", url)
pulumi.set(__self__, "user", user)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="accessKey")
def access_key(self) -> pulumi.Input[str]:
"""
Your OpenStack account access key
"""
return pulumi.get(self, "access_key")
@access_key.setter
def access_key(self, value: pulumi.Input[str]):
pulumi.set(self, "access_key", value)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of your OpenStack container
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the OpenStack logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
Your OpenStack auth URL
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
The username for your OpenStack account
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache style log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
What level of Gzip encoding to have when dumping logs (default `0`, no compression)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. [Fastly Documentation](https://developer.fastly.com/reference/api/logging/gcs/)
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred, in seconds. Default `3600`
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. Can be `none` or `waf_debug`.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
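# The `@pulumi.getter(name="accessKey")` decorators above map each snake_case Python
# property to the provider's camelCase key. A rough stand-in for that translation
# (the conversion helper is hypothetical, not the actual pulumi implementation,
# which relies on the explicit `name=` declarations rather than string munging):

```python
# Hypothetical snake_case -> camelCase translation, mirroring what the
# @pulumi.getter(name=...) decorators above declare explicitly.
def to_camel(snake: str) -> str:
    head, *rest = snake.split("_")
    return head + "".join(part.title() for part in rest)

print(to_camel("access_key"))       # -> accessKey
print(to_camel("ssh_known_hosts"))  # -> sshKnownHosts
```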
@pulumi.input_type
class Servicev1LoggingScalyrArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
region: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: The unique name of the Scalyr logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The token to use for authentication (https://www.scalyr.com/keys)
:param pulumi.Input[str] format: Apache style log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[str] region: The region that log data will be sent to. One of `US` or `EU`. Defaults to `US` if undefined
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if region is not None:
pulumi.set(__self__, "region", region)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the Scalyr logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The token to use for authentication (https://www.scalyr.com/keys)
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache style log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The region that log data will be sent to. One of `US` or `EU`. Defaults to `US` if undefined
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
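# The Scalyr `region` docstring above documents exactly two values, `US` and `EU`,
# with `US` as the default when undefined. A hedged client-side check illustrating
# that rule (helper name hypothetical; the real provider validates server-side):

```python
# Hypothetical pre-flight check for the documented Scalyr `region` values.
VALID_SCALYR_REGIONS = {"US", "EU"}

def check_region(region=None):
    if region is None:
        return "US"               # documented default when undefined
    if region not in VALID_SCALYR_REGIONS:
        raise ValueError(f"region must be one of {sorted(VALID_SCALYR_REGIONS)}")
    return region

print(check_region())      # -> US
print(check_region("EU"))  # -> EU
```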
@pulumi.input_type
class Servicev1LoggingSftpArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
path: pulumi.Input[str],
ssh_known_hosts: pulumi.Input[str],
user: pulumi.Input[str],
compression_codec: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
password: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
public_key: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
secret_key: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] address: The SFTP address to stream logs to
:param pulumi.Input[str] name: The unique name of the SFTP logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] path: The path to upload log files to. If the path ends in `/` then it is treated as a directory
:param pulumi.Input[str] ssh_known_hosts: A list of host keys for all hosts we can connect to over SFTP
:param pulumi.Input[str] user: The username for the server
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
:param pulumi.Input[int] gzip_level: What level of Gzip encoding to have when dumping logs (default `0`, no compression)
:param pulumi.Input[str] message_type: How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
:param pulumi.Input[str] password: The password for the server. If both `password` and `secret_key` are passed, `secret_key` will be preferred
:param pulumi.Input[int] period: How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[int] port: The port the SFTP service listens on. (Default: `22`)
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] response_condition: The name of the condition to apply.
:param pulumi.Input[str] secret_key: The SSH private key for the server. If both `password` and `secret_key` are passed, `secret_key` will be preferred
:param pulumi.Input[str] timestamp_format: The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "path", path)
pulumi.set(__self__, "ssh_known_hosts", ssh_known_hosts)
pulumi.set(__self__, "user", user)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if password is not None:
pulumi.set(__self__, "password", password)
if period is not None:
pulumi.set(__self__, "period", period)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if port is not None:
pulumi.set(__self__, "port", port)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if secret_key is not None:
pulumi.set(__self__, "secret_key", secret_key)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
The SFTP address to stream logs to
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the SFTP logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def path(self) -> pulumi.Input[str]:
"""
The path to upload log files to. If the path ends in `/` then it is treated as a directory
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: pulumi.Input[str]):
pulumi.set(self, "path", value)
@property
@pulumi.getter(name="sshKnownHosts")
def ssh_known_hosts(self) -> pulumi.Input[str]:
"""
A list of host keys for all hosts we can connect to over SFTP
"""
return pulumi.get(self, "ssh_known_hosts")
@ssh_known_hosts.setter
def ssh_known_hosts(self, value: pulumi.Input[str]):
pulumi.set(self, "ssh_known_hosts", value)
@property
@pulumi.getter
def user(self) -> pulumi.Input[str]:
"""
The username for the server
"""
return pulumi.get(self, "user")
@user.setter
def user(self, value: pulumi.Input[str]):
pulumi.set(self, "user", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the codec is `gzip`, `gzip_level` will default to `3`. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either `1` or `2`. (default: `2`).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
What level of Gzip encoding to have when dumping logs (default `0`, no compression)
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted. One of: `classic` (default), `loggly`, `logplex` or `blank`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def password(self) -> Optional[pulumi.Input[str]]:
"""
The password for the server. If both `password` and `secret_key` are passed, `secret_key` will be preferred
"""
return pulumi.get(self, "password")
@password.setter
def password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "password", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently log files are finalized so they can be available for reading (in seconds, default `3600`)
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port the SFTP service listens on. (Default: `22`)
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of the condition to apply.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="secretKey")
def secret_key(self) -> Optional[pulumi.Input[str]]:
"""
The SSH private key for the server. If both `password` and `secret_key` are passed, `secret_key` will be preferred
"""
return pulumi.get(self, "secret_key")
@secret_key.setter
def secret_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secret_key", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
The `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
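# Per the `compression_codec` docstring above, specifying both `compression_codec`
# and `gzip_level` in the same request is an API error, and a `gzip` codec without
# an explicit level defaults to `3`. A hedged client-side guard sketching those
# rules (helper name hypothetical; the actual validation happens in the Fastly API):

```python
# Hypothetical pre-flight check mirroring the documented compression rules.
def check_compression(compression_codec=None, gzip_level=None):
    if compression_codec is not None and gzip_level is not None:
        raise ValueError("set either compression_codec or gzip_level, not both")
    if compression_codec == "gzip" and gzip_level is None:
        return ("gzip", 3)        # documented: gzip codec defaults to level 3
    return (compression_codec, gzip_level)

print(check_compression(compression_codec="gzip"))  # -> ('gzip', 3)
```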
@pulumi.input_type
class Servicev1PapertrailArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
port: pulumi.Input[int],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] address: The address of the Papertrail endpoint
:param pulumi.Input[str] name: A unique name to identify this Papertrail endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[int] port: The port associated with the address where the Papertrail endpoint can be accessed
:param pulumi.Input[str] format: A Fastly [log format string](https://docs.fastly.com/en/guides/custom-log-formats)
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. The logging call gets placed by default in `vcl_log` if `format_version` is set to `2` and in `vcl_deliver` if `format_version` is set to `1`
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed. If not set, endpoints with `format_version` of 2 are placed in `vcl_log` and those with `format_version` of 1 are placed in `vcl_deliver`
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "port", port)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
The address of the Papertrail endpoint
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Papertrail endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def port(self) -> pulumi.Input[int]:
"""
The port associated with the address where the Papertrail endpoint can be accessed
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: pulumi.Input[int]):
pulumi.set(self, "port", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
A Fastly [log format string](https://docs.fastly.com/en/guides/custom-log-formats)
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. The logging call gets placed by default in `vcl_log` if `format_version` is set to `2` and in `vcl_deliver` if `format_version` is set to `1`
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed. If not set, endpoints with `format_version` of 2 are placed in `vcl_log` and those with `format_version` of 1 are placed in `vcl_deliver`
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
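# The Papertrail `placement` docstring above documents the default: when unset,
# endpoints with `format_version` 2 are placed in `vcl_log` and those with
# `format_version` 1 in `vcl_deliver`. A small sketch of that mapping (helper
# name hypothetical; this logic lives in the Fastly service, not this SDK):

```python
# Hypothetical helper illustrating the documented default placement rule.
def default_placement(format_version: int = 2) -> str:
    return "vcl_log" if format_version == 2 else "vcl_deliver"

print(default_placement(2))  # -> vcl_log
print(default_placement(1))  # -> vcl_deliver
```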
@pulumi.input_type
class Servicev1RequestSettingArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
action: Optional[pulumi.Input[str]] = None,
bypass_busy_wait: Optional[pulumi.Input[bool]] = None,
default_host: Optional[pulumi.Input[str]] = None,
force_miss: Optional[pulumi.Input[bool]] = None,
force_ssl: Optional[pulumi.Input[bool]] = None,
geo_headers: Optional[pulumi.Input[bool]] = None,
hash_keys: Optional[pulumi.Input[str]] = None,
max_stale_age: Optional[pulumi.Input[int]] = None,
request_condition: Optional[pulumi.Input[str]] = None,
timer_support: Optional[pulumi.Input[bool]] = None,
xff: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: Unique name to refer to this Request Setting. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] action: Allows you to terminate request handling and immediately perform an action. When set, it can be `lookup` or `pass` (ignore the cache completely)
:param pulumi.Input[bool] bypass_busy_wait: Disable collapsed forwarding, so you don't wait for other objects to origin
:param pulumi.Input[str] default_host: Sets the host header
:param pulumi.Input[bool] force_miss: Force a cache miss for the request. If specified, can be `true` or `false`
:param pulumi.Input[bool] force_ssl: Forces the request to use SSL (Redirects a non-SSL request to SSL)
:param pulumi.Input[bool] geo_headers: Injects Fastly-Geo-Country, Fastly-Geo-City, and Fastly-Geo-Region into the request headers
:param pulumi.Input[str] hash_keys: Comma-separated list of Varnish request object fields that should be in the hash key
:param pulumi.Input[int] max_stale_age: How old an object is allowed to be to serve `stale-if-error` or `stale-while-revalidate`, in seconds
:param pulumi.Input[str] request_condition: Name of already defined `condition` to determine if this request setting should be applied
:param pulumi.Input[bool] timer_support: Injects the X-Timer info into the request for viewing origin fetch durations
:param pulumi.Input[str] xff: X-Forwarded-For, should be `clear`, `leave`, `append`, `append_all`, or `overwrite`. Default `append`
"""
pulumi.set(__self__, "name", name)
if action is not None:
pulumi.set(__self__, "action", action)
if bypass_busy_wait is not None:
pulumi.set(__self__, "bypass_busy_wait", bypass_busy_wait)
if default_host is not None:
pulumi.set(__self__, "default_host", default_host)
if force_miss is not None:
pulumi.set(__self__, "force_miss", force_miss)
if force_ssl is not None:
pulumi.set(__self__, "force_ssl", force_ssl)
if geo_headers is not None:
warnings.warn("""'geo_headers' attribute has been deprecated and will be removed in the next major version release""", DeprecationWarning)
pulumi.log.warn("""geo_headers is deprecated: 'geo_headers' attribute has been deprecated and will be removed in the next major version release""")
if geo_headers is not None:
pulumi.set(__self__, "geo_headers", geo_headers)
if hash_keys is not None:
pulumi.set(__self__, "hash_keys", hash_keys)
if max_stale_age is not None:
pulumi.set(__self__, "max_stale_age", max_stale_age)
if request_condition is not None:
pulumi.set(__self__, "request_condition", request_condition)
if timer_support is not None:
pulumi.set(__self__, "timer_support", timer_support)
if xff is not None:
pulumi.set(__self__, "xff", xff)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
Unique name to refer to this Request Setting. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def action(self) -> Optional[pulumi.Input[str]]:
"""
Allows you to terminate request handling and immediately perform an action. When set it can be `lookup` or `pass` (Ignore the cache completely)
"""
return pulumi.get(self, "action")
@action.setter
def action(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "action", value)
@property
@pulumi.getter(name="bypassBusyWait")
def bypass_busy_wait(self) -> Optional[pulumi.Input[bool]]:
"""
Disable collapsed forwarding, so you don't wait for other objects to origin
"""
return pulumi.get(self, "bypass_busy_wait")
@bypass_busy_wait.setter
def bypass_busy_wait(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "bypass_busy_wait", value)
@property
@pulumi.getter(name="defaultHost")
def default_host(self) -> Optional[pulumi.Input[str]]:
"""
Sets the host header
"""
return pulumi.get(self, "default_host")
@default_host.setter
def default_host(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "default_host", value)
@property
@pulumi.getter(name="forceMiss")
def force_miss(self) -> Optional[pulumi.Input[bool]]:
"""
Force a cache miss for the request. If specified, can be `true` or `false`
"""
return pulumi.get(self, "force_miss")
@force_miss.setter
def force_miss(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_miss", value)
@property
@pulumi.getter(name="forceSsl")
def force_ssl(self) -> Optional[pulumi.Input[bool]]:
"""
Forces the request to use SSL (Redirects a non-SSL request to SSL)
"""
return pulumi.get(self, "force_ssl")
@force_ssl.setter
def force_ssl(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_ssl", value)
@property
@pulumi.getter(name="geoHeaders")
def geo_headers(self) -> Optional[pulumi.Input[bool]]:
"""
Injects Fastly-Geo-Country, Fastly-Geo-City, and Fastly-Geo-Region into the request headers
"""
return pulumi.get(self, "geo_headers")
@geo_headers.setter
def geo_headers(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "geo_headers", value)
@property
@pulumi.getter(name="hashKeys")
def hash_keys(self) -> Optional[pulumi.Input[str]]:
"""
Comma-separated list of Varnish request object fields that should be in the hash key
"""
return pulumi.get(self, "hash_keys")
@hash_keys.setter
def hash_keys(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "hash_keys", value)
@property
@pulumi.getter(name="maxStaleAge")
def max_stale_age(self) -> Optional[pulumi.Input[int]]:
"""
How old an object is allowed to be to serve `stale-if-error` or `stale-while-revalidate`, in seconds
"""
return pulumi.get(self, "max_stale_age")
@max_stale_age.setter
def max_stale_age(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "max_stale_age", value)
@property
@pulumi.getter(name="requestCondition")
def request_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of already defined `condition` to determine if this request setting should be applied
"""
return pulumi.get(self, "request_condition")
@request_condition.setter
def request_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "request_condition", value)
@property
@pulumi.getter(name="timerSupport")
def timer_support(self) -> Optional[pulumi.Input[bool]]:
"""
Injects the X-Timer info into the request for viewing origin fetch durations
"""
return pulumi.get(self, "timer_support")
@timer_support.setter
def timer_support(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "timer_support", value)
@property
@pulumi.getter
def xff(self) -> Optional[pulumi.Input[str]]:
"""
X-Forwarded-For, should be `clear`, `leave`, `append`, `append_all`, or `overwrite`. Default `append`
"""
return pulumi.get(self, "xff")
@xff.setter
def xff(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "xff", value)
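The `@pulumi.getter(name="...")` overrides above encode the mapping between the provider's camelCase wire names and Python's snake_case attribute names. A minimal stdlib-only sketch of that naming convention follows; the helper `snake_to_camel` is illustrative and not part of the Pulumi SDK:

```python
def snake_to_camel(name: str) -> str:
    """Convert a snake_case attribute name to its camelCase wire name,
    mirroring overrides like @pulumi.getter(name="bypassBusyWait")."""
    first, *rest = name.split("_")
    return first + "".join(part.title() for part in rest)

# The request-setting attributes above map as follows:
assert snake_to_camel("bypass_busy_wait") == "bypassBusyWait"
assert snake_to_camel("max_stale_age") == "maxStaleAge"
assert snake_to_camel("name") == "name"  # single-word names are unchanged
```

Attributes whose snake_case and camelCase forms coincide (like `name`, `action`, `xff`) use the bare `@pulumi.getter` decorator with no `name=` override.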
@pulumi.input_type
class Servicev1ResponseObjectArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
cache_condition: Optional[pulumi.Input[str]] = None,
content: Optional[pulumi.Input[str]] = None,
content_type: Optional[pulumi.Input[str]] = None,
request_condition: Optional[pulumi.Input[str]] = None,
response: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] name: A unique name to identify this Response Object. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] cache_condition: Name of already defined `condition` to check after we have retrieved an object. If the condition passes then deliver this Request Object instead. This `condition` must be of type `CACHE`. For detailed information about Conditionals, see [Fastly's Documentation on Conditionals](https://docs.fastly.com/en/guides/using-conditions)
:param pulumi.Input[str] content: The content to deliver for the response object
:param pulumi.Input[str] content_type: The MIME type of the content
:param pulumi.Input[str] request_condition: Name of already defined `condition` to be checked during the request phase. If the condition passes then this object will be delivered. This `condition` must be of type `REQUEST`
:param pulumi.Input[str] response: The HTTP Response. Default `OK`
:param pulumi.Input[int] status: The HTTP Status Code. Default `200`
"""
pulumi.set(__self__, "name", name)
if cache_condition is not None:
pulumi.set(__self__, "cache_condition", cache_condition)
if content is not None:
pulumi.set(__self__, "content", content)
if content_type is not None:
pulumi.set(__self__, "content_type", content_type)
if request_condition is not None:
pulumi.set(__self__, "request_condition", request_condition)
if response is not None:
pulumi.set(__self__, "response", response)
if status is not None:
pulumi.set(__self__, "status", status)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Response Object. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="cacheCondition")
def cache_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of already defined `condition` to check after we have retrieved an object. If the condition passes then deliver this Request Object instead. This `condition` must be of type `CACHE`. For detailed information about Conditionals, see [Fastly's Documentation on Conditionals](https://docs.fastly.com/en/guides/using-conditions)
"""
return pulumi.get(self, "cache_condition")
@cache_condition.setter
def cache_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cache_condition", value)
@property
@pulumi.getter
def content(self) -> Optional[pulumi.Input[str]]:
"""
The content to deliver for the response object
"""
return pulumi.get(self, "content")
@content.setter
def content(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content", value)
@property
@pulumi.getter(name="contentType")
def content_type(self) -> Optional[pulumi.Input[str]]:
"""
The MIME type of the content
"""
return pulumi.get(self, "content_type")
@content_type.setter
def content_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "content_type", value)
@property
@pulumi.getter(name="requestCondition")
def request_condition(self) -> Optional[pulumi.Input[str]]:
"""
Name of already defined `condition` to be checked during the request phase. If the condition passes then this object will be delivered. This `condition` must be of type `REQUEST`
"""
return pulumi.get(self, "request_condition")
@request_condition.setter
def request_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "request_condition", value)
@property
@pulumi.getter
def response(self) -> Optional[pulumi.Input[str]]:
"""
The HTTP Response. Default `OK`
"""
return pulumi.get(self, "response")
@response.setter
def response(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[int]]:
"""
The HTTP Status Code. Default `200`
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "status", value)
@pulumi.input_type
class Servicev1S3loggingArgs:
def __init__(__self__, *,
bucket_name: pulumi.Input[str],
name: pulumi.Input[str],
acl: Optional[pulumi.Input[str]] = None,
compression_codec: Optional[pulumi.Input[str]] = None,
domain: Optional[pulumi.Input[str]] = None,
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
gzip_level: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
path: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
public_key: Optional[pulumi.Input[str]] = None,
redundancy: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
s3_access_key: Optional[pulumi.Input[str]] = None,
s3_iam_role: Optional[pulumi.Input[str]] = None,
s3_secret_key: Optional[pulumi.Input[str]] = None,
server_side_encryption: Optional[pulumi.Input[str]] = None,
server_side_encryption_kms_key_id: Optional[pulumi.Input[str]] = None,
timestamp_format: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] bucket_name: The name of the bucket in which to store the logs
:param pulumi.Input[str] name: The unique name of the S3 logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] acl: The AWS [Canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl) to use for objects uploaded to the S3 bucket. Options are: `private`, `public-read`, `public-read-write`, `aws-exec-read`, `authenticated-read`, `bucket-owner-read`, `bucket-owner-full-control`
:param pulumi.Input[str] compression_codec: The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
:param pulumi.Input[str] domain: If you created the S3 bucket outside of `us-east-1`, then specify the corresponding bucket endpoint. Example: `s3-us-west-2.amazonaws.com`
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting.
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (Default: 1).
:param pulumi.Input[int] gzip_level: Level of Gzip compression, from `0-9`. `0` is no compression. `1` is fastest and least compressed, `9` is slowest and most compressed. Default `0`
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`
:param pulumi.Input[str] path: Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
:param pulumi.Input[int] period: How frequently the logs should be transferred, in seconds. Default `3600`
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[str] public_key: A PGP public key that Fastly will use to encrypt your log files before writing them to disk
:param pulumi.Input[str] redundancy: The S3 storage class (redundancy level). Should be one of: `standard`, `reduced_redundancy`, `standard_ia`, or `onezone_ia`
:param pulumi.Input[str] response_condition: The name of an existing condition in the configured endpoint, or leave blank to always execute
:param pulumi.Input[str] s3_access_key: AWS Access Key of an account with the required permissions to post logs. It is **strongly** recommended you create a separate IAM user with permissions to only operate on this bucket. This key will not be encrypted. Not required if `iam_role` is provided. You can provide this key via an environment variable, `FASTLY_S3_ACCESS_KEY`
:param pulumi.Input[str] s3_iam_role: The Amazon Resource Name (ARN) for the IAM role granting Fastly access to S3. Not required if `access_key` and `secret_key` are provided. You can provide this value via an environment variable, `FASTLY_S3_IAM_ROLE`
:param pulumi.Input[str] s3_secret_key: AWS Secret Key of an account with the required permissions to post logs. It is **strongly** recommended you create a separate IAM user with permissions to only operate on this bucket. This secret will not be encrypted. Not required if `iam_role` is provided. You can provide this secret via an environment variable, `FASTLY_S3_SECRET_KEY`
:param pulumi.Input[str] server_side_encryption: Specify what type of server side encryption should be used. Can be either `AES256` or `aws:kms`
:param pulumi.Input[str] server_side_encryption_kms_key_id: Optional server-side KMS Key ID. Must be set if `server_side_encryption` is set to `aws:kms`
:param pulumi.Input[str] timestamp_format: `strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
pulumi.set(__self__, "bucket_name", bucket_name)
pulumi.set(__self__, "name", name)
if acl is not None:
pulumi.set(__self__, "acl", acl)
if compression_codec is not None:
pulumi.set(__self__, "compression_codec", compression_codec)
if domain is not None:
pulumi.set(__self__, "domain", domain)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if gzip_level is not None:
pulumi.set(__self__, "gzip_level", gzip_level)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if path is not None:
pulumi.set(__self__, "path", path)
if period is not None:
pulumi.set(__self__, "period", period)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if public_key is not None:
pulumi.set(__self__, "public_key", public_key)
if redundancy is not None:
pulumi.set(__self__, "redundancy", redundancy)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if s3_access_key is not None:
pulumi.set(__self__, "s3_access_key", s3_access_key)
if s3_iam_role is not None:
pulumi.set(__self__, "s3_iam_role", s3_iam_role)
if s3_secret_key is not None:
pulumi.set(__self__, "s3_secret_key", s3_secret_key)
if server_side_encryption is not None:
pulumi.set(__self__, "server_side_encryption", server_side_encryption)
if server_side_encryption_kms_key_id is not None:
pulumi.set(__self__, "server_side_encryption_kms_key_id", server_side_encryption_kms_key_id)
if timestamp_format is not None:
pulumi.set(__self__, "timestamp_format", timestamp_format)
@property
@pulumi.getter(name="bucketName")
def bucket_name(self) -> pulumi.Input[str]:
"""
The name of the bucket in which to store the logs
"""
return pulumi.get(self, "bucket_name")
@bucket_name.setter
def bucket_name(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket_name", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
The unique name of the S3 logging endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def acl(self) -> Optional[pulumi.Input[str]]:
"""
The AWS [Canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl) to use for objects uploaded to the S3 bucket. Options are: `private`, `public-read`, `public-read-write`, `aws-exec-read`, `authenticated-read`, `bucket-owner-read`, `bucket-owner-full-control`
"""
return pulumi.get(self, "acl")
@acl.setter
def acl(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "acl", value)
@property
@pulumi.getter(name="compressionCodec")
def compression_codec(self) -> Optional[pulumi.Input[str]]:
"""
The codec used for compression of your logs. Valid values are `zstd`, `snappy`, and `gzip`. If the specified codec is `gzip`, `gzip_level` will default to 3. To specify a different level, leave `compression_codec` blank and explicitly set the level using `gzip_level`. Specifying both `compression_codec` and `gzip_level` in the same API request will result in an error.
"""
return pulumi.get(self, "compression_codec")
@compression_codec.setter
def compression_codec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compression_codec", value)
@property
@pulumi.getter
def domain(self) -> Optional[pulumi.Input[str]]:
"""
If you created the S3 bucket outside of `us-east-1`, then specify the corresponding bucket endpoint. Example: `s3-us-west-2.amazonaws.com`
"""
return pulumi.get(self, "domain")
@domain.setter
def domain(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting.
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (Default: 1).
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="gzipLevel")
def gzip_level(self) -> Optional[pulumi.Input[int]]:
"""
Level of Gzip compression, from `0-9`. `0` is no compression. `1` is fastest and least compressed, `9` is slowest and most compressed. Default `0`
"""
return pulumi.get(self, "gzip_level")
@gzip_level.setter
def gzip_level(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "gzip_level", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def path(self) -> Optional[pulumi.Input[str]]:
"""
Path to store the files. Must end with a trailing slash. If this field is left empty, the files will be saved in the bucket's root path
"""
return pulumi.get(self, "path")
@path.setter
def path(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "path", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
How frequently the logs should be transferred, in seconds. Default `3600`
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="publicKey")
def public_key(self) -> Optional[pulumi.Input[str]]:
"""
A PGP public key that Fastly will use to encrypt your log files before writing them to disk
"""
return pulumi.get(self, "public_key")
@public_key.setter
def public_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "public_key", value)
@property
@pulumi.getter
def redundancy(self) -> Optional[pulumi.Input[str]]:
"""
The S3 storage class (redundancy level). Should be one of: `standard`, `reduced_redundancy`, `standard_ia`, or `onezone_ia`
"""
return pulumi.get(self, "redundancy")
@redundancy.setter
def redundancy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "redundancy", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of an existing condition in the configured endpoint, or leave blank to always execute
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="s3AccessKey")
def s3_access_key(self) -> Optional[pulumi.Input[str]]:
"""
AWS Access Key of an account with the required permissions to post logs. It is **strongly** recommended you create a separate IAM user with permissions to only operate on this bucket. This key will not be encrypted. Not required if `iam_role` is provided. You can provide this key via an environment variable, `FASTLY_S3_ACCESS_KEY`
"""
return pulumi.get(self, "s3_access_key")
@s3_access_key.setter
def s3_access_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "s3_access_key", value)
@property
@pulumi.getter(name="s3IamRole")
def s3_iam_role(self) -> Optional[pulumi.Input[str]]:
"""
The Amazon Resource Name (ARN) for the IAM role granting Fastly access to S3. Not required if `access_key` and `secret_key` are provided. You can provide this value via an environment variable, `FASTLY_S3_IAM_ROLE`
"""
return pulumi.get(self, "s3_iam_role")
@s3_iam_role.setter
def s3_iam_role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "s3_iam_role", value)
@property
@pulumi.getter(name="s3SecretKey")
def s3_secret_key(self) -> Optional[pulumi.Input[str]]:
"""
AWS Secret Key of an account with the required permissions to post logs. It is **strongly** recommended you create a separate IAM user with permissions to only operate on this bucket. This secret will not be encrypted. Not required if `iam_role` is provided. You can provide this secret via an environment variable, `FASTLY_S3_SECRET_KEY`
"""
return pulumi.get(self, "s3_secret_key")
@s3_secret_key.setter
def s3_secret_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "s3_secret_key", value)
@property
@pulumi.getter(name="serverSideEncryption")
def server_side_encryption(self) -> Optional[pulumi.Input[str]]:
"""
Specify what type of server side encryption should be used. Can be either `AES256` or `aws:kms`
"""
return pulumi.get(self, "server_side_encryption")
@server_side_encryption.setter
def server_side_encryption(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "server_side_encryption", value)
@property
@pulumi.getter(name="serverSideEncryptionKmsKeyId")
def server_side_encryption_kms_key_id(self) -> Optional[pulumi.Input[str]]:
"""
Optional server-side KMS Key ID. Must be set if `server_side_encryption` is set to `aws:kms`
"""
return pulumi.get(self, "server_side_encryption_kms_key_id")
@server_side_encryption_kms_key_id.setter
def server_side_encryption_kms_key_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "server_side_encryption_kms_key_id", value)
@property
@pulumi.getter(name="timestampFormat")
def timestamp_format(self) -> Optional[pulumi.Input[str]]:
"""
`strftime` specified timestamp formatting (default `%Y-%m-%dT%H:%M:%S.000`)
"""
return pulumi.get(self, "timestamp_format")
@timestamp_format.setter
def timestamp_format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timestamp_format", value)
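As the `compression_codec` docstring above notes, the Fastly API rejects a request that sets both `compression_codec` and `gzip_level`. A hedged, stdlib-only sketch of that pre-flight check follows; the function `check_s3_compression` is illustrative and not part of this SDK:

```python
from typing import Optional

def check_s3_compression(compression_codec: Optional[str],
                         gzip_level: Optional[int]) -> None:
    """Reject the combination the Fastly API would refuse: a compression
    codec plus an explicit gzip level in the same S3 logging config."""
    if compression_codec is not None and gzip_level is not None:
        raise ValueError(
            "specify either compression_codec or gzip_level, not both")
    if compression_codec is not None and compression_codec not in (
            "zstd", "snappy", "gzip"):
        raise ValueError("compression_codec must be zstd, snappy, or gzip")

check_s3_compression("gzip", None)  # valid: codec only, level defaults to 3
check_s3_compression(None, 9)       # valid: explicit gzip level only
```

Running such a check before the apply surfaces the conflict locally instead of as an API error during deployment.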
@pulumi.input_type
class Servicev1SnippetArgs:
def __init__(__self__, *,
content: pulumi.Input[str],
name: pulumi.Input[str],
type: pulumi.Input[str],
priority: Optional[pulumi.Input[int]] = None):
"""
:param pulumi.Input[str] content: The VCL code that specifies exactly what the snippet does
:param pulumi.Input[str] name: A name that is unique across "regular" and "dynamic" VCL Snippet configuration blocks. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] type: The location in generated VCL where the snippet should be placed (can be one of `init`, `recv`, `hit`, `miss`, `pass`, `fetch`, `error`, `deliver`, `log` or `none`)
:param pulumi.Input[int] priority: Priority determines the ordering for multiple snippets. Lower numbers execute first. Defaults to `100`
"""
pulumi.set(__self__, "content", content)
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "type", type)
if priority is not None:
pulumi.set(__self__, "priority", priority)
@property
@pulumi.getter
def content(self) -> pulumi.Input[str]:
"""
The VCL code that specifies exactly what the snippet does
"""
return pulumi.get(self, "content")
@content.setter
def content(self, value: pulumi.Input[str]):
pulumi.set(self, "content", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A name that is unique across "regular" and "dynamic" VCL Snippet configuration blocks. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[str]:
"""
The location in generated VCL where the snippet should be placed (can be one of `init`, `recv`, `hit`, `miss`, `pass`, `fetch`, `error`, `deliver`, `log` or `none`)
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[str]):
pulumi.set(self, "type", value)
@property
@pulumi.getter
def priority(self) -> Optional[pulumi.Input[int]]:
"""
Priority determines the ordering for multiple snippets. Lower numbers execute first. Defaults to `100`
"""
return pulumi.get(self, "priority")
@priority.setter
def priority(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "priority", value)
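The `type` attribute of `Servicev1SnippetArgs` accepts only the fixed set of VCL locations listed in its docstring. A small stdlib-only validator sketch follows; `validate_snippet_type` is illustrative, not part of the SDK:

```python
# The VCL placement values documented on Servicev1SnippetArgs.type.
VCL_SNIPPET_TYPES = frozenset({
    "init", "recv", "hit", "miss", "pass",
    "fetch", "error", "deliver", "log", "none",
})

def validate_snippet_type(snippet_type: str) -> str:
    """Return the value unchanged if it is a valid VCL snippet location;
    otherwise raise before the provider sends it to the API."""
    if snippet_type not in VCL_SNIPPET_TYPES:
        raise ValueError(f"invalid snippet type: {snippet_type!r}")
    return snippet_type

assert validate_snippet_type("recv") == "recv"
```

Note the values are bare phase names (`recv`, `deliver`), not the `vcl_`-prefixed subroutine names used in raw VCL.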
@pulumi.input_type
class Servicev1SplunkArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
token: pulumi.Input[str],
url: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: A unique name to identify the Splunk endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] token: The Splunk token to be used for authentication
:param pulumi.Input[str] url: The Splunk URL to stream logs to
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting (default: `%h %l %u %t "%r" %>s %b`)
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (default: 2)
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed
:param pulumi.Input[str] response_condition: The name of the condition to apply
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SPLUNK_CA_CERT`
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format.
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format.
:param pulumi.Input[str] tls_hostname: The hostname used to verify the server's certificate. It can either be the Common Name or a Subject Alternative Name (SAN)
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "token", token)
pulumi.set(__self__, "url", url)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify the Splunk endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def token(self) -> pulumi.Input[str]:
"""
The Splunk token to be used for authentication
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: pulumi.Input[str]):
pulumi.set(self, "token", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The Splunk URL to stream logs to
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting (default: `%h %l %u %t "%r" %>s %b`)
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (default: 2)
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
The name of the condition to apply
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SPLUNK_CA_CERT`
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format.
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format.
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
The hostname used to verify the server's certificate. It can either be the Common Name or a Subject Alternative Name (SAN)
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@pulumi.input_type
class Servicev1SumologicArgs:
def __init__(__self__, *,
name: pulumi.Input[str],
url: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
placement: Optional[pulumi.Input[str]] = None,
response_condition: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] name: A unique name to identify this Sumologic endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] url: The URL to Sumologic collector endpoint
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting
:param pulumi.Input[int] format_version: The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (Default: 1)
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. See [Fastly's Documentation on Sumologic](https://developer.fastly.com/reference/api/logging/sumologic/)
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
        :param pulumi.Input[str] response_condition: The name of the condition to apply.
"""
pulumi.set(__self__, "name", name)
pulumi.set(__self__, "url", url)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Sumologic endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def url(self) -> pulumi.Input[str]:
"""
The URL to Sumologic collector endpoint
"""
return pulumi.get(self, "url")
@url.setter
def url(self, value: pulumi.Input[str]):
pulumi.set(self, "url", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format used for the configured endpoint. Can be either 1 or 2. (Default: 1)
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`. See [Fastly's Documentation on Sumologic](https://developer.fastly.com/reference/api/logging/sumologic/)
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
        The name of the condition to apply.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@pulumi.input_type
class Servicev1SyslogArgs:
def __init__(__self__, *,
address: pulumi.Input[str],
name: pulumi.Input[str],
format: Optional[pulumi.Input[str]] = None,
format_version: Optional[pulumi.Input[int]] = None,
message_type: Optional[pulumi.Input[str]] = None,
placement: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
response_condition: Optional[pulumi.Input[str]] = None,
tls_ca_cert: Optional[pulumi.Input[str]] = None,
tls_client_cert: Optional[pulumi.Input[str]] = None,
tls_client_key: Optional[pulumi.Input[str]] = None,
tls_hostname: Optional[pulumi.Input[str]] = None,
token: Optional[pulumi.Input[str]] = None,
use_tls: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] address: A hostname or IPv4 address of the Syslog endpoint
:param pulumi.Input[str] name: A unique name to identify this Syslog endpoint. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[str] format: Apache-style string or VCL variables to use for log formatting
:param pulumi.Input[int] format_version: The version of the custom logging format. Can be either 1 or 2. (Default: 1)
:param pulumi.Input[str] message_type: How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`
:param pulumi.Input[str] placement: Where in the generated VCL the logging call should be placed.
:param pulumi.Input[int] port: The port associated with the address where the Syslog endpoint can be accessed. Default `514`
        :param pulumi.Input[str] response_condition: The name of the condition to apply.
:param pulumi.Input[str] tls_ca_cert: A secure certificate to authenticate the server with. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SYSLOG_CA_CERT`
:param pulumi.Input[str] tls_client_cert: The client certificate used to make authenticated requests. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SYSLOG_CLIENT_CERT`
:param pulumi.Input[str] tls_client_key: The client private key used to make authenticated requests. Must be in PEM format. You can provide this key via an environment variable, `FASTLY_SYSLOG_CLIENT_KEY`
:param pulumi.Input[str] tls_hostname: Used during the TLS handshake to validate the certificate
:param pulumi.Input[str] token: Whether to prepend each message with a specific token
:param pulumi.Input[bool] use_tls: Whether to use TLS for secure logging. Default `false`
"""
pulumi.set(__self__, "address", address)
pulumi.set(__self__, "name", name)
if format is not None:
pulumi.set(__self__, "format", format)
if format_version is not None:
pulumi.set(__self__, "format_version", format_version)
if message_type is not None:
pulumi.set(__self__, "message_type", message_type)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if port is not None:
pulumi.set(__self__, "port", port)
if response_condition is not None:
pulumi.set(__self__, "response_condition", response_condition)
if tls_ca_cert is not None:
pulumi.set(__self__, "tls_ca_cert", tls_ca_cert)
if tls_client_cert is not None:
pulumi.set(__self__, "tls_client_cert", tls_client_cert)
if tls_client_key is not None:
pulumi.set(__self__, "tls_client_key", tls_client_key)
if tls_hostname is not None:
pulumi.set(__self__, "tls_hostname", tls_hostname)
if token is not None:
pulumi.set(__self__, "token", token)
if use_tls is not None:
pulumi.set(__self__, "use_tls", use_tls)
@property
@pulumi.getter
def address(self) -> pulumi.Input[str]:
"""
A hostname or IPv4 address of the Syslog endpoint
"""
return pulumi.get(self, "address")
@address.setter
def address(self, value: pulumi.Input[str]):
pulumi.set(self, "address", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name to identify this Syslog endpoint. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def format(self) -> Optional[pulumi.Input[str]]:
"""
Apache-style string or VCL variables to use for log formatting
"""
return pulumi.get(self, "format")
@format.setter
def format(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "format", value)
@property
@pulumi.getter(name="formatVersion")
def format_version(self) -> Optional[pulumi.Input[int]]:
"""
The version of the custom logging format. Can be either 1 or 2. (Default: 1)
"""
return pulumi.get(self, "format_version")
@format_version.setter
def format_version(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "format_version", value)
@property
@pulumi.getter(name="messageType")
def message_type(self) -> Optional[pulumi.Input[str]]:
"""
How the message should be formatted; one of: `classic`, `loggly`, `logplex` or `blank`. Default `classic`
"""
return pulumi.get(self, "message_type")
@message_type.setter
def message_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "message_type", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input[str]]:
"""
Where in the generated VCL the logging call should be placed.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
The port associated with the address where the Syslog endpoint can be accessed. Default `514`
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="responseCondition")
def response_condition(self) -> Optional[pulumi.Input[str]]:
"""
        The name of the condition to apply.
"""
return pulumi.get(self, "response_condition")
@response_condition.setter
def response_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "response_condition", value)
@property
@pulumi.getter(name="tlsCaCert")
def tls_ca_cert(self) -> Optional[pulumi.Input[str]]:
"""
A secure certificate to authenticate the server with. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SYSLOG_CA_CERT`
"""
return pulumi.get(self, "tls_ca_cert")
@tls_ca_cert.setter
def tls_ca_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_ca_cert", value)
@property
@pulumi.getter(name="tlsClientCert")
def tls_client_cert(self) -> Optional[pulumi.Input[str]]:
"""
The client certificate used to make authenticated requests. Must be in PEM format. You can provide this certificate via an environment variable, `FASTLY_SYSLOG_CLIENT_CERT`
"""
return pulumi.get(self, "tls_client_cert")
@tls_client_cert.setter
def tls_client_cert(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_cert", value)
@property
@pulumi.getter(name="tlsClientKey")
def tls_client_key(self) -> Optional[pulumi.Input[str]]:
"""
The client private key used to make authenticated requests. Must be in PEM format. You can provide this key via an environment variable, `FASTLY_SYSLOG_CLIENT_KEY`
"""
return pulumi.get(self, "tls_client_key")
@tls_client_key.setter
def tls_client_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_client_key", value)
@property
@pulumi.getter(name="tlsHostname")
def tls_hostname(self) -> Optional[pulumi.Input[str]]:
"""
Used during the TLS handshake to validate the certificate
"""
return pulumi.get(self, "tls_hostname")
@tls_hostname.setter
def tls_hostname(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tls_hostname", value)
@property
@pulumi.getter
def token(self) -> Optional[pulumi.Input[str]]:
"""
Whether to prepend each message with a specific token
"""
return pulumi.get(self, "token")
@token.setter
def token(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "token", value)
@property
@pulumi.getter(name="useTls")
def use_tls(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to use TLS for secure logging. Default `false`
"""
return pulumi.get(self, "use_tls")
@use_tls.setter
def use_tls(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "use_tls", value)
@pulumi.input_type
class Servicev1VclArgs:
def __init__(__self__, *,
content: pulumi.Input[str],
name: pulumi.Input[str],
main: Optional[pulumi.Input[bool]] = None):
"""
:param pulumi.Input[str] content: The custom VCL code to upload
:param pulumi.Input[str] name: A unique name for this configuration block. It is important to note that changing this attribute will delete and recreate the resource
:param pulumi.Input[bool] main: If `true`, use this block as the main configuration. If `false`, use this block as an includable library. Only a single VCL block can be marked as the main block. Default is `false`
"""
pulumi.set(__self__, "content", content)
pulumi.set(__self__, "name", name)
if main is not None:
pulumi.set(__self__, "main", main)
@property
@pulumi.getter
def content(self) -> pulumi.Input[str]:
"""
The custom VCL code to upload
"""
return pulumi.get(self, "content")
@content.setter
def content(self, value: pulumi.Input[str]):
pulumi.set(self, "content", value)
@property
@pulumi.getter
def name(self) -> pulumi.Input[str]:
"""
A unique name for this configuration block. It is important to note that changing this attribute will delete and recreate the resource
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: pulumi.Input[str]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def main(self) -> Optional[pulumi.Input[bool]]:
"""
If `true`, use this block as the main configuration. If `false`, use this block as an includable library. Only a single VCL block can be marked as the main block. Default is `false`
"""
return pulumi.get(self, "main")
@main.setter
def main(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "main", value)
@pulumi.input_type
class Servicev1WafArgs:
def __init__(__self__, *,
response_object: pulumi.Input[str],
disabled: Optional[pulumi.Input[bool]] = None,
prefetch_condition: Optional[pulumi.Input[str]] = None,
waf_id: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] response_object: The name of the response object used by the Web Application Firewall
:param pulumi.Input[bool] disabled: A flag used to completely disable a Web Application Firewall. This is intended to only be used in an emergency
:param pulumi.Input[str] prefetch_condition: The `condition` to determine which requests will be run past your Fastly WAF. This `condition` must be of type `PREFETCH`. For detailed information about Conditionals, see [Fastly's Documentation on Conditionals](https://docs.fastly.com/en/guides/using-conditions)
:param pulumi.Input[str] waf_id: The ID of the WAF
"""
pulumi.set(__self__, "response_object", response_object)
if disabled is not None:
pulumi.set(__self__, "disabled", disabled)
if prefetch_condition is not None:
pulumi.set(__self__, "prefetch_condition", prefetch_condition)
if waf_id is not None:
pulumi.set(__self__, "waf_id", waf_id)
@property
@pulumi.getter(name="responseObject")
def response_object(self) -> pulumi.Input[str]:
"""
The name of the response object used by the Web Application Firewall
"""
return pulumi.get(self, "response_object")
@response_object.setter
def response_object(self, value: pulumi.Input[str]):
pulumi.set(self, "response_object", value)
@property
@pulumi.getter
def disabled(self) -> Optional[pulumi.Input[bool]]:
"""
A flag used to completely disable a Web Application Firewall. This is intended to only be used in an emergency
"""
return pulumi.get(self, "disabled")
@disabled.setter
def disabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "disabled", value)
@property
@pulumi.getter(name="prefetchCondition")
def prefetch_condition(self) -> Optional[pulumi.Input[str]]:
"""
The `condition` to determine which requests will be run past your Fastly WAF. This `condition` must be of type `PREFETCH`. For detailed information about Conditionals, see [Fastly's Documentation on Conditionals](https://docs.fastly.com/en/guides/using-conditions)
"""
return pulumi.get(self, "prefetch_condition")
@prefetch_condition.setter
def prefetch_condition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "prefetch_condition", value)
@property
@pulumi.getter(name="wafId")
def waf_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the WAF
"""
return pulumi.get(self, "waf_id")
@waf_id.setter
def waf_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "waf_id", value)
@pulumi.input_type
class TlsSubscriptionManagedDnsChallengeArgs:
def __init__(__self__, *,
record_name: Optional[pulumi.Input[str]] = None,
record_type: Optional[pulumi.Input[str]] = None,
record_value: Optional[pulumi.Input[str]] = None):
"""
:param pulumi.Input[str] record_name: The name of the DNS record to add. For example `example.com`. Best accessed through a `for` expression to filter the relevant record.
:param pulumi.Input[str] record_type: The type of DNS record to add, e.g. `A`, or `CNAME`.
:param pulumi.Input[str] record_value: The value to which the DNS record should point, e.g. `xxxxx.fastly-validations.com`.
"""
if record_name is not None:
pulumi.set(__self__, "record_name", record_name)
if record_type is not None:
pulumi.set(__self__, "record_type", record_type)
if record_value is not None:
pulumi.set(__self__, "record_value", record_value)
@property
@pulumi.getter(name="recordName")
def record_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the DNS record to add. For example `example.com`. Best accessed through a `for` expression to filter the relevant record.
"""
return pulumi.get(self, "record_name")
@record_name.setter
def record_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "record_name", value)
@property
@pulumi.getter(name="recordType")
def record_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of DNS record to add, e.g. `A`, or `CNAME`.
"""
return pulumi.get(self, "record_type")
@record_type.setter
def record_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "record_type", value)
@property
@pulumi.getter(name="recordValue")
def record_value(self) -> Optional[pulumi.Input[str]]:
"""
The value to which the DNS record should point, e.g. `xxxxx.fastly-validations.com`.
"""
return pulumi.get(self, "record_value")
@record_value.setter
def record_value(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "record_value", value)
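The `record_name` docstring above suggests pulling out the relevant record with a `for` expression. A minimal sketch of that filtering step, using plain dicts to stand in for the resolved challenge outputs (the record values shown are made up, not real validation targets):

```python
# Hypothetical resolved managed DNS challenges, modeled as plain dicts.
challenges = [
    {"record_name": "_acme-challenge.example.com", "record_type": "CNAME",
     "record_value": "abc123.fastly-validations.com"},
    {"record_name": "example.com", "record_type": "A",
     "record_value": "151.101.1.57"},
]

# A comprehension keeps only the record type we need.
cname_challenges = [c for c in challenges if c["record_type"] == "CNAME"]
print(cname_challenges[0]["record_value"])  # abc123.fastly-validations.com
```

In a real Pulumi program the challenge list is an output, so the same comprehension would run inside an `apply` callback rather than at the top level.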
@pulumi.input_type
class TlsSubscriptionManagedHttpChallengeArgs:
def __init__(__self__, *,
record_name: Optional[pulumi.Input[str]] = None,
record_type: Optional[pulumi.Input[str]] = None,
record_values: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
:param pulumi.Input[str] record_name: The name of the DNS record to add. For example `example.com`. Best accessed through a `for` expression to filter the relevant record.
:param pulumi.Input[str] record_type: The type of DNS record to add, e.g. `A`, or `CNAME`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] record_values: A list with the value(s) to which the DNS record should point.
"""
if record_name is not None:
pulumi.set(__self__, "record_name", record_name)
if record_type is not None:
pulumi.set(__self__, "record_type", record_type)
if record_values is not None:
pulumi.set(__self__, "record_values", record_values)
@property
@pulumi.getter(name="recordName")
def record_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the DNS record to add. For example `example.com`. Best accessed through a `for` expression to filter the relevant record.
"""
return pulumi.get(self, "record_name")
@record_name.setter
def record_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "record_name", value)
@property
@pulumi.getter(name="recordType")
def record_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of DNS record to add, e.g. `A`, or `CNAME`.
"""
return pulumi.get(self, "record_type")
@record_type.setter
def record_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "record_type", value)
@property
@pulumi.getter(name="recordValues")
def record_values(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list with the value(s) to which the DNS record should point.
"""
return pulumi.get(self, "record_values")
@record_values.setter
def record_values(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "record_values", value)
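All of the generated `*Args` classes above follow one storage pattern: required fields are set unconditionally in `__init__`, optional fields only when non-`None`, and every field is exposed through a property whose getter and setter delegate to `pulumi.get`/`pulumi.set`. A stripped-down sketch of that pattern, with a plain dict standing in for Pulumi's internal storage (the class and field names here are illustrative, not part of the SDK):

```python
class SketchArgs:
    """Dict-backed stand-in for the generated Args pattern."""

    def __init__(self, name, port=None):
        self._values = {}
        self._values["name"] = name   # required: always stored
        if port is not None:          # optional: stored only when provided
            self._values["port"] = port

    @property
    def name(self):
        return self._values["name"]

    @name.setter
    def name(self, value):
        self._values["name"] = value

    @property
    def port(self):
        return self._values.get("port")  # None when never set

    @port.setter
    def port(self, value):
        self._values["port"] = value


args = SketchArgs(name="endpoint")
print(args.port)   # None
args.port = 514
print(args.port)   # 514
```

The "only store when not `None`" guard is what lets the provider distinguish "field omitted" from "field explicitly set" when it diffs the resource.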
# bitmovin_api_sdk/encoding/configurations/audio/dts_passthrough/customdata/__init__.py (jaythecaesarean/bitmovin-api-sdk-python, MIT)
from bitmovin_api_sdk.encoding.configurations.audio.dts_passthrough.customdata.customdata_api import CustomdataApi
# 21.py (mjenrungrot/AdventOfCode2020, MIT)
import sys
def extra():
fp = open("21.input")
food = []
all_ingredients = set()
all_allergens = set()
for line in fp.readlines():
line = line.replace('(', '')
line = line.replace(')', '')
line = line.replace(',', '')
ingredients, allergens = line.strip().split('contains')
ingredients = ingredients.strip().split()
allergens = allergens.strip().split()
for ingredient in ingredients:
all_ingredients.add(ingredient)
for allergen in allergens:
all_allergens.add(allergen)
food.append((ingredients, allergens))
# print(all_ingredients, len(all_ingredients))
# print(all_allergens, len(all_allergens))
food_with_allergen = dict((x, []) for x in all_allergens)
for idx, (ingredients, allergens) in enumerate(food):
for allergen in allergens:
food_with_allergen[allergen].append(idx)
ingredient_with_allergen = {}
for allergen in all_allergens:
        ingredient_with_allergen[allergen] = food[food_with_allergen[allergen][0]][0]
for food_id in food_with_allergen[allergen][1:]:
tmp = set(food[food_id][0])
ingredient_with_allergen[allergen] = list(
set(ingredient_with_allergen[allergen]) & tmp)
mapping = {}
def search(idx, curr_mapping, mapping):
        if idx == len(all_allergens):
            # complete assignment found: record it in the output mapping
            mapping.update(curr_mapping)
            return
allergen = list(all_allergens)[idx]
for ingredient in ingredient_with_allergen[allergen]:
if ingredient in curr_mapping:
continue
curr_mapping[ingredient] = allergen
search(idx + 1, curr_mapping, mapping)
del curr_mapping[ingredient]
search(0, {}, mapping)
ans = ','.join(sorted(mapping, key=lambda x: mapping[x]))
print(ans)
def main():
fp = open("21.input")
food = []
all_ingredients = set()
all_allergens = set()
for line in fp.readlines():
line = line.replace('(', '')
line = line.replace(')', '')
line = line.replace(',', '')
ingredients, allergens = line.strip().split('contains')
ingredients = ingredients.strip().split()
allergens = allergens.strip().split()
for ingredient in ingredients:
all_ingredients.add(ingredient)
for allergen in allergens:
all_allergens.add(allergen)
food.append((ingredients, allergens))
# print(all_ingredients, len(all_ingredients))
# print(all_allergens, len(all_allergens))
food_with_allergen = dict((x, []) for x in all_allergens)
for idx, (ingredients, allergens) in enumerate(food):
for allergen in allergens:
food_with_allergen[allergen].append(idx)
ingredient_with_allergen = {}
for allergen in all_allergens:
        ingredient_with_allergen[allergen] = food[food_with_allergen[allergen][0]][0]
for food_id in food_with_allergen[allergen][1:]:
tmp = set(food[food_id][0])
ingredient_with_allergen[allergen] = list(
set(ingredient_with_allergen[allergen]) & tmp)
mapping = {}
def search(idx, curr_mapping, mapping):
        if idx == len(all_allergens):
            # complete assignment found: record it in the output mapping
            mapping.update(curr_mapping)
            return
allergen = list(all_allergens)[idx]
for ingredient in ingredient_with_allergen[allergen]:
if ingredient in curr_mapping:
continue
curr_mapping[ingredient] = allergen
search(idx + 1, curr_mapping, mapping)
del curr_mapping[ingredient]
search(0, {}, mapping)
ans = 0
for idx, (ingredients, allergens) in enumerate(food):
for ingredient in ingredients:
if ingredient in mapping:
continue
ans += 1
print(ans)
if __name__ == '__main__':
if len(sys.argv) == 2 and sys.argv[1] == 'extra':
extra()
else:
main()
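Both functions above narrow each allergen's candidates by intersecting the ingredient lists of every food that mentions it. That step can be isolated in a small helper (the data below is the worked example from the Advent of Code 2020 day 21 puzzle statement):

```python
def candidate_ingredients(foods):
    """For each allergen, intersect the ingredient sets of every food
    listing it; the result is the set of ingredients that could carry it.
    `foods` is a list of (ingredients, allergens) pairs, as parsed above."""
    candidates = {}
    for ingredients, allergens in foods:
        for allergen in allergens:
            if allergen in candidates:
                candidates[allergen] &= set(ingredients)
            else:
                candidates[allergen] = set(ingredients)
    return candidates

foods = [
    (["mxmxvkd", "kfcds", "sqjhc", "nhms"], ["dairy", "fish"]),
    (["trh", "fvjkl", "sbzzf", "mxmxvkd"], ["dairy"]),
    (["sqjhc", "fvjkl"], ["soy"]),
    (["sqjhc", "mxmxvkd", "sbzzf"], ["fish"]),
]
print(candidate_ingredients(foods)["dairy"])  # {'mxmxvkd'}
```

The backtracking `search` in the file then only has to pick one distinct ingredient per allergen from these already-narrowed candidate sets.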
# bfgame/outfits/monsters/__init__.py (ChrisLR/BasicDungeonRL, MIT)
from bfgame.outfits.monsters.goblins import GoblinPack1, GoblinPack2
from bfgame.outfits.monsters.skeletons import SkeletonPack1, SkeletonPack2
# generalTree.py (yashasnm/py-repo, MIT)
class GeneralTree():
"""
Parameters:
@root: root node
@first_born: leftmost child of root node
    @current_node: initialized as first born
    @current_value: initialized as value of first born (current node)
    @visited: Linked list of all values which have previously been visited
@path: FULL path, traversing siblings, from root to search value (LL)
@child_path: path, sibling traversal not captured, more closely resembles typical tree design/behavior (LL)
Methods:
@check_visited: Determines if a tree node has been visited yet to prevent cyclical movements
@check_child_path: Determines if a tree node is already captured in child_path
@depth_first_traversal: Explores entire tree via depth first protocol
@depth_first_search: Captures all necessary node traversals required to move from root to search value
including sibling node traversals
@child_depth_first_search: modification to @depth_first_search such that sibling traversals are eliminated.
This more closely mimics general tree behavior. In a file system, for example, traversing siblings is
not necessary. This method allows for the correct capture of path.
"""
def __init__(self,root=None):
self.root = root
self.first_born = self.root.get_child()
self.current_node = self.first_born
self.current_value = self.current_node.get_value()
self.start = self.current_value
self.visited = LinkedList(self.root.get_value())
self.path = LinkedList(self.root.get_value())
self.child_path = LinkedList(self.root.get_value())
def check_visited(self,val):
if self.visited.find(val):
return True
else:
return False
def check_child_path(self,val):
if self.child_path.find(val):
return True
else:
return False
def depth_first_traversal(self):
if self.current_value == self.root.get_value():
self.visited.insert_at(idx=1,val=self.start)
# copy self.visited then reset for future method calls
self.completed_visited = self.visited
self.current_node = self.first_born
self.current_value = self.current_node.get_value()
self.start = self.current_value
self.visited = LinkedList(self.root.get_value())
return self.completed_visited.dump_list()
else:
# ==== tree traversal logic ====
if self.current_node.get_child() and self.check_visited(self.current_node.get_child().get_value()) == False:
# parent -> child (not yet visited)
self.current_node = self.current_node.get_child()
self.current_value = self.current_node.get_value()
self.visited.append(self.current_value)
elif self.current_node.get_right() and self.check_visited(self.current_node.get_right().get_value()) == False:
# sibling -> right_sibling (not yet visited)
self.current_node = self.current_node.get_right()
self.current_value = self.current_node.get_value()
self.visited.append(self.current_value)
elif self.current_node.get_right() == None and self.current_node.get_left():
# right_most_sibling -> left_sibling (already visited)
self.current_node = self.current_node.get_left()
self.current_value = self.current_node.get_value()
elif self.current_node.get_left() != None and self.check_visited(self.current_node.get_right().get_value()) == True:
# sibling (not left-most or right-most) -> left_sibling (already visited)
self.current_node = self.current_node.get_left()
self.current_value = self.current_node.get_value()
else:
# left_most_sibling -> parent (already visited)
self.current_node = self.current_node.get_parent()
self.current_value = self.current_node.get_value()
# ==== recursively apply logic ====
self.depth_first_traversal()
def depth_first_search(self,search_val):
self.search_val = search_val
if self.current_value == search_val or self.current_value == self.root.get_value():
self.visited.insert_at(idx=1,val=self.start)
self.path.insert_at(idx=1,val=self.start)
if self.check_visited(self.search_val) == True:
condition = 1
else:
condition = 0
# copy self.path and reset for future method calls
self.completed_path = self.path
self.current_node = self.first_born
self.current_value = self.current_node.get_value()
self.start = self.current_value
self.visited = LinkedList(self.root.get_value())
self.path = LinkedList(self.root.get_value())
if condition == 1:
return self.completed_path.dump_list()
else:
print("Value not found")
else:
# ==== tree traversal logic ====
if self.current_node.get_child() and self.check_visited(self.current_node.get_child().get_value()) == False:
# parent -> child (not yet visited)
self.current_node = self.current_node.get_child()
self.current_value = self.current_node.get_value()
self.visited.append(self.current_value)
self.path.append(self.current_value)
elif self.current_node.get_right() and self.check_visited(self.current_node.get_right().get_value()) == False:
# sibling -> right_sibling (not yet visited)
self.current_node = self.current_node.get_right()
self.current_value = self.current_node.get_value()
self.visited.append(self.current_value)
self.path.append(self.current_value)
elif self.current_node.get_right() == None and self.current_node.get_left():
# right_most_sibling -> left_sibling (already visited)
self.current_node = self.current_node.get_left()
self.current_value = self.current_node.get_value()
elif self.current_node.get_left() != None and self.check_visited(self.current_node.get_right().get_value()) == True:
# sibling (not left-most or right-most) -> left_sibling (already visited)
self.current_node = self.current_node.get_left()
self.current_value = self.current_node.get_value()
self.path.deleteAt(idx=self.path.count)
else:
# left_most_sibling -> parent (already visited)
self.current_node = self.current_node.get_parent()
self.current_value = self.current_node.get_value()
self.path.deleteAt(idx=self.path.count)
# ==== recursively apply logic ====
self.depth_first_search(search_val=self.search_val)
def child_depth_first_search(self,search_val):
self.search_val = search_val
if self.current_value == search_val or self.current_value == self.root.get_value():
self.visited.insert_at(idx=1,val=self.start)
self.path.insert_at(idx=1,val=self.start)
if self.check_visited(self.search_val) == True:
condition = 1
else:
condition = 0
# copy self.path and reset for future method calls
self.completed_child_path = self.child_path
self.current_node = self.first_born
self.current_value = self.current_node.get_value()
self.start = self.current_value
self.visited = LinkedList(self.root.get_value())
self.child_path = LinkedList(self.root.get_value())
if condition == 1:
return self.completed_child_path.dump_list()
else:
print("Value not found")
else:
# ==== tree traversal logic ====
if self.current_node.get_child() and self.check_visited(self.current_node.get_child().get_value()) == False:
# parent -> child (not yet visited)
if self.check_child_path(self.current_node.get_value()) == False:
self.child_path.append(self.current_value)
self.current_node = self.current_node.get_child()
self.current_value = self.current_node.get_value()
self.visited.append(self.current_value)
self.child_path.append(self.current_value)
elif self.current_node.get_right() and self.check_visited(self.current_node.get_right().get_value()) == False:
# sibling -> right_sibling (not yet visited)
self.child_path.deleteAt(idx=self.child_path.count)
self.current_node = self.current_node.get_right()
self.current_value = self.current_node.get_value()
self.visited.append(self.current_value)
self.child_path.append(self.current_value)
elif self.current_node.get_right() == None and self.current_node.get_left():
# right_most_sibling -> left_sibling (already visited)
self.current_node = self.current_node.get_left()
self.current_value = self.current_node.get_value()
elif self.current_node.get_left() != None and self.check_visited(self.current_node.get_right().get_value()) == True:
# sibling (not left-most or right-most) -> left_sibling (already visited)
self.current_node = self.current_node.get_left()
self.current_value = self.current_node.get_value()
self.child_path.deleteAt(idx=self.child_path.count)
else:
# left_most_sibling -> parent (already visited)
self.current_node = self.current_node.get_parent()
self.current_value = self.current_node.get_value()
self.child_path.deleteAt(idx=self.child_path.count)
# ==== recursively apply logic ====
self.child_depth_first_search(search_val=self.search_val)
if __name__ == '__main__':
a1 = GeneralTreeNode(value='a1')
b1 = GeneralTreeNode(value='b1')
b2 = GeneralTreeNode(value='b2')
b3 = GeneralTreeNode(value='b3')
a1.set_child(b1)
b1.set_parent(a1)
b1.set_right(b2)
b2.set_left(b1)
b2.set_right(b3)
b3.set_left(b2)
c1 = GeneralTreeNode(value='c1')
c1.set_parent(b3)
b3.set_child(c1)
d1 = GeneralTreeNode(value='d1')
d1.set_parent(b1)
b1.set_child(d1)
r = GeneralTree(root=a1)
r.depth_first_search(search_val='c1')
r.child_depth_first_search(search_val='c1') | 48.279835 | 134 | 0.587197 | 1,368 | 11,732 | 4.78655 | 0.099415 | 0.196549 | 0.178681 | 0.162187 | 0.787569 | 0.768021 | 0.745266 | 0.734117 | 0.71182 | 0.706017 | 0 | 0.005783 | 0.32194 | 11,732 | 243 | 135 | 48.279835 | 0.817348 | 0.187095 | 0 | 0.729032 | 0 | 0 | 0.005726 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03871 | false | 0 | 0 | 0 | 0.090323 | 0.012903 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
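`generalTree.py` references `GeneralTreeNode` and `LinkedList` without importing or defining them, so the file is not runnable as-is. Below is a minimal sketch of the interfaces the traversal code assumes (a reconstruction, not the original implementations — the list-backed `LinkedList` and all method bodies are assumptions inferred from call sites such as `insert_at(idx=1, ...)` and `deleteAt(idx=self.path.count)`):

```python
class GeneralTreeNode:
    """Left-child / right-sibling tree node (assumed interface)."""
    def __init__(self, value=None):
        self.value = value
        self.parent = self.child = self.left = self.right = None

    def get_value(self): return self.value
    def get_parent(self): return self.parent
    def get_child(self): return self.child
    def get_left(self): return self.left
    def get_right(self): return self.right
    def set_parent(self, node): self.parent = node
    def set_child(self, node): self.child = node
    def set_left(self, node): self.left = node
    def set_right(self, node): self.right = node


class LinkedList:
    """Linked list of values (assumed interface; indices are 1-based,
    matching the insert_at(idx=1, ...) calls in GeneralTree)."""
    def __init__(self, val=None):
        # Backed by a Python list for brevity; a real node-based
        # implementation would expose the same methods.
        self.items = [] if val is None else [val]

    @property
    def count(self):
        return len(self.items)

    def find(self, val):
        return val in self.items

    def append(self, val):
        self.items.append(val)

    def insert_at(self, idx, val):
        self.items.insert(idx - 1, val)  # idx is 1-based in the caller

    def deleteAt(self, idx):
        del self.items[idx - 1]

    def dump_list(self):
        return list(self.items)
```

With stubs like these in scope, the `__main__` block constructs the a1/b1..b3/c1/d1 tree and the search methods can run end to end.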
0d75b9ab916495db8aefb3c1da3eda18cc1a7acf | 182 | py | Python | ctsb/problems/data_based/__init__.py | paula-gradu/ctsb | fdc00acb798949ce1120778ad4725faf170f80c3 | [
"Apache-2.0"
] | 1 | 2021-07-03T05:26:56.000Z | 2021-07-03T05:26:56.000Z | ctsb/problems/data_based/__init__.py | paula-gradu/ctsb | fdc00acb798949ce1120778ad4725faf170f80c3 | [
"Apache-2.0"
] | null | null | null | ctsb/problems/data_based/__init__.py | paula-gradu/ctsb | fdc00acb798949ce1120778ad4725faf170f80c3 | [
"Apache-2.0"
] | null | null | null | # data_based init file
from ctsb.problems.data_based.sp500 import SP500
from ctsb.problems.data_based.uci_indoor import UCI_Indoor
from ctsb.problems.data_based.crypto import Crypto | 36.4 | 58 | 0.857143 | 30 | 182 | 5 | 0.4 | 0.24 | 0.32 | 0.4 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036145 | 0.087912 | 182 | 5 | 59 | 36.4 | 0.86747 | 0.10989 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
0d9ff94b6b9682156e8de443fb6684a006369327 | 4,452 | py | Python | Inference/PythonInference/vad/src/models/vad_model.py | Z-yq/audioSamples.github.io | 53c474288f0db1a3acfe40ba57a4cd5f2aecbcd3 | [
"Apache-2.0"
] | 1 | 2022-03-03T02:51:55.000Z | 2022-03-03T02:51:55.000Z | Inference/PythonInference/vad/src/models/vad_model.py | RapidAI/TensorflowASR | 084519b5a0464f465e1d72c24cba07c1ec55cd26 | [
"Apache-2.0"
] | null | null | null | Inference/PythonInference/vad/src/models/vad_model.py | RapidAI/TensorflowASR | 084519b5a0464f465e1d72c24cba07c1ec55cd26 | [
"Apache-2.0"
] | null | null | null | #encoding=utf-8
import tensorflow as tf
class CNN_Online_VAD(tf.keras.Model):
def __init__(self,
dmodel: int,
name="cnn_online_vad",
**kwargs):
super(CNN_Online_VAD, self).__init__(name=name, **kwargs)
self.embed = tf.keras.layers.Dense(dmodel)
self.cnn1 = tf.keras.layers.Conv1D(dmodel*2, 3, padding='causal', activation='relu')
self.dense1=tf.keras.layers.Conv1D(dmodel, 1, padding='causal', activation='relu')
self.cnn2=tf.keras.layers.Conv1D(dmodel*2,3,padding='causal',activation='relu')
self.dense2=tf.keras.layers.Dense(dmodel,activation='relu')
self.dense3=tf.keras.layers.Dense(dmodel,activation='relu')
self.fc=tf.keras.layers.Dense(1)
self.fc3=tf.keras.layers.Dense(80,name='audio_voice_mask')
def _build(self):
fake=tf.random.uniform([1,80,80])
self(fake)
def call(self,
inputs,
training=True,
**kwargs):
outputs = self.embed(inputs, training=training)
outputs=self.dense1(outputs,training=training)
outputs =self.cnn1(outputs,training=training)
outputs=self.dense2(outputs,training=training)
outputs=self.cnn2(outputs,training=training)
outputs = self.dense3(outputs, training=training)
output1=self.fc(outputs)
output3=self.fc3(outputs)
output4 = inputs*output3
return output1,output4
@tf.function(input_signature=[
        tf.TensorSpec([None, None, 80], dtype=tf.float32),  # TODO: adjust to match your own frame_input shape
])
def inference(self,inputs,training=False):
outputs = self.embed(inputs, training=training)
outputs = self.dense1(outputs, training=training)
outputs = self.cnn1(outputs, training=training)
outputs = self.dense2(outputs, training=training)
outputs = self.cnn2(outputs, training=training)
outputs = self.dense3(outputs, training=training)
output1 = self.fc(outputs)
return output1
class CNN_Offline_VAD(tf.keras.Model):
def __init__(self,
dmodel: int,
name="cnn_offline_vad",
**kwargs):
super(CNN_Offline_VAD, self).__init__(name=name, **kwargs)
self.embed = tf.keras.layers.Dense(dmodel)
self.cnn1 = tf.keras.layers.Conv1D(dmodel, 5, padding='same', activation='relu')
self.dense1=tf.keras.layers.Dense(dmodel,activation='relu')
self.cnn2=tf.keras.layers.Conv1D(dmodel,5,padding='same',activation='relu',dilation_rate=2)
self.cnn3=tf.keras.layers.Conv1D(dmodel,5,padding='same',activation='relu',dilation_rate=4)
self.cnn4=tf.keras.layers.Conv1D(dmodel,5,padding='same',activation='relu',dilation_rate=8)
self.dense2=tf.keras.layers.Dense(dmodel,activation='relu')
self.fc=tf.keras.layers.Dense(1)
def _build(self):
fake=tf.random.uniform([1,80,80])
self(fake)
# @tf.function(experimental_relax_shapes=True)
def call(self,
inputs,
training=False,
**kwargs):
# inputs has shape [B, U]
outputs = self.embed(inputs, training=training)
outputs=tf.squeeze(outputs,-1)
outputs=self.dense1(outputs)
outputs =self.cnn1(outputs,training=training)
outputs=self.cnn2(outputs,training=training)
outputs=self.cnn3(outputs,training=training)
outputs=self.cnn4(outputs,training=training)
outputs = self.dense2(outputs, training=training)
outputs=self.fc(outputs)
# return shapes [B, T, P], ([num_lstms, B, P], [num_lstms, B, P]) if using lstm
return outputs
@tf.function(input_signature=[
tf.TensorSpec([None,None,80], dtype=tf.float32), # features
])
    def inference(self, inputs, training=False):
outputs = self.embed(inputs, training=training)
outputs = tf.squeeze(outputs, -1)
outputs = self.dense1(outputs)
outputs = self.cnn1(outputs, training=training)
outputs = self.cnn2(outputs, training=training)
outputs = self.cnn3(outputs, training=training)
outputs = self.cnn4(outputs, training=training)
outputs = self.dense2(outputs, training=training)
outputs = self.fc(outputs)
# return shapes [B, T, P], ([num_lstms, B, P], [num_lstms, B, P]) if using lstm
return outputs | 40.108108 | 99 | 0.637916 | 544 | 4,452 | 5.139706 | 0.170956 | 0.110157 | 0.180973 | 0.193133 | 0.8598 | 0.832976 | 0.832976 | 0.822604 | 0.81402 | 0.798283 | 0 | 0.025379 | 0.230009 | 4,452 | 111 | 100 | 40.108108 | 0.790257 | 0.061096 | 0 | 0.719101 | 0 | 0 | 0.029468 | 0 | 0 | 0 | 0 | 0.009009 | 0 | 1 | 0.089888 | false | 0 | 0.011236 | 0 | 0.168539 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
df36c4307cd9d6830e2f78a44d75455b269f185c | 38,320 | py | Python | scripts/servicechain/firewall/verify.py | rombie/contrail-test | a68c71d6f282142501a7e2e889bbb232fdd82dc3 | [
"Apache-2.0"
] | null | null | null | scripts/servicechain/firewall/verify.py | rombie/contrail-test | a68c71d6f282142501a7e2e889bbb232fdd82dc3 | [
"Apache-2.0"
] | null | null | null | scripts/servicechain/firewall/verify.py | rombie/contrail-test | a68c71d6f282142501a7e2e889bbb232fdd82dc3 | [
"Apache-2.0"
] | null | null | null | from time import sleep
from servicechain.config import ConfigSvcChain
from servicechain.verify import VerifySvcChain
from servicechain.mirror.verify import VerifySvcMirror
from servicechain.mirror.config import ConfigSvcMirror
class VerifySvcFirewall(VerifySvcMirror):
def verify_svc_span(self, in_net=False):
vn1_name = "left_vn"
vn1_subnets = ['31.1.1.0/24']
vm1_name = 'left_vm'
vn2_name = "right_vn"
vn2_subnets = ['41.2.2.0/24']
vm2_name = 'right_vm'
if in_net:
vn1_name = "in_left_vn"
vn1_subnets = ['32.1.1.0/24']
vm1_name = 'in_left_vm'
vn2_name = "in_right_vn"
vn2_subnets = ['42.2.2.0/24']
vm2_name = 'in_right_vm'
vn1_fixture = self.config_vn(vn1_name, vn1_subnets)
vn2_fixture = self.config_vn(vn2_name, vn2_subnets)
vm1_fixture = self.config_vm(vn1_fixture, vm1_name)
vm2_fixture = self.config_vm(vn2_fixture, vm2_name)
assert vm1_fixture.verify_on_setup()
assert vm2_fixture.verify_on_setup()
vm1_fixture.wait_till_vm_is_up()
vm2_fixture.wait_till_vm_is_up()
si_count = 3
st_name = "tcp_svc_template"
si_prefix = "tcp_bridge_"
policy_name = "allow_tcp"
if in_net:
st_name = "in_tcp_svc_template"
si_prefix = "in_tcp_bridge_"
policy_name = "in_allow_tcp"
tcp_st_fixture, tcp_si_fixtures = self.config_st_si(
st_name, si_prefix, si_count,
left_vn=vn1_name, right_vn=vn2_name)
else:
tcp_st_fixture, tcp_si_fixtures = self.config_st_si(
st_name, si_prefix, si_count)
action_list = self.chain_si(si_count, si_prefix)
# Update rule with specific port/protocol
rule = [{'direction': '<>',
'protocol': 'tcp',
'source_network': vn1_name,
'src_ports': [8000, 8000],
'dest_network': vn2_name,
'dst_ports': [9000, 9000],
'simple_action': None,
'action_list': {'apply_service': action_list}
}]
        # Create new policy with rule to allow traffic from new VNs
tcp_policy_fixture = self.config_policy(policy_name, rule)
self.verify_si(tcp_si_fixtures)
st_name = "udp_svc_template"
si_prefix = "udp_bridge_"
policy_name = "allow_udp"
if in_net:
st_name = "in_udp_svc_template"
si_prefix = "in_udp_bridge_"
policy_name = "in_allow_udp"
udp_st_fixture, udp_si_fixtures = self.config_st_si(
st_name, si_prefix, si_count,
left_vn=vn1_name, right_vn=vn2_name)
else:
udp_st_fixture, udp_si_fixtures = self.config_st_si(
st_name, si_prefix, si_count)
action_list = self.chain_si(si_count, si_prefix)
# Update rule with specific port/protocol
rule = [{'direction': '<>',
'protocol': 'udp',
'source_network': vn1_name,
'src_ports': [8001, 8001],
'dest_network': vn2_name,
'dst_ports': [9001, 9001],
'simple_action': None,
'action_list': {'apply_service': action_list}
}]
        # Create new policy with rule to allow traffic from new VNs
udp_policy_fixture = self.config_policy(policy_name, rule)
vn1_udp_policy_fix = self.attach_policy_to_vn(
[tcp_policy_fixture, udp_policy_fixture], vn1_fixture)
vn2_udp_policy_fix = self.attach_policy_to_vn(
[tcp_policy_fixture, udp_policy_fixture], vn2_fixture)
result, msg = self.validate_vn(vn1_name)
assert result, msg
result, msg = self.validate_vn(vn2_name)
assert result, msg
self.verify_si(udp_si_fixtures)
# Install traffic package in VM
vm1_fixture.install_pkg("Traffic")
vm2_fixture.install_pkg("Traffic")
sport = 8001
dport = 9001
sent, recv = self.verify_traffic(vm1_fixture, vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
sport = 8000
dport = 9000
sent, recv = self.verify_traffic(vm1_fixture, vm2_fixture,
'tcp', sport=sport, dport=dport)
errmsg = "TCP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
self.delete_si_st(tcp_si_fixtures, tcp_st_fixture)
sport = 8001
dport = 9001
sent, recv = self.verify_traffic(vm1_fixture, vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
sport = 8000
dport = 9000
sent, recv = self.verify_traffic(vm1_fixture, vm2_fixture,
'tcp', sport=sport, dport=dport)
errmsg = "TCP traffic with src port %s and dst port %s passed; Expected to fail" % (
sport, dport)
assert sent and recv == 0, errmsg
st_name = "tcp_svc_template"
si_prefix = "tcp_bridge_"
policy_name = "allow_tcp"
if in_net:
st_name = "in_tcp_svc_template"
si_prefix = "in_tcp_bridge_"
policy_name = "in_allow_tcp"
tcp_st_fixture, tcp_si_fixtures = self.config_st_si(
st_name, si_prefix, si_count,
left_vn=vn1_name, right_vn=vn2_name)
else:
tcp_st_fixture, tcp_si_fixtures = self.config_st_si(
st_name, si_prefix, si_count)
action_list = self.chain_si(si_count, si_prefix)
result, msg = self.validate_vn(vn1_name)
assert result, msg
result, msg = self.validate_vn(vn2_name)
assert result, msg
self.verify_si(tcp_si_fixtures)
sport = 8001
dport = 9001
sent, recv = self.verify_traffic(vm1_fixture, vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
sport = 8000
dport = 9000
sent, recv = self.verify_traffic(vm1_fixture, vm2_fixture,
'tcp', sport=sport, dport=dport)
errmsg = "TCP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
def verify_svc_transparent_datapath(self, si_count=1, svc_scaling=False, max_inst=1, flavor='m1.medium'):
"""Validate the service chaining datapath"""
if getattr(self, 'res', None):
self.vn1_name = self.res.vn1_name
self.vn1_subnets = self.res.vn1_subnets
self.vm1_name = self.res.vn1_vm1_name
self.vn2_name = self.res.vn2_name
self.vn2_subnets = self.res.vn2_subnets
self.vm2_name = self.res.vn2_vm2_name
else:
self.vn1_name = "bridge_vn1%s" % si_count
self.vn1_subnets = ['11.1.1.0/24']
self.vm1_name = 'bridge_vm1'
self.vn2_name = "bridge_vn2%s" % si_count
self.vn2_subnets = ['12.2.2.0/24']
self.vm2_name = 'bridge_vm2'
self.action_list = []
self.if_list = []
self.st_name = 'service_template_1'
si_prefix = 'bridge_svc_instance_'
self.policy_name = 'policy_transparent'
self.st_fixture, self.si_fixtures = self.config_st_si(
self.st_name, si_prefix, si_count, svc_scaling, max_inst, flavor=flavor)
self.action_list = self.chain_si(si_count, si_prefix)
self.rules = [
{
'direction': '<>',
'protocol': 'any',
'source_network': self.vn1_name,
'src_ports': [0, -1],
'dest_network': self.vn2_name,
'dst_ports': [0, -1],
'simple_action': None,
'action_list': {'apply_service': self.action_list}
},
]
self.policy_fixture = self.config_policy(self.policy_name, self.rules)
if getattr(self, 'res', None):
self.vn1_fixture = self.res.get_vn1_fixture()
self.vn2_fixture = self.res.get_vn2_fixture()
assert self.vn1_fixture.verify_on_setup()
assert self.vn2_fixture.verify_on_setup()
else:
self.vn1_fixture = self.config_vn(self.vn1_name, self.vn1_subnets)
self.vn2_fixture = self.config_vn(self.vn2_name, self.vn2_subnets)
self.vn1_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn1_fixture)
self.vn2_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn2_fixture)
if getattr(self, 'res', None):
self.vm1_fixture = self.res.get_vn1_vm1_fixture()
self.vm2_fixture = self.res.get_vn2_vm2_fixture()
else:
self.vm1_fixture = self.config_vm(self.vn1_fixture, self.vm1_name)
self.vm2_fixture = self.config_vm(self.vn2_fixture, self.vm2_name)
assert self.vm1_fixture.verify_on_setup()
assert self.vm2_fixture.verify_on_setup()
self.vm1_fixture.wait_till_vm_is_up()
self.vm2_fixture.wait_till_vm_is_up()
result, msg = self.validate_vn(self.vn1_name)
assert result, msg
result, msg = self.validate_vn(self.vn2_name)
assert result, msg
self.verify_si(self.si_fixtures)
# Ping from left VM to right VM
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip, count='3'), errmsg
return True
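Each datapath test above steers traffic through the service chain with a policy rule whose `action_list` holds an ordered list of service-instance names (built by `chain_si`). A small hypothetical helper shows how such a rule could be assembled from a service-instance prefix and count — the `default-domain:admin:` fully-qualified-name format and the helper itself are assumptions for illustration, not the test framework's actual API:

```python
def build_chain_rule(left_vn, right_vn, si_prefix, si_count,
                     domain='default-domain', project='admin'):
    """Build a policy rule steering all left<->right traffic through an
    ordered service chain (hypothetical mirror of chain_si + self.rules)."""
    action_list = [
        '%s:%s:%s%s' % (domain, project, si_prefix, i)
        for i in range(1, si_count + 1)
    ]
    return {
        'direction': '<>',
        'protocol': 'any',
        'source_network': left_vn,
        'src_ports': [0, -1],   # [0, -1] means all ports
        'dest_network': right_vn,
        'dst_ports': [0, -1],
        'simple_action': None,
        'action_list': {'apply_service': action_list},
    }
```

The order of `apply_service` entries matters: packets traverse the instances in list order, which is why the tests scale `si_count` to exercise multi-instance chains.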
def verify_svc_in_network_datapath(self, si_count=1, svc_scaling=False, max_inst=1, svc_mode='in-network', flavor='m1.medium', static_route=['None', 'None', 'None'], ordered_interfaces=True):
"""Validate the service chaining in network datapath"""
if getattr(self, 'res', None):
self.vn1_fq_name = "default-domain:admin:" + self.res.vn1_name
self.vn1_name = self.res.vn1_name
self.vn1_subnets = self.res.vn1_subnets
self.vm1_name = self.res.vn1_vm1_name
self.vn2_fq_name = "default-domain:admin:" + self.res.vn2_name
self.vn2_name = self.res.vn2_name
self.vn2_subnets = self.res.vn2_subnets
self.vm2_name = self.res.vn2_vm2_name
else:
self.vn1_fq_name = "default-domain:admin:in_network_vn1"
self.vn1_name = "in_network_vn1"
self.vn1_subnets = ['10.1.1.0/24']
self.vm1_name = 'in_network_vm1'
self.vn2_fq_name = "default-domain:admin:in_network_vn2"
self.vn2_name = "in_network_vn2"
self.vn2_subnets = ['20.2.2.0/24']
self.vm2_name = 'in_network_vm2'
self.action_list = []
self.if_list = [['management', False, False],
['left', True, False], ['right', True, False]]
for entry in static_route:
if entry != 'None':
self.if_list[static_route.index(entry)][2] = True
self.st_name = 'in_net_svc_template_1'
si_prefix = 'in_net_svc_instance_'
self.policy_name = 'policy_in_network'
if getattr(self, 'res', None):
self.vn1_fixture = self.res.get_vn1_fixture()
self.vn2_fixture = self.res.get_vn2_fixture()
assert self.vn1_fixture.verify_on_setup()
assert self.vn2_fixture.verify_on_setup()
else:
self.vn1_fixture = self.config_vn(self.vn1_name, self.vn1_subnets)
self.vn2_fixture = self.config_vn(self.vn2_name, self.vn2_subnets)
self.st_fixture, self.si_fixtures = self.config_st_si(
self.st_name, si_prefix, si_count, svc_scaling, max_inst, left_vn=self.vn1_fq_name,
right_vn=self.vn2_fq_name, svc_mode=svc_mode, flavor=flavor, static_route=static_route, ordered_interfaces=ordered_interfaces)
self.action_list = self.chain_si(si_count, si_prefix)
self.rules = [
{
'direction': '<>',
'protocol': 'any',
'source_network': self.vn1_name,
'src_ports': [0, -1],
'dest_network': self.vn2_name,
'dst_ports': [0, -1],
'simple_action': None,
'action_list': {'apply_service': self.action_list}
},
]
self.policy_fixture = self.config_policy(self.policy_name, self.rules)
self.vn1_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn1_fixture)
self.vn2_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn2_fixture)
if getattr(self, 'res', None):
self.vm1_fixture = self.res.get_vn1_vm1_fixture()
self.vm2_fixture = self.res.get_vn2_vm2_fixture()
else:
self.vm1_fixture = self.config_vm(self.vn1_fixture, self.vm1_name)
self.vm2_fixture = self.config_vm(self.vn2_fixture, self.vm2_name)
assert self.vm1_fixture.verify_on_setup()
assert self.vm2_fixture.verify_on_setup()
self.vm1_fixture.wait_till_vm_is_up()
self.vm2_fixture.wait_till_vm_is_up()
result, msg = self.validate_vn(self.vn1_name)
assert result, msg
result, msg = self.validate_vn(self.vn2_name)
assert result, msg
for si_fix in self.si_fixtures:
si_fix.verify_on_setup()
# Ping from left VM to right VM
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
return True
def verify_policy_delete_add(self):
# Delete policy
self.detach_policy(self.vn1_policy_fix)
self.detach_policy(self.vn2_policy_fix)
self.unconfig_policy(self.policy_fixture)
# Ping from left VM to right VM; expected to fail
errmsg = "Ping to right VM ip %s from left VM passed; expected to fail" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip, expectation=False), errmsg
# Create policy again
self.policy_fixture = self.config_policy(self.policy_name, self.rules)
self.vn1_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn1_fixture)
self.vn2_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn2_fixture)
self.verify_si(self.si_fixtures)
# Wait for the existing flow entry to age
sleep(40)
# Ping from left VM to right VM
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
return True
def verify_protocol_port_change(self, mode='transparent'):
# Install traffic package in VM
self.vm1_fixture.install_pkg("Traffic")
self.vm2_fixture.install_pkg("Traffic")
sport = 8000
dport = 9000
sent, recv = self.verify_traffic(self.vm1_fixture, self.vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
sport = 8000
dport = 9001
sent, recv = self.verify_traffic(self.vm1_fixture, self.vm2_fixture,
'tcp', sport=sport, dport=dport)
errmsg = "TCP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
# Delete policy
self.detach_policy(self.vn1_policy_fix)
self.detach_policy(self.vn2_policy_fix)
self.unconfig_policy(self.policy_fixture)
# Update rule with specific port/protocol
action_list = {'apply_service': self.action_list}
new_rule = {'direction': '<>',
'protocol': 'tcp',
'source_network': self.vn1_name,
'src_ports': [8000, 8000],
'dest_network': self.vn2_name,
'dst_ports': [9001, 9001],
'simple_action': None,
'action_list': action_list
}
self.rules = [new_rule]
        # Create new policy with rule to allow traffic from new VNs
self.policy_fixture = self.config_policy(self.policy_name, self.rules)
self.vn1_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn1_fixture)
self.vn2_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn2_fixture)
self.verify_si(self.si_fixtures)
self.logger.debug("Send udp traffic; with policy rule %s", new_rule)
sport = 8000
dport = 9000
sent, recv = self.verify_traffic(self.vm1_fixture, self.vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s passed; Expected to fail" % (
sport, dport)
assert sent and recv == 0, errmsg
sport = 8000
dport = 9001
self.logger.debug("Send tcp traffic; with policy rule %s", new_rule)
sent, recv = self.verify_traffic(self.vm1_fixture, self.vm2_fixture,
'tcp', sport=sport, dport=dport)
errmsg = "TCP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
return True
def verify_add_new_vns(self):
# Delete policy
self.detach_policy(self.vn1_policy_fix)
self.detach_policy(self.vn2_policy_fix)
self.unconfig_policy(self.policy_fixture)
# Create one more left and right VN's
new_left_vn = "new_left_bridge_vn"
new_left_vn_net = ['51.1.1.0/24']
new_right_vn = "new_right_bridge_vn"
new_right_vn_net = ['52.2.2.0/24']
new_left_vn_fix = self.config_vn(new_left_vn, new_left_vn_net)
new_right_vn_fix = self.config_vn(new_right_vn, new_right_vn_net)
# Launch VMs in new left and right VN's
new_left_vm = 'new_left_bridge_vm'
new_right_vm = 'new_right_bridge_vm'
new_left_vm_fix = self.config_vm(new_left_vn_fix, new_left_vm)
new_right_vm_fix = self.config_vm(new_right_vn_fix, new_right_vm)
assert new_left_vm_fix.verify_on_setup()
assert new_right_vm_fix.verify_on_setup()
# Wait for VM's to come up
new_left_vm_fix.wait_till_vm_is_up()
new_right_vm_fix.wait_till_vm_is_up()
# Add rule to policy to allow traffic from new left_vn to right_vn
# through SI
new_rule = {'direction': '<>',
'protocol': 'any',
'source_network': new_left_vn,
'src_ports': [0, -1],
'dest_network': new_right_vn,
'dst_ports': [0, -1],
'simple_action': None,
'action_list': {'apply_service': self.action_list}
}
self.rules.append(new_rule)
        # Create new policy with rule to allow traffic from new VNs
self.policy_fixture = self.config_policy(self.policy_name, self.rules)
self.vn1_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn1_fixture)
self.vn2_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn2_fixture)
# attach policy to new VN's
new_policy_left_vn_fix = self.attach_policy_to_vn(
self.policy_fixture, new_left_vn_fix)
new_policy_right_vn_fix = self.attach_policy_to_vn(
self.policy_fixture, new_right_vn_fix)
self.verify_si(self.si_fixtures)
# Ping from left VM to right VM
sleep(5)
self.logger.info("Verfiy ICMP traffic between new VN's.")
errmsg = "Ping to right VM ip %s from left VM failed" % new_right_vm_fix.vm_ip
assert new_left_vm_fix.ping_with_certainty(
new_right_vm_fix.vm_ip), errmsg
self.logger.info(
"Verfiy ICMP traffic between new left VN and existing right VN.")
errmsg = "Ping to right VM ip %s from left VM passed; \
Expected tp Fail" % self.vm2_fixture.vm_ip
assert new_left_vm_fix.ping_with_certainty(self.vm2_fixture.vm_ip,
expectation=False), errmsg
self.logger.info(
"Verfiy ICMP traffic between existing VN's with allow all.")
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
self.logger.info(
"Verfiy ICMP traffic between existing left VN and new right VN.")
errmsg = "Ping to right VM ip %s from left VM passed; \
Expected to Fail" % new_right_vm_fix.vm_ip
assert self.vm1_fixture.ping_with_certainty(new_right_vm_fix.vm_ip,
expectation=False), errmsg
# Ping between left VN's
self.logger.info(
"Verfiy ICMP traffic between new left VN and existing left VN.")
errmsg = "Ping to left VM ip %s from another left VM in different VN \
passed; Expected to fail" % self.vm1_fixture.vm_ip
assert new_left_vm_fix.ping_with_certainty(self.vm1_fixture.vm_ip,
expectation=False), errmsg
self.logger.info(
"Verify ICMP traffic between new right VN and existing right VN.")
errmsg = "Ping to right VM ip %s from another right VM in different VN \
passed; Expected to fail" % self.vm2_fixture.vm_ip
assert new_right_vm_fix.ping_with_certainty(self.vm2_fixture.vm_ip,
expectation=False), errmsg
# Delete policy
self.detach_policy(self.vn1_policy_fix)
self.detach_policy(self.vn2_policy_fix)
self.detach_policy(new_policy_left_vn_fix)
self.detach_policy(new_policy_right_vn_fix)
self.unconfig_policy(self.policy_fixture)
# Add rule to policy to allow only udp traffic from new left_vn to right_vn
# through SI
self.rules.remove(new_rule)
udp_rule = {'direction': '<>',
'protocol': 'udp',
'source_network': new_left_vn,
'src_ports': [8000, 8000],
'dest_network': new_right_vn,
'dst_ports': [9000, 9000],
'simple_action': None,
'action_list': {'apply_service': self.action_list}
}
self.rules.append(udp_rule)
# Create new policy with a rule to allow traffic from the new VNs
self.policy_fixture = self.config_policy(self.policy_name, self.rules)
self.vn1_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn1_fixture)
self.vn2_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn2_fixture)
# Attach policy to the new VNs
new_policy_left_vn_fix = self.attach_policy_to_vn(
self.policy_fixture, new_left_vn_fix)
new_policy_right_vn_fix = self.attach_policy_to_vn(
self.policy_fixture, new_right_vn_fix)
self.verify_si(self.si_fixtures)
# Ping from left VM to right VM with udp rule
self.logger.info(
"Verify ICMP traffic with allow udp only rule from new left VN to new right VN")
errmsg = "Ping to right VM ip %s from left VM passed; Expected to fail" % new_right_vm_fix.vm_ip
assert new_left_vm_fix.ping_with_certainty(new_right_vm_fix.vm_ip,
expectation=False), errmsg
# Install traffic package in VM
self.vm1_fixture.install_pkg("Traffic")
self.vm2_fixture.install_pkg("Traffic")
new_left_vm_fix.install_pkg("Traffic")
new_right_vm_fix.install_pkg("Traffic")
self.logger.info(
"Verify UDP traffic with allow udp only rule from new left VN to new right VN")
sport = 8000
dport = 9000
sent, recv = self.verify_traffic(new_left_vm_fix, new_right_vm_fix,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
self.logger.info("Verify ICMP traffic with allow all.")
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
self.logger.info("Verify UDP traffic with allow all")
sport = 8001
dport = 9001
sent, recv = self.verify_traffic(self.vm1_fixture, self.vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
# Delete the new VMs, detach policies and delete the new VNs
self.delete_vm(new_left_vm_fix)
self.delete_vm(new_right_vm_fix)
self.detach_policy(new_policy_left_vn_fix)
self.detach_policy(new_policy_right_vn_fix)
self.delete_vn(new_left_vn_fix)
self.delete_vn(new_right_vn_fix)
self.verify_si(self.si_fixtures)
self.logger.info(
"Verify ICMP traffic with allow all after deleting the new left and right VNs.")
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
return True
def verify_add_new_vms(self):
# Launch new VMs in the existing left and right VNs
new_left_vm = 'new_left_bridge_vm'
new_right_vm = 'new_right_bridge_vm'
new_left_vm_fix = self.config_vm(self.vn1_fixture, new_left_vm)
new_right_vm_fix = self.config_vm(self.vn2_fixture, new_right_vm)
assert new_left_vm_fix.verify_on_setup()
assert new_right_vm_fix.verify_on_setup()
# Wait for VM's to come up
new_left_vm_fix.wait_till_vm_is_up()
new_right_vm_fix.wait_till_vm_is_up()
# Ping from left VM to right VM
errmsg = "Ping to right VM ip %s from left VM failed" % new_right_vm_fix.vm_ip
assert new_left_vm_fix.ping_with_certainty(
new_right_vm_fix.vm_ip), errmsg
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert new_left_vm_fix.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
errmsg = "Ping to right VM ip %s from left VM failed" % new_right_vm_fix.vm_ip
assert self.vm1_fixture.ping_with_certainty(
new_right_vm_fix.vm_ip), errmsg
# Install traffic package in VM
self.vm1_fixture.install_pkg("Traffic")
self.vm2_fixture.install_pkg("Traffic")
self.logger.debug("Send udp traffic; with policy rule allow all")
sport = 8000
dport = 9000
sent, recv = self.verify_traffic(self.vm1_fixture, self.vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
# Delete policy
self.detach_policy(self.vn1_policy_fix)
self.detach_policy(self.vn2_policy_fix)
self.unconfig_policy(self.policy_fixture)
# Add rule to policy to allow only udp traffic from left_vn to right_vn
# through SI
new_rule = {'direction': '<>',
'protocol': 'udp',
'source_network': self.vn1_name,
'src_ports': [8000, 8000],
'dest_network': self.vn2_name,
'dst_ports': [9000, 9000],
'simple_action': None,
'action_list': {'apply_service': self.action_list}
}
self.rules = [new_rule]
# Create new policy with a rule to allow traffic from the VNs
self.policy_fixture = self.config_policy(self.policy_name, self.rules)
self.vn1_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn1_fixture)
self.vn2_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn2_fixture)
self.verify_si(self.si_fixtures)
# Install traffic package in VM
new_left_vm_fix.install_pkg("Traffic")
new_right_vm_fix.install_pkg("Traffic")
self.logger.debug("Send udp traffic; with policy rule %s", new_rule)
sport = 8000
dport = 9000
sent, recv = self.verify_traffic(self.vm1_fixture, self.vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
sent, recv = self.verify_traffic(self.vm1_fixture, new_right_vm_fix,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
sent, recv = self.verify_traffic(new_left_vm_fix, new_right_vm_fix,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
sent, recv = self.verify_traffic(new_left_vm_fix, self.vm2_fixture,
'udp', sport=sport, dport=dport)
errmsg = "UDP traffic with src port %s and dst port %s failed" % (
sport, dport)
assert sent and recv == sent, errmsg
# Ping from left VM to right VM
errmsg = "Ping to right VM ip %s from left VM passed; Expected to fail" % new_right_vm_fix.vm_ip
assert new_left_vm_fix.ping_with_certainty(
new_right_vm_fix.vm_ip, expectation=False), errmsg
errmsg = "Ping to right VM ip %s from left VM passed; Expected to fail" % self.vm2_fixture.vm_ip
assert new_left_vm_fix.ping_with_certainty(
self.vm2_fixture.vm_ip, expectation=False), errmsg
errmsg = "Ping to right VM ip %s from left VM passed; Expected to fail" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip, expectation=False), errmsg
errmsg = "Ping to right VM ip %s from left VM passed; Expected to fail" % new_right_vm_fix.vm_ip
assert self.vm1_fixture.ping_with_certainty(
new_right_vm_fix.vm_ip, expectation=False), errmsg
return True
def verify_firewall_with_mirroring(
self, si_count=1, svc_scaling=False, max_inst=1,
firewall_svc_mode='in-network', mirror_svc_mode='transparent', flavor='m1.medium'):
"""Validate the service chaining in network datapath"""
if getattr(self, 'res', None):
self.vn1_fq_name = "default-domain:admin:" + self.res.vn1_name
self.vn1_name = self.res.vn1_name
self.vn1_subnets = self.res.vn1_subnets
self.vm1_name = self.res.vn1_vm1_name
self.vn2_fq_name = "default-domain:admin:" + self.res.vn2_name
self.vn2_name = self.res.vn2_name
self.vn2_subnets = self.res.vn2_subnets
self.vm2_name = self.res.vn2_vm2_name
else:
self.vn1_fq_name = "default-domain:admin:vn1"
self.vn1_name = "vn1"
self.vn1_subnets = ['10.1.1.0/24']
self.vm1_name = 'vm1'
self.vn2_fq_name = "default-domain:admin:vn2"
self.vn2_name = "vn2"
self.vn2_subnets = ['20.2.2.0/24']
self.vm2_name = 'vm2'
self.action_list = []
self.firewall_st_name = 'svc_firewall_template_1'
firewall_si_prefix = 'svc_firewall_instance_'
self.mirror_st_name = 'svc_mirror_template_1'
mirror_si_prefix = 'svc_mirror_instance_'
self.policy_name = 'policy_in_network'
if getattr(self, 'res', None):
self.vn1_fixture = self.res.get_vn1_fixture()
self.vn2_fixture = self.res.get_vn2_fixture()
assert self.vn1_fixture.verify_on_setup()
assert self.vn2_fixture.verify_on_setup()
else:
self.vn1_fixture = self.config_vn(self.vn1_name, self.vn1_subnets)
self.vn2_fixture = self.config_vn(self.vn2_name, self.vn2_subnets)
if firewall_svc_mode == 'transparent':
self.if_list = []
self.st_fixture, self.firewall_si_fixtures = self.config_st_si(
self.firewall_st_name,
firewall_si_prefix, si_count,
svc_scaling, max_inst,
left_vn=None, right_vn=None,
svc_mode=firewall_svc_mode, flavor=flavor)
if firewall_svc_mode == 'in-network':
self.if_list = [['management', False],
['left', True], ['right', True]]
self.st_fixture, self.firewall_si_fixtures = self.config_st_si(
self.firewall_st_name,
firewall_si_prefix, si_count,
svc_scaling, max_inst,
left_vn=self.vn1_fq_name,
right_vn=self.vn2_fq_name,
svc_mode=firewall_svc_mode, flavor=flavor)
self.action_list = self.chain_si(si_count, firewall_si_prefix)
self.st_fixture, self.mirror_si_fixtures = self.config_st_si(
self.mirror_st_name,
mirror_si_prefix, si_count,
left_vn=self.vn1_fq_name,
svc_type='analyzer',
svc_mode=mirror_svc_mode, flavor=flavor)
self.action_list += (self.chain_si(si_count, mirror_si_prefix))
self.rules = [
{
'direction': '<>',
'protocol': 'any',
'source_network': self.vn1_name,
'src_ports': [0, -1],
'dest_network': self.vn2_name,
'dst_ports': [0, -1],
'simple_action': 'pass',
'action_list': {'simple_action': 'pass',
'mirror_to': {'analyzer_name': self.action_list[1]},
'apply_service': self.action_list[:1]}
},
]
self.policy_fixture = self.config_policy(self.policy_name, self.rules)
self.vn1_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn1_fixture)
self.vn2_policy_fix = self.attach_policy_to_vn(
self.policy_fixture, self.vn2_fixture)
if getattr(self, 'res', None):
self.vm1_fixture = self.res.get_vn1_vm1_fixture()
self.vm2_fixture = self.res.get_vn2_vm2_fixture()
else:
self.vm1_fixture = self.config_vm(self.vn1_fixture, self.vm1_name)
self.vm2_fixture = self.config_vm(self.vn2_fixture, self.vm2_name)
assert self.vm1_fixture.verify_on_setup()
assert self.vm2_fixture.verify_on_setup()
self.vm1_fixture.wait_till_vm_is_up()
self.vm2_fixture.wait_till_vm_is_up()
result, msg = self.validate_vn(self.vn1_name)
assert result, msg
result, msg = self.validate_vn(self.vn2_name)
assert result, msg
self.verify_si(self.firewall_si_fixtures)
self.verify_si(self.mirror_si_fixtures)
for si_fix in self.firewall_si_fixtures:
svm_node_ip = si_fix.svm_compute_node_ip()
# Ping from left VM to right VM
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
# Verify ICMP mirror
sessions = self.tcpdump_on_all_analyzer(mirror_si_prefix, si_count)
errmsg = "Ping to right VM ip %s from left VM failed" % self.vm2_fixture.vm_ip
assert self.vm1_fixture.ping_with_certainty(
self.vm2_fixture.vm_ip), errmsg
for svm_name, (session, pcap) in sessions.items():
if self.vm1_fixture.vm_node_ip == self.vm2_fixture.vm_node_ip:
if firewall_svc_mode == 'transparent':
count = 20
else:
count = 10
if self.vm1_fixture.vm_node_ip != self.vm2_fixture.vm_node_ip:
if firewall_svc_mode == 'in-network' and self.vm1_fixture.vm_node_ip == svm_node_ip:
count = 10
else:
count = 20
self.verify_icmp_mirror(svm_name, session, pcap, count)
return True
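The policy rules built throughout these tests all share one dict shape. A small helper that constructs such a rule could look like this (a sketch only; `make_rule` is hypothetical and not part of the test framework):

```python
def make_rule(src_net, dst_net, protocol='any', src_ports=(0, -1),
              dst_ports=(0, -1), services=None):
    """Build a policy-rule dict in the shape used by the tests above.

    (0, -1) is the "any port" range used by the allow-all rules.
    """
    return {
        'direction': '<>',
        'protocol': protocol,
        'source_network': src_net,
        'src_ports': list(src_ports),
        'dest_network': dst_net,
        'dst_ports': list(dst_ports),
        'simple_action': None,
        'action_list': {'apply_service': services or []},
    }

# e.g. the udp-only rule used between the new left and right VNs
rule = make_rule('new_left_vn', 'new_right_vn', protocol='udp',
                 src_ports=(8000, 8000), dst_ports=(9000, 9000))
```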
# File: brainglobe_napari_io/plugins.py (dstansby/brainglobe-napari-io, BSD-3-Clause)
from brainglobe_napari_io.cellfinder.reader_xml import cellfinder_read_xml
from brainglobe_napari_io.cellfinder.reader_dir import cellfinder_read_dir
from brainglobe_napari_io.brainreg.reader_dir import brainreg_read_dir
from brainglobe_napari_io.cellfinder.writer_multiple_xml import (
cellfinder_write_multiple_xml,
)
from brainglobe_napari_io.cellfinder.writer_xml import cellfinder_write_xml
# File: google_or_tools/nonogram_pbn_light.py (Wikunia/hakank, MIT)
# webpbn.com Puzzle #803: You light up my life
# Copyright 2007 by Robert Kummerfeldt
#
rows = 45
row_rule_len = 4
row_rules = [
[0, 0, 0, 0],
[0, 0, 0, 1],
[0, 0, 0, 1],
[0, 0, 0, 3],
[0, 0, 2, 2],
[0, 0, 1, 1],
[0, 0, 0, 7],
[0, 0, 1, 1],
[0, 1, 3, 1],
[0, 1, 3, 1],
[0, 0, 1, 1],
[0, 0, 0, 11],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 2, 2],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 2, 2],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 2, 2],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 0, 1, 1],
[0, 1, 4, 1],
[1, 1, 1, 1],
[1, 1, 1, 1],
[1, 1, 1, 1],
[1, 1, 1, 1],
[0, 0, 0, 25],
[0, 0, 6, 5],
[0, 0, 5, 6],
[0, 0, 4, 5]
]
cols = 50
col_rule_len = 5
col_rules = [
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 2],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 2],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 2],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 5],
[0, 0, 0, 7, 1],
[0, 0, 0, 6, 1],
[0, 0, 0, 6, 1],
[0, 0, 1, 6, 1],
[0, 0, 0, 4, 1],
[0, 0, 0, 7, 1],
[0, 1, 1, 1, 1],
[2, 1, 2, 1, 1],
[3, 1, 2, 1, 1],
[2, 1, 2, 1, 1],
[0, 1, 1, 1, 1],
[0, 0, 0, 7, 6],
[0, 0, 4, 1, 1],
[0, 1, 6, 1, 1],
[0, 0, 0, 6, 6],
[0, 0, 0, 6, 1],
[0, 0, 0, 5, 1],
[0, 0, 0, 0, 7],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 2],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 2],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 2],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1],
[0, 0, 0, 0, 1]
]
| 18.916667 | 46 | 0.27117 | 460 | 2,043 | 1.191304 | 0.071739 | 0.605839 | 0.465328 | 0.270073 | 0.742701 | 0.698905 | 0.636861 | 0.569343 | 0.516423 | 0.516423 | 0 | 0.374579 | 0.418502 | 2,043 | 107 | 47 | 19.093458 | 0.0867 | 0.039158 | 0 | 0.728155 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# File: src/Converter/Tests/InputTestData/IdeaDictionaries.py (CSIRT-MU/IDEA-IDMEF-Converter, BSD-3-Clause)
def get_dict1():
return {
"Format": "IDEA0",
"ID": "2E4A3926-B1B9-41E3-89AE-B6B474EB0A54",
"AltNames": ["altname1", "altname2"],
"CorrelID": ["3E4A3926-B1B9-41E3-89AE-B6B474EB0A56", "4E4A3926-B1B9-41E3-89AE-B6B474EB0A57",
"5E4A3926-B1B9-41E3-89AE-B6B474EB0A58"],
"AggrID": ["6E4A3926-B1B9-41E3-89AE-B6B474EB0A57", "6E4A3926-B1B9-41E3-89AE-B6B474EB0A58",
"6E4A3926-B1B9-41E3-89AE-B6B474EB0A59"],
"PredID": ["6E4A7777-B1B9-41E3-89AE-B6B474EB0A57", "6E4A7777-B1B9-41E3-89AE-B6B474EB0A58",
"6E4A7777-B2B9-41E3-89AE-B6B474EB0A59"],
"RelID": ["6E4A7777-B1B9-41E3-89AE-B6B474EB0A22", "6E4A7777-B1B9-41E3-89AE-B6B474EB0A11",
"6E4A7777-B2B9-41E3-89AE-B6B474EB0A33"],
"CreateTime": "2017-03-23T10:10:42Z",
"DetectTime": "2017-03-23T09:15:50Z",
"EventTime": "2017-03-22T21:12:05Z",
"CeaseTime": "2017-02-22T21:11:05Z",
"WinStartTime": "2017-02-22T21:12:10Z",
"WinEndTime": "2017-02-22T21:14:38Z",
"ConnCount": 3,
"FlowCount": 4,
"PacketCount": 248,
"ByteCount": 720468,
"Category": ["Abusive.Spam", "Fraud.Phishing"],
"Ref": ["http://www.example.com", "http://www.other.example.com"],
"Confidence": 0.563,
"Description": "Test message for converter",
"Note": "No field of this message is real",
"Source":
[
{
"Type": ["OriginSpam", "Phishing"],
"Hostname": ["hostexample.com", "nameexample.com"],
"IP4": ["93.184.216.119"],
"IP6": ["2001:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["01:23:45:67:89:ab", "01:23:45:67:89:cd"],
"Email": ["example@example.com", "example2@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [245],
"URL": ["http://www.sourceexample.com", "http://www.source.otherexample.com"],
"Spoofed": False,
"AttachHand": ["abc1", "abc2"],
"Note": "First test source",
"Imprecise": True,
"Anonymised": False,
"ASN": [15, 231],
"Router": ["router1", "router2"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["sourceref.example.com", "sorceref.other.example.com"]
},
{
"Type": ["Phishing"],
"Hostname": ["hostexample2.com", "othernameexample.com"],
"IP4": ["93.184.216.119", "93.184.216.120"],
"IP6": ["2001:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["01:23:45:67:89:ab", "01:23:45:67:89:cd"],
"Email": ["example@example.com", "example2@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [210, 542],
"URL": ["http://www.sourceexample.com", "http://www.source.otherexample.com"],
"Spoofed": False,
"AttachHand": ["def", "def2"],
"Note": "Second test source",
"Imprecise": True,
"Anonymised": False,
"ASN": [15, 231],
"Router": ["router1", "router2"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["sourceref.example.com", "sorceref.other.example.com"]
}
],
"Target":
[
{
"Type": ["Open"],
"Hostname": ["targethostexample.com", "targetnameexample.com"],
"IP4": ["0.184.216.119"],
"IP6": ["0:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["00:00:45:67:89:ab", "00:23:45:67:89:ed"],
"Email": ["example00@example.com", "example000@example2.com"],
"Proto": ["tcp", "epmap"],
"Port": [80],
"URL": ["http://www.targetexample.com", "http://www.target.otherexample.com"],
"Spoofed": True,
"AttachHand": ["att1", "att2"],
"Note": "First test target",
"Imprecise": True,
"Anonymised": False,
"ASN": [15, 231],
"Router": ["router3", "router4"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["tar.example.com", "other.example.com"]
},
{
"Type": ["Open"],
"Hostname": ["example.com", "othertarget.com"],
"IP4": ["0.0.216.119"],
"IP6": ["0:0::2:1", "00:db8:a::123/64"],
"MAC": ["00:00:00:67:89:ab", "00:23:45:67:89:fd"],
"Email": ["example00@example.com", "example000@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [210, 542],
"URL": ["http://www.targetxmpl.com", "http://www.target.otherexample.com"],
"Spoofed": False,
"AttachHand": ["att3", "att4"],
"Note": "Second test target",
"Imprecise": True,
"Anonymised": True,
"ASN": [15, 231],
"Router": ["router5", "router6"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["target.example.com", "ref.other.example.com"]
}
],
"Attach":
[
{
"Handle": "att1",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b633",
"md5:dc89f0b4ff9bd3b061dd66bb66c991b1"],
"Size": 200,
"Ref": ["example.com", "other.example.com"],
"Note": "First test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
},
{
"Handle": "att2",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b604",
"md5:dc89f0b4ff9bd3b061dd66bb66c99100"],
"Size": 300,
"Ref": ["example.com", "other.example.com"],
"Note": "First test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
},
{
"Handle": "att3",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b601",
"md5:dc89f0b4ff9bd3b061dd66bb66c99102"],
"Size": 400,
"Ref": ["example.com", "other.example.com"],
"Note": "Second test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
}
],
"Node":
[
{
"Name": "com.example.specific",
"Type": ["Log", "Statistical"],
"SW": ["Example software", "Example software 2"],
"AggrWin": "536D10:20:30.5",
"Note": "First test node"
},
{
"Name": "com.example.node",
"Type": ["Log", "Statistical"],
"SW": ["Example software"],
"AggrWin": "425D10:20:35.1",
"Note": "Second test node"
}
]
}
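The fixture above carries `Z`-suffixed timestamps (`CreateTime`, `DetectTime`, ...); a converter can parse them with the standard library. A sketch, assuming all timestamps use whole seconds as in the data above:

```python
from datetime import datetime

def parse_idea_time(value):
    """Parse the 'YYYY-MM-DDTHH:MM:SSZ' timestamps used in these fixtures."""
    return datetime.strptime(value, "%Y-%m-%dT%H:%M:%SZ")

t = parse_idea_time("2017-03-23T10:10:42Z")
print(t.isoformat())  # -> 2017-03-23T10:10:42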
def get_expected_dict2():
return {
"Format": "IDEA0",
"ID": "1E4A3926-B1B9-41E3-89AE-B6B474EB0A54",
"AltNames": ["altname8", "altname9"],
"CorrelID": ["3E4A3925-B1B9-41E3-89AE-B6B474EB0A56", "4E4A3926-B1B9-41E3-89AE-B6B474EBBA57",
"5E4A3926-B1B9-41E3-89AE-B6B474EB0A41"],
"AggrID": ["6E4A3926-B1B9-41E3-89AE-B6B474EB0A57", "6E4A3926-B1B9-41E3-89AE-B6B474EB0A58",
"6E4A3926-B1B9-41E3-89AE-B6B474EB0A65"],
"PredID": ["6E4A7777-B1B9-41E3-89AE-B6B474EB0A57", "6E4A7777-B1B9-41E3-89AE-B6B474EB0A58",
"6E4A7777-B2B9-41E3-89AE-B6B474EB0A59"],
"RelID": ["6E4A7777-B1B9-41E3-89AE-B6B474EB0A22", "6E4A7777-B1B9-41E3-89AE-B6B474EB0A11",
"6E4A7777-B2B9-41E3-89AE-B6B474EB0A33"],
"CreateTime": "2016-03-23T10:10:42Z",
"DetectTime": "2015-03-23T09:15:50Z",
"EventTime": "2014-03-22T21:12:05Z",
"CeaseTime": "2013-02-22T21:11:05Z",
"WinStartTime": "2012-02-22T21:12:10Z",
"WinEndTime": "2011-02-22T21:14:38Z",
"ConnCount": 5,
"FlowCount": 9,
"PacketCount": 248,
"ByteCount": 220468,
"Category": ["Abusive.Spam", "Fraud.Phishing"],
"Ref": ["http://www.example.com", "http://www.other.example.com"],
"Confidence": 0.563,
"Description": "Test message for converter",
"Note": "No field of this message is real",
"Source":
[
{
"Type": ["OriginSpam", "Phishing"],
"Hostname": ["hostexample.com", "nameexample.com"],
"IP4": ["112.184.216.119"],
"IP6": ["2001:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["01:23:45:67:89:ab", "01:23:45:67:89:cd"],
"Email": ["example@example.com", "example2@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [245],
"URL": ["http://www.sourceexample.com", "http://www.source.otherexample.com"],
"Spoofed": True,
"AttachHand": ["abc1", "abc2"],
"Note": "First test source",
"Imprecise": True,
"Anonymised": False,
"ASN": [15, 231],
"Router": ["router1", "router2"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["sourceref.example.com", "sorceref.other.example.com"]
},
{
"Type": ["Phishing"],
"Hostname": ["hostexample2.com", "othernameexample.com"],
"IP4": ["93.184.216.119", "93.184.216.120"],
"IP6": ["2001:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["01:23:45:67:89:ab", "01:23:45:67:89:cd"],
"Email": ["example@example.com", "example2@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [210, 542],
"URL": ["http://www.sourceexample.com", "http://www.source.otherexample.com"],
"Spoofed": False,
"AttachHand": ["def", "def2"],
"Note": "Second test source",
"Imprecise": True,
"Anonymised": False,
"ASN": [15, 231],
"Router": ["router1", "router2"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["sourceref.example.com", "sorceref.other.example.com"]
}
],
"Target":
[
{
"Type": ["Open"],
"Hostname": ["targethostexample.com", "targetnameexample.com"],
"IP4": ["0.184.216.119"],
"IP6": ["0:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["00:00:45:67:89:ab", "00:23:45:67:89:ed"],
"Email": ["example00@example.com", "example000@example2.com"],
"Proto": ["tcp", "epmap"],
"Port": [80],
"URL": ["http://www.targetexample.com", "http://www.target.otherexample.com"],
"Spoofed": True,
"AttachHand": ["att1", "att2"],
"Note": "First test target",
"Imprecise": True,
"Anonymised": False,
"ASN": [35, 231],
"Router": ["router3", "router4"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["tar.example.com", "other.example.com"]
},
{
"Type": ["Open"],
"Hostname": ["example.com", "othertarget.com"],
"IP4": ["0.0.216.119"],
"IP6": ["0:0::2:1", "00:db8:a::123/64"],
"MAC": ["00:00:00:67:89:ab", "00:23:45:67:89:fd"],
"Email": ["example00@example.com", "example000@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [210, 542],
"URL": ["http://www.targetxmpl.com", "http://www.target.otherexample.com"],
"Spoofed": False,
"AttachHand": ["att3", "att4"],
"Note": "Second test target",
"Imprecise": True,
"Anonymised": True,
"ASN": [15, 231],
"Router": ["router5", "router6"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["target.example.com", "ref.other.example.com"]
}
],
"Attach":
[
{
"Handle": "att1",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b633",
"md5:dc89f0b4ff9bd3b061dd66bb66c991b1"],
"Size": 200,
"Ref": ["other.example.com"],
"Note": "First test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
},
{
"Handle": "att2",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b604",
"md5:dc89f0b4ff9bd3b061dd66bb66c99100"],
"Size": 300,
"Ref": ["example.com", "other.example.com"],
"Note": "First test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
},
{
"Handle": "att3",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b601",
"md5:dc89f0b4ff9bd3b061dd66bb66c99102"],
"Size": 400,
"Ref": ["example.com", "other.example.com"],
"Note": "Second test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
}
],
"Node":
[
{
"Name": "com.example.specific",
"Type": ["Log", "Statistical"],
"SW": ["Example software", "Example software 2"],
"AggrWin": "536D10:20:30.5",
"Note": "First test node"
},
{
"Name": "com.example.node",
"Type": ["Log", "Statistical"],
"SW": ["Example software"],
"AggrWin": "425D10:20:35.2",
"Note": "Second test node"
}
]
}
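The two fixtures differ only in a handful of top-level fields; a small helper that lists the keys whose values differ can make test failures easier to read (hypothetical, not part of the converter):

```python
def changed_keys(a, b):
    """Top-level keys present in both dicts but mapped to different values."""
    return sorted(k for k in a.keys() & b.keys() if a[k] != b[k])

print(changed_keys({"ID": "x", "ConnCount": 3, "Format": "IDEA0"},
                   {"ID": "y", "ConnCount": 5, "Format": "IDEA0"}))
# -> ['ConnCount', 'ID']
```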
def get_test_string():
return """{
"Format": "IDEA0",
"ID": "2E4A3926-B1B9-41E3-89AE-B6B474EB0A54",
"AltNames": ["altname1", "altname2"],
"CorrelID": ["3E4A3926-B1B9-41E3-89AE-B6B474EB0A56", "4E4A3926-B1B9-41E3-89AE-B6B474EB0A57",
"5E4A3926-B1B9-41E3-89AE-B6B474EB0A58"],
"AggrID": ["6E4A3926-B1B9-41E3-89AE-B6B474EB0A57", "6E4A3926-B1B9-41E3-89AE-B6B474EB0A58",
"6E4A3926-B1B9-41E3-89AE-B6B474EB0A59"],
"PredID": ["6E4A7777-B1B9-41E3-89AE-B6B474EB0A57", "6E4A7777-B1B9-41E3-89AE-B6B474EB0A58",
"6E4A7777-B2B9-41E3-89AE-B6B474EB0A59"],
"RelID": ["6E4A7777-B1B9-41E3-89AE-B6B474EB0A22", "6E4A7777-B1B9-41E3-89AE-B6B474EB0A11",
"6E4A7777-B2B9-41E3-89AE-B6B474EB0A33"],
"CreateTime": "2017-03-23T10:10:42Z",
"DetectTime": "2017-03-23T09:15:50Z",
"EventTime": "2017-03-22T21:12:05Z",
"CeaseTime": "2017-02-22T21:11:05Z",
"WinStartTime": "2017-02-22T21:12:10Z",
"WinEndTime": "2017-02-22T21:14:38Z",
"ConnCount": 3,
"FlowCount": 4,
"PacketCount": 248,
"ByteCount": 720468,
"Category": ["Abusive.Spam", "Fraud.Phishing"],
"Ref": ["http://www.example.com", "http://www.other.example.com"],
"Confidence": 0.563,
"Description": "Test message for converter",
"Note": "No field of this message is real",
"Source":
[
{
"Type": ["OriginSpam", "Phishing"],
"Hostname": ["hostexample.com", "nameexample.com"],
"IP4": ["93.184.216.119"],
"IP6": ["2001:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["01:23:45:67:89:ab", "01:23:45:67:89:cd"],
"Email": ["example@example.com", "example2@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [245],
"URL": ["http://www.sourceexample.com", "http://www.source.otherexample.com"],
"Spoofed": false,
"AttachHand": ["abc1", "abc2"],
"Note": "First test source",
"Imprecise": true,
"Anonymised": false,
"ASN": [15, 231],
"Router": ["router1", "router2"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["sourceref.example.com", "sorceref.other.example.com"]
},
{
"Type": ["Phishing"],
"Hostname": ["hostexample2.com", "othernameexample.com"],
"IP4": ["93.184.216.119", "93.184.216.120"],
"IP6": ["2001:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["01:23:45:67:89:ab", "01:23:45:67:89:cd"],
"Email": ["example@example.com", "example2@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [210, 542],
"URL": ["http://www.sourceexample.com", "http://www.source.otherexample.com"],
"Spoofed": false,
"AttachHand": ["def", "def2"],
"Note": "Second test source",
"Imprecise": true,
"Anonymised": false,
"ASN": [15, 231],
"Router": ["router1", "router2"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["sourceref.example.com", "sorceref.other.example.com"]
}
],
"Target":
[
{
"Type": ["Open"],
"Hostname": ["targethostexample.com", "targetnameexample.com"],
"IP4": ["0.184.216.119"],
"IP6": ["0:db8::2:1", "2001:db8:a::123/64"],
"MAC": ["00:00:45:67:89:ab", "00:23:45:67:89:ed"],
"Email": ["example00@example.com", "example000@example2.com"],
"Proto": ["tcp", "epmap"],
"Port": [80],
"URL": ["http://www.targetexample.com", "http://www.target.otherexample.com"],
"Spoofed": true,
"AttachHand": ["att1", "att2"],
"Note": "First test target",
"Imprecise": true,
"Anonymised": false,
"ASN": [15, 231],
"Router": ["router3", "router4"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["tar.example.com", "other.example.com"]
},
{
"Type": ["Open"],
"Hostname": ["example.com", "othertarget.com"],
"IP4": ["0.0.216.119"],
"IP6": ["0:0::2:1", "00:db8:a::123/64"],
"MAC": ["00:00:00:67:89:ab", "00:23:45:67:89:fd"],
"Email": ["example00@example.com", "example000@example2.com"],
"Proto": ["tcp2", "epmap2"],
"Port": [210, 542],
"URL": ["http://www.targetxmpl.com", "http://www.target.otherexample.com"],
"Spoofed": false,
"AttachHand": ["att3", "att4"],
"Note": "Second test target",
"Imprecise": true,
"Anonymised": true,
"ASN": [15, 231],
"Router": ["router5", "router6"],
"Netname": ["example:EXAMPLE", "example2:EXAMPLE2"],
"Ref": ["target.example.com", "ref.other.example.com"]
}
],
"Attach":
[
{
"Handle": "att1",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b633",
"md5:dc89f0b4ff9bd3b061dd66bb66c991b1"],
"Size": 200,
"Ref": ["example.com", "other.example.com"],
"Note": "First test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
},
{
"Handle": "att2",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b604",
"md5:dc89f0b4ff9bd3b061dd66bb66c99100"],
"Size": 300,
"Ref": ["example.com", "other.example.com"],
"Note": "First test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
},
{
"Handle": "att3",
"FileName": ["file1.txt", "file2"],
"Type": ["Malware"],
"Hash": ["sha1:794467071687f7c59d033f4de5ece6b46415b601",
"md5:dc89f0b4ff9bd3b061dd66bb66c99102"],
"Size": 400,
"Ref": ["example.com", "other.example.com"],
"Note": "Second test attach",
"ContentType": ["mediatype"],
"ContentCharset": "UTF-8",
"ContentEncoding": "base64",
"Content": "This is content of the attach",
"ContentID": ["1", "2"],
"ExternalURI": ["external.example.com"]
}
],
"Node":
[
{
"Name": "com.example.specific",
"Type": ["Log", "Statistical"],
"SW": ["Example software", "Example software 2"],
"AggrWin": "536D10:20:30.5",
"Note": "First test node"
},
{
"Name": "com.example.node",
"Type": ["Log", "Statistical"],
"SW": ["Example software"],
"AggrWin": "425D10:20:35.1",
"Note": "Second test node"
}
]
}"""
def get_expected_dict3():
return {
"Format": "IDEA0",
"ID": "123d45",
"CorrelID": ["28421", "87955"],
"CreateTime": "0xbc723b45.0xef449129",
"DetectTime": "2017-03-04T15:10:52.86445Z",
"Category": ["Phishing"],
"Ref": ["ref.example.com"],
"Confidence": 0.54,
"Source":
[
{
"Hostname": ["Second name of the equipment"],
"IP4": ["100.184.216.130"],
"IP6": ["2001:db8:a::123/64"],
"Proto": ["Additional info of protocol"],
"Port": [154],
"Spoofed": "yes",
"Router": ["interfaceExample"],
},
{
"Hostname": ["Third name of the equipment"],
"IP4": ["78.184.216.119"],
"Email": ["example@example.com"],
"Proto": ["Additional info of protocol"],
"Port": [158],
"Router": ["interfaceExample2"],
}
],
"Target":
[
{
"Hostname": ["Fourth name of the equipment"],
"IP4": ["98.184.216.119", "100.89.114.119"],
"Proto": ["Additional info of protocol"],
"Port": [80],
"Spoofed": "no",
"Router": ["interfaceExample3"],
},
{
"Hostname": ["Fifth name of the equipment"],
"IP4": ["100.184.216.119"],
"MAC": ["00:00:45:67:89:ab"],
"Proto": ["Additional info of protocol"],
"Port": [90, 91, 92, 93, 94],
"Spoofed": "yes",
"Router": ["interfaceExample4"],
}
],
"Node":
[
{
"Name": "DetectorExample",
"SW": ["detector great 2.0 highest Linux 4.4.0-31-generic"],
}
]
}
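The helpers above return IDEA0 message dicts used as converter fixtures. As a minimal illustration (the helper name and the choice of "required" keys are assumptions for this sketch, not the full IDEA0 schema or part of the converter under test), the basic shape of such a dict can be sanity-checked like this:

```python
def check_idea_shape(msg):
    # Hypothetical sanity check: these are keys the fixtures above always
    # populate; this is not the complete IDEA0 schema.
    required = ("Format", "ID", "DetectTime", "Category")
    return all(key in msg for key in required) and msg["Format"] == "IDEA0"

sample = {
    "Format": "IDEA0",
    "ID": "123d45",
    "DetectTime": "2017-03-04T15:10:52.86445Z",
    "Category": ["Phishing"],
}
print(check_idea_shape(sample))  # True
```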
# File: compressed_communication/aggregators/quantize_encode_test.py (repo: lamylio/federated, license: Apache-2.0)
# Copyright 2022, Google LLC.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import collections
from absl.testing import parameterized
import tensorflow as tf
import tensorflow_federated as tff
from compressed_communication.aggregators import quantize_encode
_test_integer_tensor_type = (tf.int32, (3,))
_test_float_struct_type = [(tf.float32, (2,)), (tf.float32, (3,))]
_test_float_tensor_type = (tf.float32, (3,))
_test_float_tensor_quantized_type = (tf.int32, (3,))
class QuantizeEncodeComputationTest(tff.test.TestCase, parameterized.TestCase):
@parameterized.named_parameters(
("float_tensor", _test_float_tensor_type,
_test_float_tensor_quantized_type))
def test_uniform_quantize_encode_properties(self, value_type, quantize_type):
factory = quantize_encode.QuantizeEncodeFactory(1.0)
value_type = tff.to_type(value_type)
quantize_type = tff.to_type(quantize_type)
process = factory.create(value_type)
self.assertIsInstance(process, tff.templates.AggregationProcess)
server_state_type = tff.type_at_server(collections.OrderedDict(
round_num=tf.float32,
step_size=tf.float32,
inner_state=()))
expected_initialize_type = tff.FunctionType(
parameter=None, result=server_state_type)
self.assert_types_equivalent(process.initialize.type_signature,
expected_initialize_type)
expected_measurements_type = tff.type_at_server(
collections.OrderedDict(
avg_bitrate=tf.float64,
avg_distortion=tf.float32,
avg_sparsity=tf.float32,
step_size=tf.float32))
expected_next_type = tff.FunctionType(
parameter=collections.OrderedDict(
state=server_state_type, value=tff.type_at_clients(value_type)),
result=tff.templates.MeasuredProcessOutput(
state=server_state_type,
result=tff.type_at_server(value_type),
measurements=expected_measurements_type))
self.assert_types_equivalent(process.next.type_signature,
expected_next_type)
@parameterized.named_parameters(
("float_tensor", _test_float_tensor_type,
_test_float_tensor_quantized_type))
def test_stochastic_quantize_encode_properties(self, value_type,
quantize_type):
factory = quantize_encode.QuantizeEncodeFactory(
1.0, rounding_type="stochastic")
value_type = tff.to_type(value_type)
quantize_type = tff.to_type(quantize_type)
process = factory.create(value_type)
self.assertIsInstance(process, tff.templates.AggregationProcess)
server_state_type = tff.type_at_server(collections.OrderedDict(
round_num=tf.float32,
step_size=tf.float32,
inner_state=()))
expected_initialize_type = tff.FunctionType(
parameter=None, result=server_state_type)
self.assert_types_equivalent(process.initialize.type_signature,
expected_initialize_type)
expected_measurements_type = tff.type_at_server(
collections.OrderedDict(
avg_bitrate=tf.float64,
avg_distortion=tf.float32,
avg_sparsity=tf.float32,
step_size=tf.float32))
expected_next_type = tff.FunctionType(
parameter=collections.OrderedDict(
state=server_state_type, value=tff.type_at_clients(value_type)),
result=tff.templates.MeasuredProcessOutput(
state=server_state_type,
result=tff.type_at_server(value_type),
measurements=expected_measurements_type))
self.assert_types_equivalent(process.next.type_signature,
expected_next_type)
@parameterized.named_parameters(
("float_tensor", _test_float_tensor_type,
_test_float_tensor_quantized_type))
def test_dithered_quantize_encode_properties(self, value_type, quantize_type):
factory = quantize_encode.QuantizeEncodeFactory(
1.0, rounding_type="dithered")
value_type = tff.to_type(value_type)
quantize_type = tff.to_type(quantize_type)
process = factory.create(value_type)
self.assertIsInstance(process, tff.templates.AggregationProcess)
server_state_type = tff.type_at_server(collections.OrderedDict(
round_num=tf.float32,
step_size=tf.float32,
inner_state=()))
expected_initialize_type = tff.FunctionType(
parameter=None, result=server_state_type)
self.assert_types_equivalent(process.initialize.type_signature,
expected_initialize_type)
expected_measurements_type = tff.type_at_server(
collections.OrderedDict(
avg_bitrate=tf.float64,
avg_distortion=tf.float32,
avg_sparsity=tf.float32,
step_size=tf.float32))
expected_next_type = tff.FunctionType(
parameter=collections.OrderedDict(
state=server_state_type, value=tff.type_at_clients(value_type)),
result=tff.templates.MeasuredProcessOutput(
state=server_state_type,
result=tff.type_at_server(value_type),
measurements=expected_measurements_type))
self.assert_types_equivalent(process.next.type_signature,
expected_next_type)
@parameterized.named_parameters(
("integer_tensor", _test_integer_tensor_type),
("float_struct", _test_float_struct_type))
def test_quantize_encode_create_raises(self, value_type):
factory = quantize_encode.QuantizeEncodeFactory(1.0)
value_type = tff.to_type(value_type)
self.assertRaises(ValueError, factory.create, value_type)
class QuantizeEncodeExecutionTest(tff.test.TestCase, parameterized.TestCase):
@parameterized.named_parameters(
("float_tensor", _test_float_tensor_type,
_test_float_tensor_quantized_type))
def test_uniform_quantize_encode_impl(self, value_type, quantize_type):
factory = quantize_encode.QuantizeEncodeFactory(1.0)
value_type = tff.to_type(value_type)
quantize_type = tff.to_type(quantize_type)
process = factory.create(value_type)
state = process.initialize()
client_values = [tf.ones(value_type.shape, value_type.dtype)
for _ in range(2)]
expected_result = tf.ones(value_type.shape, value_type.dtype) * 2
bitstring_length = tf.math.ceil(
tf.size(expected_result, out_type=tf.float32) * 3.0 / 8.0) * 8.0
expected_avg_bitrate = bitstring_length / tf.size(expected_result,
out_type=tf.float32)
expected_measurements = collections.OrderedDict(
avg_bitrate=tf.cast(expected_avg_bitrate, tf.float64),
avg_distortion=0.0,
avg_sparsity=0.0,
step_size=1.0)
output = process.next(state, client_values)
self.assertAllClose(output.result, expected_result)
self.assertAllClose(output.measurements, expected_measurements)
@parameterized.named_parameters(
("float_tensor", _test_float_tensor_type,
_test_float_tensor_quantized_type))
def test_stochastic_quantize_encode_impl(self, value_type, quantize_type):
factory = quantize_encode.QuantizeEncodeFactory(
1.0, rounding_type="stochastic")
value_type = tff.to_type(value_type)
quantize_type = tff.to_type(quantize_type)
process = factory.create(value_type)
state = process.initialize()
client_values = [tf.ones(value_type.shape, value_type.dtype)
for _ in range(2)]
expected_result = tf.ones(value_type.shape, value_type.dtype) * 2
bitstring_length = tf.math.ceil(
tf.size(expected_result, out_type=tf.float32) * 3.0 / 8.0) * 8.0
expected_avg_bitrate = bitstring_length / tf.size(expected_result,
out_type=tf.float32)
expected_measurements = collections.OrderedDict(
avg_bitrate=tf.cast(expected_avg_bitrate, tf.float64),
avg_distortion=0.0,
avg_sparsity=0.0,
step_size=1.0)
output = process.next(state, client_values)
self.assertAllClose(output.result, expected_result)
self.assertAllClose(output.measurements, expected_measurements)
@parameterized.named_parameters(
("float_tensor", _test_float_tensor_type,
_test_float_tensor_quantized_type))
def test_dithered_quantize_encode_impl(self, value_type, quantize_type):
factory = quantize_encode.QuantizeEncodeFactory(
1.0, rounding_type="dithered")
value_type = tff.to_type(value_type)
quantize_type = tff.to_type(quantize_type)
process = factory.create(value_type)
state = process.initialize()
client_values = [tf.ones(value_type.shape, value_type.dtype)
for _ in range(2)]
deterministic_result = tf.ones(value_type.shape, value_type.dtype) * 2
max_difference = 0.5 * 2
result = process.next(state, client_values).result
self.assertLessEqual(tf.reduce_max(abs(result - deterministic_result)),
max_difference)
if __name__ == "__main__":
tff.test.main()
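The expected average bitrate asserted in the tests above comes from padding a 3-bit-per-element bitstring up to a whole number of bytes, then amortising over the elements. A standalone restatement of that arithmetic in plain Python (the function name is ours, independent of the library under test):

```python
import math

def padded_avg_bitrate(num_elements, bits_per_element=3.0):
    # Mirrors the test arithmetic: ceil(n * b / 8) * 8 bits of bitstring,
    # divided by the n encoded elements.
    bitstring_length = math.ceil(num_elements * bits_per_element / 8.0) * 8.0
    return bitstring_length / num_elements

# For the (3,) float tensors used in the tests: ceil(9 / 8) * 8 = 16 bits,
# amortised over 3 elements.
print(padded_avg_bitrate(3))
```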
| 41.675325 | 80 | 0.709878 | 1,141 | 9,627 | 5.644172 | 0.145486 | 0.064286 | 0.044721 | 0.026242 | 0.841615 | 0.835248 | 0.827484 | 0.827484 | 0.827484 | 0.827484 | 0 | 0.015427 | 0.205464 | 9,627 | 230 | 81 | 41.856522 | 0.826513 | 0.057131 | 0 | 0.843243 | 0 | 0 | 0.015668 | 0 | 0 | 0 | 0 | 0 | 0.081081 | 1 | 0.037838 | false | 0 | 0.027027 | 0 | 0.075676 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
33d177c31f5c0f581e23acb5276cfc048d473efb | 14,960 | py | Python | moves/new_hiphop.py | helemanc/PartyNAO | 374128077d2c1e486b1041cce335664ab18eec66 | [
"Apache-2.0"
] | 2 | 2022-01-27T18:35:27.000Z | 2022-01-28T11:40:23.000Z | moves/new_hiphop.py | helemanc/PartyNAO | 374128077d2c1e486b1041cce335664ab18eec66 | [
"Apache-2.0"
] | null | null | null | moves/new_hiphop.py | helemanc/PartyNAO | 374128077d2c1e486b1041cce335664ab18eec66 | [
"Apache-2.0"
] | null | null | null | # Choregraphe bezier export in Python.
from naoqi import ALProxy
names = list()
times = list()
keys = list()
names.append("HeadPitch")
times.append([1.16, 1.72, 2.24, 2.68, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.00178573, [3, -0.386667, 0], [3, 0.186667, 0]], [0.514872, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.258309, [3, -0.173333, 0], [3, 0.146667, 0]], [0.0122173, [3, -0.146667, 0], [3, 0.133333, 0]], [-0.251327, [3, -0.133333, 0], [3, 0.16, 0]], [0.00698132, [3, -0.16, 0], [3, 0.146667, 0]], [0.00698132, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00698132, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.00420227, [3, -0.146667, 0.0111836], [3, 0.146667, -0.0111836]], [-0.165798, [3, -0.146667, 0], [3, 0, 0]]])
names.append("HeadYaw")
times.append([1.16, 1.72, 2.24, 2.68, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.00234838, [3, -0.386667, 0], [3, 0.186667, 0]], [0.00234838, [3, -0.186667, 0], [3, 0.173333, 0]], [-1.53589, [3, -0.173333, 0], [3, 0.146667, 0]], [1.213, [3, -0.146667, 0], [3, 0.133333, 0]], [-1.53589, [3, -0.133333, 0], [3, 0.16, 0]], [1.213, [3, -0.16, 0], [3, 0.146667, 0]], [0.371755, [3, -0.146667, 0], [3, 0.146667, 0]], [0.371755, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00379466, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00379466, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LAnklePitch")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.35, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.35, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.35, [3, -0.173333, 0], [3, 0.28, 0]], [-0.35, [3, -0.28, 0], [3, 0.16, 0]], [-0.35, [3, -0.16, 0], [3, 0.146667, 0]], [-0.35, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.35, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.00852256, [3, -0.146667, -0.0733333], [3, 0.146667, 0.0733333]], [0.09, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LAnkleRoll")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.0628319, [3, -0.386667, 0], [3, 0.186667, 0]], [0.0628319, [3, -0.186667, 0], [3, 0.173333, 0]], [0.0530144, [3, -0.173333, 0], [3, 0.28, 0]], [0.0531246, [3, -0.28, 0], [3, 0.16, 0]], [0.0531246, [3, -0.16, 0], [3, 0.146667, 0]], [0.0531246, [3, -0.146667, 0], [3, 0.146667, 0]], [0.0531246, [3, -0.146667, 0], [3, 0.146667, 0]], [0, [3, -0.146667, 0.0305208], [3, 0.146667, -0.0305208]], [-0.13, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LElbowRoll")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.745334, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.745334, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.729467, [3, -0.173333, -0.00338802], [3, 0.28, 0.00547296]], [-0.718751, [3, -0.28, 0], [3, 0.16, 0]], [-0.718751, [3, -0.16, 0], [3, 0.146667, 0]], [-0.717819, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.717819, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.0435031, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.417341, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LElbowYaw")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.105107, [3, -0.386667, 0], [3, 0.186667, 0]], [0.105107, [3, -0.186667, 0], [3, 0.173333, 0]], [0.10396, [3, -0.173333, 0], [3, 0.28, 0]], [0.104343, [3, -0.28, 0], [3, 0.16, 0]], [0.104343, [3, -0.16, 0], [3, 0.146667, 0]], [0.115137, [3, -0.146667, 0], [3, 0.146667, 0]], [0.115137, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.000932922, [3, -0.146667, 0.11607], [3, 0.146667, -0.11607]], [-1.19594, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LHand")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.256499, [3, -0.386667, 0], [3, 0.186667, 0]], [0.256499, [3, -0.186667, 0], [3, 0.173333, 0]], [0.256499, [3, -0.173333, 0], [3, 0.28, 0]], [0.256499, [3, -0.28, 0], [3, 0.16, 0]], [0.256499, [3, -0.16, 0], [3, 0.146667, 0]], [0.256499, [3, -0.146667, 0], [3, 0.146667, 0]], [0.256499, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00608754, [3, -0.146667, 0], [3, 0.146667, 0]], [0.290359, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LHipPitch")
times.append([1.16, 1.72, 2.24, 2.68, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.424115, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.424115, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.424406, [3, -0.173333, 0], [3, 0.146667, 0]], [0.390954, [3, -0.146667, 0], [3, 0.133333, 0]], [-0.425031, [3, -0.133333, 0], [3, 0.16, 0]], [0.390954, [3, -0.16, 0], [3, 0.146667, 0]], [0.390954, [3, -0.146667, 0], [3, 0.146667, 0]], [0.390954, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.000718725, [3, -0.146667, 0], [3, 0.146667, 0]], [0.13, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LHipRoll")
times.append([1.16, 1.72, 2.24, 2.68, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.0575959, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.0575959, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.055121, [3, -0.173333, 0], [3, 0.146667, 0]], [-0.228638, [3, -0.146667, 0], [3, 0.133333, 0]], [-0.0555568, [3, -0.133333, 0], [3, 0.16, 0]], [-0.228638, [3, -0.16, 0], [3, 0.146667, 0]], [-0.228638, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.228638, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00922852, [3, -0.146667, -0.0532349], [3, 0.146667, 0.0532349]], [0.0907715, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LHipYawPitch")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.659734, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.659734, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.652325, [3, -0.173333, 0], [3, 0.28, 0]], [-0.655814, [3, -0.28, 0], [3, 0.16, 0]], [-0.655814, [3, -0.16, 0], [3, 0.146667, 0]], [-0.655814, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.655814, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.00420227, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.165798, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LKneePitch")
times.append([1.16, 1.72, 2.24, 2.68, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.715585, [3, -0.386667, 0], [3, 0.186667, 0]], [0.715585, [3, -0.186667, 0], [3, 0.173333, 0]], [0.714915, [3, -0.173333, 0.000669652], [3, 0.146667, -0.000566628]], [0.555015, [3, -0.146667, 0], [3, 0.133333, 0]], [0.714242, [3, -0.133333, 0], [3, 0.16, 0]], [0.555015, [3, -0.16, 0], [3, 0.146667, 0]], [0.555015, [3, -0.146667, 0], [3, 0.146667, 0]], [0.555015, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00111802, [3, -0.146667, 0.091118], [3, 0.146667, -0.091118]], [-0.09, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LShoulderPitch")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.162434, [3, -0.386667, 0], [3, 0.186667, 0]], [0.162434, [3, -0.186667, 0], [3, 0.173333, 0]], [0.167013, [3, -0.173333, 0], [3, 0.28, 0]], [0.166618, [3, -0.28, 0.000395529], [3, 0.16, -0.000226017]], [-1.15541, [3, -0.16, 0], [3, 0.146667, 0]], [-1.14434, [3, -0.146667, 0], [3, 0.146667, 0]], [-1.14434, [3, -0.146667, 0], [3, 0.146667, 0]], [0.000939633, [3, -0.146667, -0.43535], [3, 0.146667, 0.43535]], [1.46776, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LShoulderRoll")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.0952713, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.0952713, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.0581526, [3, -0.173333, -0.00666667], [3, 0.28, 0.0107692]], [-0.0429635, [3, -0.28, 0], [3, 0.16, 0]], [-0.0429635, [3, -0.16, 0], [3, 0.146667, 0]], [-0.0240079, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.0240079, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00974114, [3, -0.146667, -0.0335098], [3, 0.146667, 0.0335098]], [0.177051, [3, -0.146667, 0], [3, 0, 0]]])
names.append("LWristYaw")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.0277694, [3, -0.386667, 0], [3, 0.186667, 0]], [0.0277694, [3, -0.186667, 0], [3, 0.173333, 0]], [0.0270572, [3, -0.173333, 0.000180219], [3, 0.28, -0.000291123]], [0.0263553, [3, -0.28, 0], [3, 0.16, 0]], [0.0263553, [3, -0.16, 0], [3, 0.146667, 0]], [0.0263553, [3, -0.146667, 0], [3, 0.146667, 0]], [0.0263553, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00525523, [3, -0.146667, 0], [3, 0.146667, 0]], [0.0947448, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RAnklePitch")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.348013, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.348013, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.348013, [3, -0.173333, 0], [3, 0.28, 0]], [-0.348013, [3, -0.28, 0], [3, 0.16, 0]], [-0.329867, [3, -0.16, 0], [3, 0.146667, 0]], [-0.329867, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.329867, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.00852256, [3, -0.146667, -0.0699779], [3, 0.146667, 0.0699779]], [0.09, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RAnkleRoll")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.00558042, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.00558042, [3, -0.186667, 0], [3, 0.173333, 0]], [0.141372, [3, -0.173333, 0], [3, 0.28, 0]], [0.141372, [3, -0.28, 0], [3, 0.16, 0]], [-0.169297, [3, -0.16, 0], [3, 0.146667, 0]], [-0.169297, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.169297, [3, -0.146667, 0], [3, 0.146667, 0]], [0, [3, -0.146667, -0.0498828], [3, 0.146667, 0.0498828]], [0.13, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RElbowRoll")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[1.53673, [3, -0.386667, 0], [3, 0.186667, 0]], [1.53673, [3, -0.186667, 0], [3, 0.173333, 0]], [1.52825, [3, -0.173333, 0], [3, 0.28, 0]], [1.52863, [3, -0.28, 0], [3, 0.16, 0]], [1.52863, [3, -0.16, 0], [3, 0.146667, 0]], [0.390954, [3, -0.146667, 0], [3, 0.146667, 0]], [0.393827, [3, -0.146667, 0], [3, 0.146667, 0]], [0.0435031, [3, -0.146667, 0], [3, 0.146667, 0]], [0.417341, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RElbowYaw")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.0716553, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.0716553, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.0710624, [3, -0.173333, -0.000592845], [3, 0.28, 0.000957672]], [-0.0666051, [3, -0.28, 0], [3, 0.16, 0]], [-0.0666051, [3, -0.16, 0], [3, 0.146667, 0]], [-0.1212, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.0769817, [3, -0.146667, -0.0203555], [3, 0.146667, 0.0203555]], [0.000932922, [3, -0.146667, -0.0779146], [3, 0.146667, 0.0779146]], [1.19594, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RHand")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.256494, [3, -0.386667, 0], [3, 0.186667, 0]], [0.256494, [3, -0.186667, 0], [3, 0.173333, 0]], [0.256494, [3, -0.173333, 0], [3, 0.28, 0]], [0.256494, [3, -0.28, 0], [3, 0.16, 0]], [0.69, [3, -0.16, 0], [3, 0.146667, 0]], [0.54, [3, -0.146667, 0], [3, 0.146667, 0]], [0.54, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00608754, [3, -0.146667, 0], [3, 0.146667, 0]], [0.290359, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RHipPitch")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.452112, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.452112, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.0715585, [3, -0.173333, 0], [3, 0.28, 0]], [-0.0715585, [3, -0.28, 0], [3, 0.16, 0]], [0.020944, [3, -0.16, 0], [3, 0.146667, 0]], [0.020944, [3, -0.146667, 0], [3, 0.146667, 0]], [0.020944, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.000718725, [3, -0.146667, 0], [3, 0.146667, 0]], [0.13, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RHipRoll")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.00425469, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.00425469, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.00425469, [3, -0.173333, 0], [3, 0.28, 0]], [-0.00425469, [3, -0.28, 0], [3, 0.16, 0]], [-0.500909, [3, -0.16, 0], [3, 0.146667, 0]], [-0.500909, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.500909, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.00922852, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.0907715, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RHipYawPitch")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.659734, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.652325, [3, -0.173333, 0], [3, 0.28, 0]], [-0.655814, [3, -0.28, 0], [3, 0.16, 0]], [-0.0506145, [3, -0.16, 0], [3, 0.146667, 0]], [-0.655814, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.655814, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.00420227, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.165798, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RKneePitch")
times.append([1.16, 1.72, 2.24, 2.68, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.7, [3, -0.386667, 0], [3, 0.186667, 0]], [0.7, [3, -0.186667, 0], [3, 0.173333, 0]], [1.86925, [3, -0.173333, 0], [3, 0.146667, 0]], [0.555015, [3, -0.146667, 0], [3, 0.133333, 0]], [1.86925, [3, -0.133333, 0], [3, 0.16, 0]], [0.776672, [3, -0.16, 0], [3, 0.146667, 0]], [0.776672, [3, -0.146667, 0], [3, 0.146667, 0]], [0.776672, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00111802, [3, -0.146667, 0.091118], [3, 0.146667, -0.091118]], [-0.09, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RShoulderPitch")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.138439, [3, -0.386667, 0], [3, 0.186667, 0]], [0.138439, [3, -0.186667, 0], [3, 0.173333, 0]], [0.146841, [3, -0.173333, 0], [3, 0.28, 0]], [0.145227, [3, -0.28, 0], [3, 0.16, 0]], [2.08567, [3, -0.16, 0], [3, 0.146667, 0]], [-1.18508, [3, -0.146667, 0.00950589], [3, 0.146667, -0.00950589]], [-1.19458, [3, -0.146667, 0], [3, 0.146667, 0]], [0.000939633, [3, -0.146667, -0.443724], [3, 0.146667, 0.443724]], [1.46776, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RShoulderRoll")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[-0.504904, [3, -0.386667, 0], [3, 0.186667, 0]], [-0.504904, [3, -0.186667, 0], [3, 0.173333, 0]], [-0.526257, [3, -0.173333, 0.00271036], [3, 0.28, -0.00437827]], [-0.530635, [3, -0.28, 0], [3, 0.16, 0]], [-0.530635, [3, -0.16, 0], [3, 0.146667, 0]], [-1.32645, [3, -0.146667, 0], [3, 0.146667, 0]], [-1.32366, [3, -0.146667, -0.00279286], [3, 0.146667, 0.00279286]], [-0.00974114, [3, -0.146667, 0], [3, 0.146667, 0]], [-0.177051, [3, -0.146667, 0], [3, 0, 0]]])
names.append("RWristYaw")
times.append([1.16, 1.72, 2.24, 3.08, 3.56, 4, 4.44, 4.88, 5.32])
keys.append([[0.793417, [3, -0.386667, 0], [3, 0.186667, 0]], [0.793417, [3, -0.186667, 0], [3, 0.173333, 0]], [0.784486, [3, -0.173333, 0], [3, 0.28, 0]], [0.78556, [3, -0.28, 0], [3, 0.16, 0]], [0.78556, [3, -0.16, 0], [3, 0.146667, 0]], [0.78556, [3, -0.146667, 0], [3, 0.146667, 0]], [0.78556, [3, -0.146667, 0], [3, 0.146667, 0]], [0.00525523, [3, -0.146667, 0], [3, 0.146667, 0]], [0.0947448, [3, -0.146667, 0], [3, 0, 0]]])
def main(IP, port):
try:
# modify the IP and port if you use this script outside Choregraphe.
motion = ALProxy("ALMotion", IP, port)
# motion = ALProxy("ALMotion")  # variant for running inside Choregraphe
motion.angleInterpolationBezier(names, times, keys)
except BaseException, err:
print err
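Each entry in `keys` above has the form `[value, [3, left_dt, left_dv], [3, right_dt, right_dv]]`: a joint value plus incoming and outgoing Bezier handle offsets. The interpolation itself runs on the robot inside `angleInterpolationBezier`, but the value curve between two keys can be sketched off-robot with de Casteljau's algorithm. This simplified sketch evaluates only the value coordinate against a parameter `u` in [0, 1], which real NAOqi interpolation does not guarantee is linear in time:

```python
def bezier_value(k0, k1, u):
    # k0, k1: Choregraphe keys [value, [3, ldt, ldv], [3, rdt, rdv]].
    # u in [0, 1]: Bezier parameter along the segment from k0 to k1.
    p0 = k0[0]                # start value
    p1 = k0[0] + k0[2][2]     # start value offset by the outgoing handle
    p2 = k1[0] + k1[1][2]     # end value offset by the incoming handle
    p3 = k1[0]                # end value
    lerp = lambda a, b: a + (b - a) * u
    # de Casteljau reduction on the value coordinate only
    q0, q1, q2 = lerp(p0, p1), lerp(p1, p2), lerp(p2, p3)
    r0, r1 = lerp(q0, q1), lerp(q1, q2)
    return lerp(r0, r1)

# The first two HeadPitch keys above: flat handles, so the curve eases
# smoothly from the first value to the second.
k0 = [-0.00178573, [3, -0.386667, 0], [3, 0.186667, 0]]
k1 = [0.514872, [3, -0.186667, 0], [3, 0.173333, 0]]
print(bezier_value(k0, k1, 0.0))  # start value of the segment
print(bezier_value(k0, k1, 1.0))  # end value of the segment
```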
# File: sdk/python/pulumi_azure/sentinel/outputs.py (repo: henriktao/pulumi-azure, licenses: ECL-2.0, Apache-2.0)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
__all__ = [
'AlertRuleScheduledEventGrouping',
'AlertRuleScheduledIncidentConfiguration',
'AlertRuleScheduledIncidentConfigurationGrouping',
'AuthomationRuleActionIncident',
'AuthomationRuleActionPlaybook',
'AuthomationRuleCondition',
'AutomationRuleActionIncident',
'AutomationRuleActionPlaybook',
'AutomationRuleCondition',
'GetAlertRuleTemplateScheduledTemplateResult',
'GetAlertRuleTemplateSecurityIncidentTemplateResult',
]
@pulumi.output_type
class AlertRuleScheduledEventGrouping(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "aggregationMethod":
suggest = "aggregation_method"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in AlertRuleScheduledEventGrouping. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
AlertRuleScheduledEventGrouping.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
AlertRuleScheduledEventGrouping.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
aggregation_method: str):
"""
:param str aggregation_method: The aggregation type of grouping the events.
"""
pulumi.set(__self__, "aggregation_method", aggregation_method)
@property
@pulumi.getter(name="aggregationMethod")
def aggregation_method(self) -> str:
"""
The aggregation type of grouping the events.
"""
return pulumi.get(self, "aggregation_method")
@pulumi.output_type
class AlertRuleScheduledIncidentConfiguration(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "createIncident":
suggest = "create_incident"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in AlertRuleScheduledIncidentConfiguration. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
AlertRuleScheduledIncidentConfiguration.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
AlertRuleScheduledIncidentConfiguration.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
create_incident: bool,
grouping: 'outputs.AlertRuleScheduledIncidentConfigurationGrouping'):
"""
:param bool create_incident: Whether to create an incident from alerts triggered by this Sentinel Scheduled Alert Rule.
:param 'AlertRuleScheduledIncidentConfigurationGroupingArgs' grouping: A `grouping` block as defined below.
"""
pulumi.set(__self__, "create_incident", create_incident)
pulumi.set(__self__, "grouping", grouping)
@property
@pulumi.getter(name="createIncident")
def create_incident(self) -> bool:
"""
Whether to create an incident from alerts triggered by this Sentinel Scheduled Alert Rule.
"""
return pulumi.get(self, "create_incident")
@property
@pulumi.getter
def grouping(self) -> 'outputs.AlertRuleScheduledIncidentConfigurationGrouping':
"""
A `grouping` block as defined below.
"""
return pulumi.get(self, "grouping")
@pulumi.output_type
class AlertRuleScheduledIncidentConfigurationGrouping(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "entityMatchingMethod":
suggest = "entity_matching_method"
elif key == "groupBies":
suggest = "group_bies"
elif key == "lookbackDuration":
suggest = "lookback_duration"
elif key == "reopenClosedIncidents":
suggest = "reopen_closed_incidents"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in AlertRuleScheduledIncidentConfigurationGrouping. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
AlertRuleScheduledIncidentConfigurationGrouping.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
AlertRuleScheduledIncidentConfigurationGrouping.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
enabled: Optional[bool] = None,
entity_matching_method: Optional[str] = None,
group_bies: Optional[Sequence[str]] = None,
lookback_duration: Optional[str] = None,
reopen_closed_incidents: Optional[bool] = None):
"""
:param bool enabled: Enable grouping incidents created from alerts triggered by this Sentinel Scheduled Alert Rule. Defaults to `true`.
:param str entity_matching_method: The method used to group incidents. Possible values are `All`, `Custom` and `None`. Defaults to `None`.
:param Sequence[str] group_bies: A list of entity types to group by; used only when `entity_matching_method` is `Custom`. Possible values are `Account`, `Host`, `Url`, `Ip`.
:param str lookback_duration: Limit the group to alerts created within the lookback duration (in ISO 8601 duration format). Defaults to `PT5M`.
:param bool reopen_closed_incidents: Whether to re-open closed matching incidents. Defaults to `false`.
"""
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if entity_matching_method is not None:
pulumi.set(__self__, "entity_matching_method", entity_matching_method)
if group_bies is not None:
pulumi.set(__self__, "group_bies", group_bies)
if lookback_duration is not None:
pulumi.set(__self__, "lookback_duration", lookback_duration)
if reopen_closed_incidents is not None:
pulumi.set(__self__, "reopen_closed_incidents", reopen_closed_incidents)
@property
@pulumi.getter
def enabled(self) -> Optional[bool]:
"""
Enable grouping incidents created from alerts triggered by this Sentinel Scheduled Alert Rule. Defaults to `true`.
"""
return pulumi.get(self, "enabled")
@property
@pulumi.getter(name="entityMatchingMethod")
def entity_matching_method(self) -> Optional[str]:
"""
The method used to group incidents. Possible values are `All`, `Custom` and `None`. Defaults to `None`.
"""
return pulumi.get(self, "entity_matching_method")
@property
@pulumi.getter(name="groupBies")
def group_bies(self) -> Optional[Sequence[str]]:
"""
A list of entity types to group by; used only when `entity_matching_method` is `Custom`. Possible values are `Account`, `Host`, `Url`, `Ip`.
"""
return pulumi.get(self, "group_bies")
@property
@pulumi.getter(name="lookbackDuration")
def lookback_duration(self) -> Optional[str]:
"""
Limit the group to alerts created within the lookback duration (in ISO 8601 duration format). Defaults to `PT5M`.
"""
return pulumi.get(self, "lookback_duration")
@property
@pulumi.getter(name="reopenClosedIncidents")
def reopen_closed_incidents(self) -> Optional[bool]:
"""
Whether to re-open closed matching incidents. Defaults to `false`.
"""
return pulumi.get(self, "reopen_closed_incidents")
@pulumi.output_type
class AuthomationRuleActionIncident(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "classificationComment":
suggest = "classification_comment"
elif key == "ownerId":
suggest = "owner_id"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in AuthomationRuleActionIncident. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
AuthomationRuleActionIncident.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
AuthomationRuleActionIncident.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
order: int,
classification: Optional[str] = None,
classification_comment: Optional[str] = None,
labels: Optional[Sequence[str]] = None,
owner_id: Optional[str] = None,
severity: Optional[str] = None,
status: Optional[str] = None):
"""
:param int order: The execution order of this action.
:param str classification: The classification of the incident, when closing it. Possible values are: `BenignPositive_SuspiciousButExpected`, `FalsePositive_InaccurateData`, `FalsePositive_IncorrectAlertLogic`, `TruePositive_SuspiciousActivity` and `Undetermined`.
:param str classification_comment: The comment explaining why the incident is to be closed.
:param Sequence[str] labels: Specifies a list of labels to add to the incident.
:param str owner_id: The object ID of the entity this incident is assigned to.
:param str severity: The severity to add to the incident.
:param str status: The status to set on the incident. Possible values are: `Active`, `Closed`, `New`.
"""
pulumi.set(__self__, "order", order)
if classification is not None:
pulumi.set(__self__, "classification", classification)
if classification_comment is not None:
pulumi.set(__self__, "classification_comment", classification_comment)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if owner_id is not None:
pulumi.set(__self__, "owner_id", owner_id)
if severity is not None:
pulumi.set(__self__, "severity", severity)
if status is not None:
pulumi.set(__self__, "status", status)
@property
@pulumi.getter
def order(self) -> int:
"""
The execution order of this action.
"""
return pulumi.get(self, "order")
@property
@pulumi.getter
def classification(self) -> Optional[str]:
"""
The classification of the incident, when closing it. Possible values are: `BenignPositive_SuspiciousButExpected`, `FalsePositive_InaccurateData`, `FalsePositive_IncorrectAlertLogic`, `TruePositive_SuspiciousActivity` and `Undetermined`.
"""
return pulumi.get(self, "classification")
@property
@pulumi.getter(name="classificationComment")
def classification_comment(self) -> Optional[str]:
"""
The comment explaining why the incident is to be closed.
"""
return pulumi.get(self, "classification_comment")
@property
@pulumi.getter
def labels(self) -> Optional[Sequence[str]]:
"""
Specifies a list of labels to add to the incident.
"""
return pulumi.get(self, "labels")
@property
@pulumi.getter(name="ownerId")
def owner_id(self) -> Optional[str]:
"""
The object ID of the entity this incident is assigned to.
"""
return pulumi.get(self, "owner_id")
@property
@pulumi.getter
def severity(self) -> Optional[str]:
"""
The severity to add to the incident.
"""
return pulumi.get(self, "severity")
@property
@pulumi.getter
def status(self) -> Optional[str]:
"""
The status to set on the incident. Possible values are: `Active`, `Closed`, `New`.
"""
return pulumi.get(self, "status")
@pulumi.output_type
class AuthomationRuleActionPlaybook(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "logicAppId":
suggest = "logic_app_id"
elif key == "tenantId":
suggest = "tenant_id"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in AuthomationRuleActionPlaybook. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
AuthomationRuleActionPlaybook.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
AuthomationRuleActionPlaybook.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
logic_app_id: str,
order: int,
tenant_id: Optional[str] = None):
"""
:param str logic_app_id: The ID of the Logic App that defines the playbook's logic.
:param int order: The execution order of this action.
:param str tenant_id: The ID of the Tenant that owns the playbook.
"""
pulumi.set(__self__, "logic_app_id", logic_app_id)
pulumi.set(__self__, "order", order)
if tenant_id is not None:
pulumi.set(__self__, "tenant_id", tenant_id)
@property
@pulumi.getter(name="logicAppId")
def logic_app_id(self) -> str:
"""
The ID of the Logic App that defines the playbook's logic.
"""
return pulumi.get(self, "logic_app_id")
@property
@pulumi.getter
def order(self) -> int:
"""
The execution order of this action.
"""
return pulumi.get(self, "order")
@property
@pulumi.getter(name="tenantId")
def tenant_id(self) -> Optional[str]:
"""
The ID of the Tenant that owns the playbook.
"""
return pulumi.get(self, "tenant_id")
@pulumi.output_type
class AuthomationRuleCondition(dict):
def __init__(__self__, *,
operator: str,
property: str,
values: Sequence[str]):
"""
:param str operator: The operator used to evaluate the condition. Possible values include: `Equals`, `NotEquals`, `Contains`, `NotContains`, `StartsWith`, `NotStartsWith`, `EndsWith`, `NotEndsWith`.
:param str property: The property used to evaluate the condition. Possible values include: `AccountAadTenantId`, `AccountAadUserId`, `AccountNTDomain`, `AccountName`, `AccountObjectGuid`, `AccountPUID`, `AccountSid`, `AccountUPNSuffix`, `AzureResourceResourceId`, `AzureResourceSubscriptionId`, `CloudApplicationAppId`, `CloudApplicationAppName`, `DNSDomainName`, `FileDirectory`, `FileHashValue`, `FileName`, `HostAzureID`, `HostNTDomain`, `HostName`, `HostNetBiosName`, `HostOSVersion`, `IPAddress`, `IncidentDescription`, `IncidentProviderName`, `IncidentRelatedAnalyticRuleIds`, `IncidentSeverity`, `IncidentStatus`, `IncidentTactics`, `IncidentTitle`, `IoTDeviceId`, `IoTDeviceModel`, `IoTDeviceName`, `IoTDeviceOperatingSystem`, `IoTDeviceType`, `IoTDeviceVendor`, `MailMessageDeliveryAction`, `MailMessageDeliveryLocation`, `MailMessageP1Sender`, `MailMessageP2Sender`, `MailMessageRecipient`, `MailMessageSenderIP`, `MailMessageSubject`, `MailboxDisplayName`, `MailboxPrimaryAddress`, `MailboxUPN`, `MalwareCategory`, `MalwareName`, `ProcessCommandLine`, `ProcessId`, `RegistryKey`, `RegistryValueData`, `Url`.
:param Sequence[str] values: Specifies a list of values used to evaluate the condition.
"""
pulumi.set(__self__, "operator", operator)
pulumi.set(__self__, "property", property)
pulumi.set(__self__, "values", values)
@property
@pulumi.getter
def operator(self) -> str:
"""
The operator used to evaluate the condition. Possible values include: `Equals`, `NotEquals`, `Contains`, `NotContains`, `StartsWith`, `NotStartsWith`, `EndsWith`, `NotEndsWith`.
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter
def values(self) -> Sequence[str]:
"""
Specifies a list of values used to evaluate the condition.
"""
return pulumi.get(self, "values")
@property
@pulumi.getter
def property(self) -> str:
"""
The property used to evaluate the condition. Possible values include: `AccountAadTenantId`, `AccountAadUserId`, `AccountNTDomain`, `AccountName`, `AccountObjectGuid`, `AccountPUID`, `AccountSid`, `AccountUPNSuffix`, `AzureResourceResourceId`, `AzureResourceSubscriptionId`, `CloudApplicationAppId`, `CloudApplicationAppName`, `DNSDomainName`, `FileDirectory`, `FileHashValue`, `FileName`, `HostAzureID`, `HostNTDomain`, `HostName`, `HostNetBiosName`, `HostOSVersion`, `IPAddress`, `IncidentDescription`, `IncidentProviderName`, `IncidentRelatedAnalyticRuleIds`, `IncidentSeverity`, `IncidentStatus`, `IncidentTactics`, `IncidentTitle`, `IoTDeviceId`, `IoTDeviceModel`, `IoTDeviceName`, `IoTDeviceOperatingSystem`, `IoTDeviceType`, `IoTDeviceVendor`, `MailMessageDeliveryAction`, `MailMessageDeliveryLocation`, `MailMessageP1Sender`, `MailMessageP2Sender`, `MailMessageRecipient`, `MailMessageSenderIP`, `MailMessageSubject`, `MailboxDisplayName`, `MailboxPrimaryAddress`, `MailboxUPN`, `MalwareCategory`, `MalwareName`, `ProcessCommandLine`, `ProcessId`, `RegistryKey`, `RegistryValueData`, `Url`.
"""
return pulumi.get(self, "property")
@pulumi.output_type
class AutomationRuleActionIncident(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "classificationComment":
suggest = "classification_comment"
elif key == "ownerId":
suggest = "owner_id"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in AutomationRuleActionIncident. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
AutomationRuleActionIncident.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
AutomationRuleActionIncident.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
order: int,
classification: Optional[str] = None,
classification_comment: Optional[str] = None,
labels: Optional[Sequence[str]] = None,
owner_id: Optional[str] = None,
severity: Optional[str] = None,
status: Optional[str] = None):
"""
:param int order: The execution order of this action.
:param str classification: The classification of the incident, when closing it. Possible values are: `BenignPositive_SuspiciousButExpected`, `FalsePositive_InaccurateData`, `FalsePositive_IncorrectAlertLogic`, `TruePositive_SuspiciousActivity` and `Undetermined`.
:param str classification_comment: The comment explaining why the incident is to be closed.
:param Sequence[str] labels: Specifies a list of labels to add to the incident.
:param str owner_id: The object ID of the entity this incident is assigned to.
:param str severity: The severity to add to the incident.
:param str status: The status to set on the incident. Possible values are: `Active`, `Closed`, `New`.
"""
pulumi.set(__self__, "order", order)
if classification is not None:
pulumi.set(__self__, "classification", classification)
if classification_comment is not None:
pulumi.set(__self__, "classification_comment", classification_comment)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if owner_id is not None:
pulumi.set(__self__, "owner_id", owner_id)
if severity is not None:
pulumi.set(__self__, "severity", severity)
if status is not None:
pulumi.set(__self__, "status", status)
@property
@pulumi.getter
def order(self) -> int:
"""
The execution order of this action.
"""
return pulumi.get(self, "order")
@property
@pulumi.getter
def classification(self) -> Optional[str]:
"""
The classification of the incident, when closing it. Possible values are: `BenignPositive_SuspiciousButExpected`, `FalsePositive_InaccurateData`, `FalsePositive_IncorrectAlertLogic`, `TruePositive_SuspiciousActivity` and `Undetermined`.
"""
return pulumi.get(self, "classification")
@property
@pulumi.getter(name="classificationComment")
def classification_comment(self) -> Optional[str]:
"""
The comment explaining why the incident is to be closed.
"""
return pulumi.get(self, "classification_comment")
@property
@pulumi.getter
def labels(self) -> Optional[Sequence[str]]:
"""
Specifies a list of labels to add to the incident.
"""
return pulumi.get(self, "labels")
@property
@pulumi.getter(name="ownerId")
def owner_id(self) -> Optional[str]:
"""
The object ID of the entity this incident is assigned to.
"""
return pulumi.get(self, "owner_id")
@property
@pulumi.getter
def severity(self) -> Optional[str]:
"""
The severity to add to the incident.
"""
return pulumi.get(self, "severity")
@property
@pulumi.getter
def status(self) -> Optional[str]:
"""
The status to set on the incident. Possible values are: `Active`, `Closed`, `New`.
"""
return pulumi.get(self, "status")
@pulumi.output_type
class AutomationRuleActionPlaybook(dict):
@staticmethod
def __key_warning(key: str):
suggest = None
if key == "logicAppId":
suggest = "logic_app_id"
elif key == "tenantId":
suggest = "tenant_id"
if suggest:
pulumi.log.warn(f"Key '{key}' not found in AutomationRuleActionPlaybook. Access the value via the '{suggest}' property getter instead.")
def __getitem__(self, key: str) -> Any:
AutomationRuleActionPlaybook.__key_warning(key)
return super().__getitem__(key)
def get(self, key: str, default = None) -> Any:
AutomationRuleActionPlaybook.__key_warning(key)
return super().get(key, default)
def __init__(__self__, *,
logic_app_id: str,
order: int,
tenant_id: Optional[str] = None):
"""
:param str logic_app_id: The ID of the Logic App that defines the playbook's logic.
:param int order: The execution order of this action.
:param str tenant_id: The ID of the Tenant that owns the playbook.
"""
pulumi.set(__self__, "logic_app_id", logic_app_id)
pulumi.set(__self__, "order", order)
if tenant_id is not None:
pulumi.set(__self__, "tenant_id", tenant_id)
@property
@pulumi.getter(name="logicAppId")
def logic_app_id(self) -> str:
"""
The ID of the Logic App that defines the playbook's logic.
"""
return pulumi.get(self, "logic_app_id")
@property
@pulumi.getter
def order(self) -> int:
"""
The execution order of this action.
"""
return pulumi.get(self, "order")
@property
@pulumi.getter(name="tenantId")
def tenant_id(self) -> Optional[str]:
"""
The ID of the Tenant that owns the playbook.
"""
return pulumi.get(self, "tenant_id")
@pulumi.output_type
class AutomationRuleCondition(dict):
def __init__(__self__, *,
operator: str,
property: str,
values: Sequence[str]):
"""
:param str operator: The operator used to evaluate the condition. Possible values include: `Equals`, `NotEquals`, `Contains`, `NotContains`, `StartsWith`, `NotStartsWith`, `EndsWith`, `NotEndsWith`.
:param str property: The property used to evaluate the condition. Possible values include: `AccountAadTenantId`, `AccountAadUserId`, `AccountNTDomain`, `AccountName`, `AccountObjectGuid`, `AccountPUID`, `AccountSid`, `AccountUPNSuffix`, `AzureResourceResourceId`, `AzureResourceSubscriptionId`, `CloudApplicationAppId`, `CloudApplicationAppName`, `DNSDomainName`, `FileDirectory`, `FileHashValue`, `FileName`, `HostAzureID`, `HostNTDomain`, `HostName`, `HostNetBiosName`, `HostOSVersion`, `IPAddress`, `IncidentDescription`, `IncidentProviderName`, `IncidentRelatedAnalyticRuleIds`, `IncidentSeverity`, `IncidentStatus`, `IncidentTactics`, `IncidentTitle`, `IoTDeviceId`, `IoTDeviceModel`, `IoTDeviceName`, `IoTDeviceOperatingSystem`, `IoTDeviceType`, `IoTDeviceVendor`, `MailMessageDeliveryAction`, `MailMessageDeliveryLocation`, `MailMessageP1Sender`, `MailMessageP2Sender`, `MailMessageRecipient`, `MailMessageSenderIP`, `MailMessageSubject`, `MailboxDisplayName`, `MailboxPrimaryAddress`, `MailboxUPN`, `MalwareCategory`, `MalwareName`, `ProcessCommandLine`, `ProcessId`, `RegistryKey`, `RegistryValueData`, `Url`.
:param Sequence[str] values: Specifies a list of values used to evaluate the condition.
"""
pulumi.set(__self__, "operator", operator)
pulumi.set(__self__, "property", property)
pulumi.set(__self__, "values", values)
@property
@pulumi.getter
def operator(self) -> str:
"""
The operator used to evaluate the condition. Possible values include: `Equals`, `NotEquals`, `Contains`, `NotContains`, `StartsWith`, `NotStartsWith`, `EndsWith`, `NotEndsWith`.
"""
return pulumi.get(self, "operator")
@property
@pulumi.getter
def values(self) -> Sequence[str]:
"""
Specifies a list of values used to evaluate the condition.
"""
return pulumi.get(self, "values")
@property
@pulumi.getter
def property(self) -> str:
"""
The property used to evaluate the condition. Possible values include: `AccountAadTenantId`, `AccountAadUserId`, `AccountNTDomain`, `AccountName`, `AccountObjectGuid`, `AccountPUID`, `AccountSid`, `AccountUPNSuffix`, `AzureResourceResourceId`, `AzureResourceSubscriptionId`, `CloudApplicationAppId`, `CloudApplicationAppName`, `DNSDomainName`, `FileDirectory`, `FileHashValue`, `FileName`, `HostAzureID`, `HostNTDomain`, `HostName`, `HostNetBiosName`, `HostOSVersion`, `IPAddress`, `IncidentDescription`, `IncidentProviderName`, `IncidentRelatedAnalyticRuleIds`, `IncidentSeverity`, `IncidentStatus`, `IncidentTactics`, `IncidentTitle`, `IoTDeviceId`, `IoTDeviceModel`, `IoTDeviceName`, `IoTDeviceOperatingSystem`, `IoTDeviceType`, `IoTDeviceVendor`, `MailMessageDeliveryAction`, `MailMessageDeliveryLocation`, `MailMessageP1Sender`, `MailMessageP2Sender`, `MailMessageRecipient`, `MailMessageSenderIP`, `MailMessageSubject`, `MailboxDisplayName`, `MailboxPrimaryAddress`, `MailboxUPN`, `MalwareCategory`, `MalwareName`, `ProcessCommandLine`, `ProcessId`, `RegistryKey`, `RegistryValueData`, `Url`.
"""
return pulumi.get(self, "property")
@pulumi.output_type
class GetAlertRuleTemplateScheduledTemplateResult(dict):
def __init__(__self__, *,
description: str,
query: str,
query_frequency: str,
query_period: str,
severity: str,
tactics: Sequence[str],
trigger_operator: str,
trigger_threshold: int):
"""
:param str description: The description of this Sentinel Scheduled Alert Rule Template.
:param str query: The query of this Sentinel Scheduled Alert Rule Template.
:param str query_frequency: The ISO 8601 timespan duration between two consecutive queries.
:param str query_period: The ISO 8601 timespan duration that determines the time period of the data covered by the query.
:param str severity: The alert severity of this Sentinel Scheduled Alert Rule Template.
:param Sequence[str] tactics: A list of categories of attacks by which to classify the rule.
:param str trigger_operator: The alert trigger operator which, combined with `trigger_threshold`, sets the alert threshold of this Sentinel Scheduled Alert Rule Template.
:param int trigger_threshold: The baseline number of generated query results which, combined with `trigger_operator`, sets the alert threshold of this Sentinel Scheduled Alert Rule Template.
"""
pulumi.set(__self__, "description", description)
pulumi.set(__self__, "query", query)
pulumi.set(__self__, "query_frequency", query_frequency)
pulumi.set(__self__, "query_period", query_period)
pulumi.set(__self__, "severity", severity)
pulumi.set(__self__, "tactics", tactics)
pulumi.set(__self__, "trigger_operator", trigger_operator)
pulumi.set(__self__, "trigger_threshold", trigger_threshold)
@property
@pulumi.getter
def description(self) -> str:
"""
The description of this Sentinel Scheduled Alert Rule Template.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter
def query(self) -> str:
"""
The query of this Sentinel Scheduled Alert Rule Template.
"""
return pulumi.get(self, "query")
@property
@pulumi.getter(name="queryFrequency")
def query_frequency(self) -> str:
"""
The ISO 8601 timespan duration between two consecutive queries.
"""
return pulumi.get(self, "query_frequency")
@property
@pulumi.getter(name="queryPeriod")
def query_period(self) -> str:
"""
The ISO 8601 timespan duration that determines the time period of the data covered by the query.
"""
return pulumi.get(self, "query_period")
@property
@pulumi.getter
def severity(self) -> str:
"""
The alert severity of this Sentinel Scheduled Alert Rule Template.
"""
return pulumi.get(self, "severity")
@property
@pulumi.getter
def tactics(self) -> Sequence[str]:
"""
A list of categories of attacks by which to classify the rule.
"""
return pulumi.get(self, "tactics")
@property
@pulumi.getter(name="triggerOperator")
def trigger_operator(self) -> str:
"""
The alert trigger operator which, combined with `trigger_threshold`, sets the alert threshold of this Sentinel Scheduled Alert Rule Template.
"""
return pulumi.get(self, "trigger_operator")
@property
@pulumi.getter(name="triggerThreshold")
def trigger_threshold(self) -> int:
"""
The baseline number of generated query results which, combined with `trigger_operator`, sets the alert threshold of this Sentinel Scheduled Alert Rule Template.
"""
return pulumi.get(self, "trigger_threshold")
@pulumi.output_type
class GetAlertRuleTemplateSecurityIncidentTemplateResult(dict):
def __init__(__self__, *,
description: str,
product_filter: str):
"""
:param str description: The description of this Sentinel Scheduled Alert Rule Template.
:param str product_filter: The Microsoft Security Service from which the alert will be generated.
"""
pulumi.set(__self__, "description", description)
pulumi.set(__self__, "product_filter", product_filter)
@property
@pulumi.getter
def description(self) -> str:
"""
The description of this Sentinel Scheduled Alert Rule Template.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="productFilter")
def product_filter(self) -> str:
"""
The Microsoft Security Service from which the alert will be generated.
"""
return pulumi.get(self, "product_filter")
# Tests/test_suvam_cdi.py (infapy/infapy, Apache-2.0)
import infapy
infapy.setFileLogger(name="test_Suvam", level="DEBUG")
infaHandler = infapy.connect(profile="spani")
cdi = infaHandler.cdi()
MTTask = cdi.mttask()
# print(MTTask.getAllMTTasks())
# print(MTTask.getMTTaskById("0119Y40Z00000000002P"))
# print(MTTask.getMTTaskByName("MT_IU 2"))
# createJSON={
# "@type":"mtTask",
# "name": "MT_IU_v2",
# "containerId": "9wKUcCBrrb1kTpMB2G94UM",
# "description": "IN DEFAULA FOLDER",
# "runtimeEnvironmentId": "0119Y425000000000003",
# "mappingId": "0119Y417000000000011",
# "sessionProperties": {},
# "enableCrossSchemaPushdown": True,
# "enableParallelRun": False,
# "autoTunedApplied": False,
# "autoTunedAppliedType": "NONE",
# "paramFileType": "PARAM_FILE_LOCAL",
# "schemaMode": "async",
# "parameterFileEncoding": "UTF-8",
# "serverlessProperties": {},
# "parameters": [
# {
# "@type": "mtTaskParameter",
# "id": 362430283,
# "name": "$Source$",
# "type": "EXTENDED_SOURCE",
# "label": "Source",
# "uiProperties": {
# "connectionParameterized": "False",
# "isMsrcFilterParameterized": "False",
# "isMsrcSortParameterized": "False",
# "objectParameterized": "False",
# "visible": "False"
# },
# "sourceConnectionId": "0119Y40B000000000004",
# "newFlatFile": False,
# "newObject": False,
# "showBusinessNames": False,
# "naturalOrder": True,
# "truncateTarget": False,
# "bulkApiDBTarget": False,
# "srcFFAttrs": {
# "@type": "flatFileAttrs",
# "id": 45053425,
# "delimiter": ",",
# "textQualifier": "\"",
# "escapeChar": "",
# "headerLineNo": 1,
# "firstDataRow": 2
# },
# "customFuncCfg": {
# "@type": "customFuncConfig",
# "id": -1,
# "connections": [],
# "inputMap": [],
# "outputFields": []
# },
# "tgtFieldRefs": {},
# "targetUpdateColumns": [],
# "extendedObject": {
# "@type": "extendedObject",
# "object": {
# "@type": "mObject",
# "name": "src_HS.csv",
# "label": "src_HS.csv",
# "metadataUpdated": False,
# "relations": [],
# "children": []
# },
# "objects": [
# {
# "@type": "mObject",
# "name": "src_HS.csv",
# "label": "src_HS.csv",
# "metadataUpdated": False,
# "relations": [],
# "children": []
# }
# ],
# "filters": [],
# "sortFields": []
# },
# "runtimeAttrs": {},
# "isRESTModernSource": True,
# "isFileList": False,
# "handleSpecialChars": False,
# "frsAsset": False,
# "dynamicFileName": False,
# "currentlyProcessedFileName": False,
# "tgtObjectAttributes": {},
# "runtimeParameterData": {
# "@type": "mtTaskRuntimeParameterData",
# "isConnectionRuntimeParameter": False,
# "isObjectRuntimeParameter": False
# },
# "overriddenFields": []
# },
# {
# "@type": "mtTaskParameter",
# "id": 362430286,
# "name": "$Target$",
# "type": "TARGET",
# "label": "Target",
# "uiProperties": {
# "connectionParameterized": "False",
# "objectParameterized": "False",
# "visible": "False",
# "defaultTargetUpdateColumns": "",
# "supportApplyDDLChanges": "True"
# },
# "targetConnectionId": "0119Y40B000000000004",
# "targetObject": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "targetObjectLabel": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "newFlatFile": False,
# "newObject": True,
# "showBusinessNames": False,
# "naturalOrder": True,
# "newObjectName": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "truncateTarget": False,
# "bulkApiDBTarget": False,
# "operationType": "Insert",
# "tgtFFAttrs": {
# "@type": "flatFileAttrs",
# "id": 45053428,
# "delimiter": ",",
# "textQualifier": "none",
# "escapeChar": "",
# "headerLineNo": 1
# },
# "customFuncCfg": {
# "@type": "customFuncConfig",
# "id": -1,
# "connections": [],
# "inputMap": [],
# "outputFields": []
# },
# "tgtFieldRefs": {},
# "targetUpdateColumns": [],
# "extendedObject": {
# "@type": "extendedObject",
# "filters": [],
# "sortFields": []
# },
# "runtimeAttrs": {},
# "isRESTModernSource": True,
# "isFileList": False,
# "handleSpecialChars": False,
# "frsAsset": False,
# "dynamicFileName": False,
# "currentlyProcessedFileName": False,
# "objectName": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "objectLabel": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "tgtObjectAttributes": {},
# "runtimeParameterData": {
# "@type": "mtTaskRuntimeParameterData",
# "isConnectionRuntimeParameter": False,
# "isObjectRuntimeParameter": False
# },
# "overriddenFields": []
# },
# {
# "@type": "mtTaskParameter",
# "id": 362430289,
# "name": "$HierarchyBuilder$",
# "type": "HSCHEMA",
# "label": "HierarchyBuilder",
# "uiProperties": {
# "visible": "False"
# },
# "newFlatFile": False,
# "newObject": False,
# "showBusinessNames": True,
# "naturalOrder": True,
# "truncateTarget": False,
# "bulkApiDBTarget": False,
# "customFuncCfg": {
# "@type": "customFuncConfig",
# "id": -1,
# "connections": [],
# "inputMap": [],
# "outputFields": []
# },
# "tgtFieldRefs": {},
# "targetUpdateColumns": [],
# "runtimeAttrs": {},
# "isRESTModernSource": True,
# "isFileList": False,
# "handleSpecialChars": False,
# "frsAsset": False,
# "dynamicFileName": False,
# "currentlyProcessedFileName": False,
# "tgtObjectAttributes": {},
# "runtimeParameterData": {
# "@type": "mtTaskRuntimeParameterData",
# "isConnectionRuntimeParameter": False,
# "isObjectRuntimeParameter": False
# },
# "overriddenFields": []
# }
# ],
# "sequences": [],
# "inOutParameters": [],
# "connRuntimeAttrs": []
# }
# print(MTTask.createMTTask(createJSON))
# updateFullJSON={
# "@type":"mtTask",
# "name": "MT_IU_v3",
# "containerId": "9wKUcCBrrb1kTpMB2G94UM",
# "description": "NEW_DESCRIPTION",
# "runtimeEnvironmentId": "0119Y425000000000003",
# "mappingId": "0119Y417000000000011",
# "sessionProperties": {},
# "enableCrossSchemaPushdown": True,
# "enableParallelRun": False,
# "autoTunedApplied": False,
# "autoTunedAppliedType": "NONE",
# "paramFileType": "PARAM_FILE_LOCAL",
# "schemaMode": "async",
# "parameterFileEncoding": "UTF-8",
# "serverlessProperties": {},
# "parameters": [
# {
# "@type": "mtTaskParameter",
# "id": 362430283,
# "name": "$Source$",
# "type": "EXTENDED_SOURCE",
# "label": "Source",
# "uiProperties": {
# "connectionParameterized": "False",
# "isMsrcFilterParameterized": "False",
# "isMsrcSortParameterized": "False",
# "objectParameterized": "False",
# "visible": "False"
# },
# "sourceConnectionId": "0119Y40B000000000004",
# "newFlatFile": False,
# "newObject": False,
# "showBusinessNames": False,
# "naturalOrder": True,
# "truncateTarget": False,
# "bulkApiDBTarget": False,
# "srcFFAttrs": {
# "@type": "flatFileAttrs",
# "id": 45053425,
# "delimiter": ",",
# "textQualifier": "\"",
# "escapeChar": "",
# "headerLineNo": 1,
# "firstDataRow": 2
# },
# "customFuncCfg": {
# "@type": "customFuncConfig",
# "id": -1,
# "connections": [],
# "inputMap": [],
# "outputFields": []
# },
# "tgtFieldRefs": {},
# "targetUpdateColumns": [],
# "extendedObject": {
# "@type": "extendedObject",
# "object": {
# "@type": "mObject",
# "name": "src_HS.csv",
# "label": "src_HS.csv",
# "metadataUpdated": False,
# "relations": [],
# "children": []
# },
# "objects": [
# {
# "@type": "mObject",
# "name": "src_HS.csv",
# "label": "src_HS.csv",
# "metadataUpdated": False,
# "relations": [],
# "children": []
# }
# ],
# "filters": [],
# "sortFields": []
# },
# "runtimeAttrs": {},
# "isRESTModernSource": True,
# "isFileList": False,
# "handleSpecialChars": False,
# "frsAsset": False,
# "dynamicFileName": False,
# "currentlyProcessedFileName": False,
# "tgtObjectAttributes": {},
# "runtimeParameterData": {
# "@type": "mtTaskRuntimeParameterData",
# "isConnectionRuntimeParameter": False,
# "isObjectRuntimeParameter": False
# },
# "overriddenFields": []
# },
# {
# "@type": "mtTaskParameter",
# "id": 362430286,
# "name": "$Target$",
# "type": "TARGET",
# "label": "Target",
# "uiProperties": {
# "connectionParameterized": "False",
# "objectParameterized": "False",
# "visible": "False",
# "defaultTargetUpdateColumns": "",
# "supportApplyDDLChanges": "True"
# },
# "targetConnectionId": "0119Y40B000000000004",
# "targetObject": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "targetObjectLabel": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "newFlatFile": False,
# "newObject": True,
# "showBusinessNames": False,
# "naturalOrder": True,
# "newObjectName": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "truncateTarget": False,
# "bulkApiDBTarget": False,
# "operationType": "Insert",
# "tgtFFAttrs": {
# "@type": "flatFileAttrs",
# "id": 45053428,
# "delimiter": ",",
# "textQualifier": "none",
# "escapeChar": "",
# "headerLineNo": 1
# },
# "customFuncCfg": {
# "@type": "customFuncConfig",
# "id": -1,
# "connections": [],
# "inputMap": [],
# "outputFields": []
# },
# "tgtFieldRefs": {},
# "targetUpdateColumns": [],
# "extendedObject": {
# "@type": "extendedObject",
# "filters": [],
# "sortFields": []
# },
# "runtimeAttrs": {},
# "isRESTModernSource": True,
# "isFileList": False,
# "handleSpecialChars": False,
# "frsAsset": False,
# "dynamicFileName": False,
# "currentlyProcessedFileName": False,
# "objectName": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "objectLabel": "'T_IU_'||Reg_Extract(op_xml, '((.|\\n)*)(<provider-id>)([^<]*)((.|\\n)*)', 4) ||'_'|| To_Char(SYSDATE,'YYYYMMDD') ||'.xml'",
# "tgtObjectAttributes": {},
# "runtimeParameterData": {
# "@type": "mtTaskRuntimeParameterData",
# "isConnectionRuntimeParameter": False,
# "isObjectRuntimeParameter": False
# },
# "overriddenFields": []
# },
# {
# "@type": "mtTaskParameter",
# "id": 362430289,
# "name": "$HierarchyBuilder$",
# "type": "HSCHEMA",
# "label": "HierarchyBuilder",
# "uiProperties": {
# "visible": "False"
# },
# "newFlatFile": False,
# "newObject": False,
# "showBusinessNames": True,
# "naturalOrder": True,
# "truncateTarget": False,
# "bulkApiDBTarget": False,
# "customFuncCfg": {
# "@type": "customFuncConfig",
# "id": -1,
# "connections": [],
# "inputMap": [],
# "outputFields": []
# },
# "tgtFieldRefs": {},
# "targetUpdateColumns": [],
# "runtimeAttrs": {},
# "isRESTModernSource": True,
# "isFileList": False,
# "handleSpecialChars": False,
# "frsAsset": False,
# "dynamicFileName": False,
# "currentlyProcessedFileName": False,
# "tgtObjectAttributes": {},
# "runtimeParameterData": {
# "@type": "mtTaskRuntimeParameterData",
# "isConnectionRuntimeParameter": False,
# "isObjectRuntimeParameter": False
# },
# "overriddenFields": []
# }
# ],
# "sequences": [],
# "inOutParameters": [],
# "connRuntimeAttrs": []
# }
# print(MTTask.updateMTTaskFull(updateFullJSON,"0119Y40Z00000000002P"))
# updatePartialJSON={
# "@type":"mtTask",
# "description": "NEW DESCRIPTION V2"
# }
# print(MTTask.updateMTTaskPartial(updatePartialJSON,"0119Y40Z00000000002P"))
print(MTTask.deleteMTTask("0119Y40Z00000000002P"))
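The commented payloads above contrast a full update (`updateFullJSON`, every field restated) with a partial update (`updatePartialJSON`, only the changed keys). A minimal stdlib-only sketch of that difference, modelling the partial update as a shallow dict merge — `apply_partial_update` is a hypothetical helper for illustration; the real merge happens server-side inside `MTTask.updateMTTaskPartial`:

```python
# Hypothetical illustration only: the service applies the patch itself;
# a shallow dict merge models the observable effect of a partial update.
def apply_partial_update(stored_task: dict, patch: dict) -> dict:
    """Return a copy of stored_task with only the patched keys replaced."""
    merged = dict(stored_task)
    merged.update(patch)
    return merged

stored = {"@type": "mtTask", "name": "MT_IU_v3", "description": "NEW_DESCRIPTION"}
patch = {"@type": "mtTask", "description": "NEW DESCRIPTION V2"}
print(apply_partial_update(stored, patch))
# → {'@type': 'mtTask', 'name': 'MT_IU_v3', 'description': 'NEW DESCRIPTION V2'}
```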
# -*- coding: utf-8 -*-
# File: ggfilm/wechat_ggfilm_backend/utils/utils.py
# Repo: amazingchow/photon-dance-ggfilm-server (MIT)
def esc_replace_view2db(s):
    # Escape characters that are special in the DB layer before storing
    # user input. Note: both "'" and '"' map to "''", so a single quote
    # comes back as '"' after esc_replace_db2view.
    s = s.replace("/", "//")
    s = s.replace("'", "''")
    s = s.replace('"', "''")
    s = s.replace("[", "/[")
    s = s.replace("]", "/]")
    s = s.replace("%", "/%")
    s = s.replace("&", "/&")
    s = s.replace("_", "/_")
    s = s.replace("(", "/(")
    s = s.replace(")", "/)")
    return s


def esc_replace_db2view(s):
    # Undo the escaping applied by esc_replace_view2db for display.
    s = s.replace("//", "/")
    s = s.replace("''", '"')
    s = s.replace("/[", "[")
    s = s.replace("/]", "]")
    s = s.replace("/%", "%")
    s = s.replace("/&", "&")
    s = s.replace("/_", "_")
    s = s.replace("/(", "(")
    s = s.replace("/)", ")")
    return s
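A standalone round-trip check for the escape helpers above (the two functions are restated compactly so this snippet runs on its own). It shows the asymmetry: a double quote survives the round trip, but a single quote comes back as a double quote, because both quote characters collapse to `''` on the way into the DB:

```python
# Compact restatement of esc_replace_view2db / esc_replace_db2view,
# driven by one (raw, escaped) table instead of repeated replace calls.
_PAIRS = [("/", "//"), ("'", "''"), ('"', "''"), ("[", "/["), ("]", "/]"),
          ("%", "/%"), ("&", "/&"), ("_", "/_"), ("(", "/("), (")", "/)")]

def view2db(s):
    for raw, esc in _PAIRS:
        s = s.replace(raw, esc)
    return s

def db2view(s):
    for raw, esc in _PAIRS:
        if raw != "'":  # "''" is mapped back to '"' only
            s = s.replace(esc, raw)
    return s

print(db2view(view2db('50% "discount" [A&B]_(x)')))  # round-trips cleanly
print(db2view(view2db("it's")))  # → it"s  (the single quote is not recovered)
```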
# File: skboost/milboost/__init__.py
# Repo: TMRert/skboost (MIT)
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from __future__ import absolute_import
from .classifier import MILBoostClassifier
from .softmax import *
# File: DeepLearning/Training/Train.py
# Repo: ThisGame42/Deep-Learning-Models-4-Muscle-Bones-Segmentation (Apache-2.0)
import os
import torch
import random
import numpy as np
import torch.nn as nn
import torch.optim as optim
import torch.nn.functional as F
from datetime import datetime
from torch.utils.data import DataLoader
from Utils.Device import get_device
from Utils.Visualisation import plot_loss
random.seed(42)
torch.manual_seed(42)
np.random.seed(42)
def train_hybrid3D(model_3d: nn.Module,
model_2d: nn.Module,
optimiser: optim.Optimizer,
loss_fn: nn.Module,
lr_scheduler: optim.lr_scheduler,
training_set_loader: DataLoader,
val_set_loader: DataLoader,
num_epochs: int,
output_path: str) -> None:
model_3d = model_3d.to(get_device())
model_2d = model_2d.to(get_device())
model_2d.eval()
for p in model_2d.parameters():
p.requires_grad = False
avg_training_loss, avg_val_loss = list(), list()
for epoch in range(num_epochs):
model_3d.train()
epoch_t_loss, epoch_v_loss, idx = 0., 0., 0
for idx, batch in enumerate(training_set_loader):
optimiser.zero_grad()
input_images = batch[0].to(get_device(), dtype=torch.float32)
input_2d = torch.unsqueeze(input_images[0, 0, ...], dim=1)
logits_2d, feature_2d = model_2d(input_2d)
output_2d = F.softmax(logits_2d, dim=1)
output_2d = output_2d.permute(1, 0, 2, 3)
feature_2d = feature_2d.permute(1, 0, 2, 3)
output_2d = torch.unsqueeze(output_2d, dim=0)
feature_2d = torch.unsqueeze(feature_2d, dim=0)
input_3d = torch.cat([output_2d, input_images], dim=1)
pred_labels = model_3d(input_3d, feature_2d)
ref_labels = batch[1].to(get_device())
loss_val = loss_fn(pred_labels, ref_labels)
loss_val.backward()
optimiser.step()
epoch_t_loss += loss_val.item()
avg_training_loss.append(epoch_t_loss / (idx + 1))
print(f"The average training loss at epoch: {epoch + 1} was {epoch_t_loss / (idx + 1)}.")
with torch.no_grad():
model_3d.eval()
for idx, batch in enumerate(val_set_loader):
input_images = batch[0].to(get_device(), dtype=torch.float32)
input_2d = torch.unsqueeze(input_images[0, 0, ...], dim=1)
logits_2d, feature_2d = model_2d(input_2d)
output_2d = F.softmax(logits_2d, dim=1)
output_2d = output_2d.permute(1, 0, 2, 3)
feature_2d = feature_2d.permute(1, 0, 2, 3)
output_2d = torch.unsqueeze(output_2d, dim=0)
feature_2d = torch.unsqueeze(feature_2d, dim=0)
input_3d = torch.cat([output_2d, input_images], dim=1)
pred_labels = model_3d(input_3d, feature_2d)
ref_labels = batch[1].to(get_device())
loss_val = loss_fn(pred_labels, ref_labels)
epoch_v_loss += loss_val.item()
epoch_val_loss = epoch_v_loss / (idx + 1)
lr_scheduler.step(epoch_val_loss)
avg_val_loss.append(epoch_val_loss)
print(f"The average validation loss at epoch: {epoch + 1} was {epoch_val_loss}.")
model_name = type(model_3d).__name__
time_stamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
torch.save(model_3d.state_dict(), os.path.join(output_path, f"{model_name}_{time_stamp}.pth"))
print(f"Model weights saved to {output_path} with the name: {model_name}_{time_stamp}.pth")
plot_loss(training_loss_=avg_training_loss,
val_loss_=avg_val_loss,
num_epochs=num_epochs,
path_dice_plot=os.path.join(output_path, f"{model_name}_{time_stamp}.png"))
print(f"The loss graph was saved to {output_path} with the name: {model_name}_{time_stamp}.png")
def train_hybrid2D(model: nn.Module,
optimiser: optim.Optimizer,
loss_fn: nn.Module,
lr_scheduler: optim.lr_scheduler,
training_set_loader: DataLoader,
val_set_loader: DataLoader,
num_epochs: int,
output_path: str) -> None:
model = model.to(get_device())
avg_training_loss, avg_val_loss = list(), list()
for epoch in range(num_epochs):
model.train()
epoch_t_loss, epoch_v_loss, idx = 0., 0., 0
for idx, batch in enumerate(training_set_loader):
optimiser.zero_grad()
input_images = batch[0].to(get_device(), dtype=torch.float32).float()
pred_labels = model(input_images)[0]
ref_labels = batch[1].to(get_device())
loss_val = loss_fn(pred_labels, ref_labels)
loss_val.backward()
optimiser.step()
epoch_t_loss += loss_val.item()
avg_training_loss.append(epoch_t_loss / (idx + 1))
print(f"The average training loss at epoch: {epoch + 1} was {epoch_t_loss / (idx + 1)}.")
with torch.no_grad():
model.eval()
for idx, batch in enumerate(val_set_loader):
input_images = batch[0].to(get_device(), dtype=torch.float32).float()
pred_labels = model(input_images)[0]
ref_labels = batch[1].to(get_device())
loss_val = loss_fn(pred_labels, ref_labels)
epoch_v_loss += loss_val.item()
epoch_val_loss = epoch_v_loss / (idx + 1)
lr_scheduler.step(epoch_val_loss)
avg_val_loss.append(epoch_val_loss)
print(f"The average validation loss at epoch: {epoch + 1} was {epoch_val_loss}.")
save_model_plots(model, output_path, avg_training_loss,
avg_val_loss, num_epochs)
def train_model(model: nn.Module,
optimiser: optim.Optimizer,
loss_fn: nn.Module,
lr_scheduler: optim.lr_scheduler,
training_set_loader: DataLoader,
val_set_loader: DataLoader,
num_epochs: int,
output_path: str) -> None:
model = model.to(get_device())
avg_training_loss, avg_val_loss = list(), list()
for epoch in range(num_epochs):
model.train()
epoch_t_loss, epoch_v_loss, idx = 0., 0., 0
for idx, batch in enumerate(training_set_loader):
optimiser.zero_grad()
input_images = batch[0].to(get_device(), dtype=torch.float32)
pred_labels = model(input_images)
ref_labels = batch[1].to(get_device())
loss_val = loss_fn(pred_labels["seg_results"], ref_labels)
loss_val.backward()
optimiser.step()
del pred_labels, input_images, ref_labels
epoch_t_loss += loss_val.item()
avg_training_loss.append(epoch_t_loss / (idx + 1))
print(f"The average training loss at epoch: {epoch + 1} was {epoch_t_loss / (idx + 1)}.")
with torch.no_grad():
model.eval()
for idx, batch in enumerate(val_set_loader):
input_images = batch[0].to(get_device(), dtype=torch.float32)
pred_labels = model(input_images)
ref_labels = batch[1].to(get_device())
loss_val = loss_fn(pred_labels["seg_results"], ref_labels)
epoch_v_loss += loss_val.item()
epoch_val_loss = epoch_v_loss / (idx + 1)
lr_scheduler.step(epoch_val_loss)
avg_val_loss.append(epoch_val_loss)
print(f"The average validation loss at epoch: {epoch + 1} was {epoch_val_loss}.")
save_model_plots(model, output_path, avg_training_loss,
avg_val_loss, num_epochs)
def save_model_plots(model, output_path, avg_training_loss,
avg_val_loss, num_epochs):
model_name = type(model).__name__
time_stamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
torch.save(model.state_dict(), os.path.join(output_path, f"{model_name}_{time_stamp}.pth"))
print(f"Model weights saved to {output_path} with the name: {model_name}_{time_stamp}.pth")
plot_loss(training_loss_=avg_training_loss,
val_loss_=avg_val_loss,
num_epochs=num_epochs,
path_dice_plot=os.path.join(output_path, f"{model_name}_{time_stamp}.png"))
print(f"The loss graph was saved to {output_path} with the name: {model_name}_{time_stamp}.png")
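All three training functions above share the same loss bookkeeping: accumulate each batch's loss, divide by the number of batches (`idx + 1`), and keep one average per epoch. A torch-free sketch of just that bookkeeping, with the per-batch losses stubbed in as plain floats:

```python
def average_per_epoch(batch_losses_per_epoch):
    """Mirror the epoch_t_loss / (idx + 1) bookkeeping used above:
    one averaged loss per epoch, given per-batch losses."""
    history = []
    for losses in batch_losses_per_epoch:
        total, idx = 0.0, 0
        for idx, loss in enumerate(losses):
            total += loss
        history.append(total / (idx + 1))
    return history

print(average_per_epoch([[0.9, 0.7, 0.5], [0.4, 0.2]]))  # ≈ [0.7, 0.3]
```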
# File: div/sanstitre2.py
# Repo: aymericvie/networked-genetic-algorithms (MIT)
# main(f,p,tau,x,z)
# main (function name, network parameter, number of iterations, network type, network status)
rep = 100
r = 0
sum_mean_fitness = 0
while r < rep:
mean_fitness, fitness_history, time_history, A = main(sphere,1,20,erdos,cons)
sum_mean_fitness += mean_fitness
r += 1
mean_mean_fitness = sum_mean_fitness / r  # r == rep after the loop; /(r+1) divided rep samples by rep+1
print("avg fitness for p1")
print(mean_mean_fitness)
r = 0
sum_mean_fitness = 0
while r < rep:
mean_fitness, fitness_history, time_history, A = main(sphere,0,20,erdos,cons)
sum_mean_fitness += mean_fitness
r += 1
mean_mean_fitness = sum_mean_fitness / r  # r == rep after the loop; /(r+1) divided rep samples by rep+1
print("avg fitness for p0")
print(mean_mean_fitness)
#############
r = 0
sum_mean_fitness = 0
while r < rep:
mean_fitness, fitness_history, time_history, A = main(ackley,1,20,erdos,cons)
sum_mean_fitness += mean_fitness
r += 1
mean_mean_fitness = sum_mean_fitness / r  # r == rep after the loop; /(r+1) divided rep samples by rep+1
print("avg fitness for p1")
print(mean_mean_fitness)
r = 0
sum_mean_fitness = 0
while r < rep:
mean_fitness, fitness_history, time_history, A = main(ackley,0,20,erdos,cons)
sum_mean_fitness += mean_fitness
r += 1
mean_mean_fitness = sum_mean_fitness / r  # r == rep after the loop; /(r+1) divided rep samples by rep+1
print("avg fitness for p0")
print(mean_mean_fitness)
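The four blocks above repeat the same averaging loop with different arguments. A hedged refactor sketch — `main`, `sphere`, `ackley`, `erdos` and `cons` come from the surrounding project and are stood in for here by a zero-argument callable:

```python
def averaged_fitness(run_once, reps=100):
    """Average mean_fitness over reps independent runs.

    run_once stands in for a concrete main(f, p, tau, x, z) call
    returning mean_fitness as its first value."""
    total = 0.0
    for _ in range(reps):
        total += run_once()
    return total / reps

# Stub in place of e.g. main(sphere, 1, 20, erdos, cons)[0]:
print(averaged_fitness(lambda: 0.5, reps=10))  # → 0.5
```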
# File: switchbot_client/devices/__init__.py
# Repo: kzosabe/switchbot-client (Apache-2.0, MIT)
from .base import *  # noqa
from .factory import * # noqa
from .physical import * # noqa
from .remote import * # noqa
from .status import * # noqa
# File: docparse.py
# Repo: akinorioyama/xmlcheck (MIT)
from xml.etree import ElementTree
import xml.etree.ElementTree as ET
import os
xml_original_filename = "a.xml"
def register_all_namespaces(filename):
namespaces = dict([node for _, node in ET.iterparse(filename, events=['start-ns'])])
for ns in namespaces:
if ( ns == "w14" or ns == "w15" or ns == "w16se" or ns == "w16cid" or
ns == "w16" or ns == "w16cex" or ns == "wp14"):
print("registering (skipped?):", ns)
else:
print("registering:", ns)
try:
ET.register_namespace(ns, namespaces[ns])
except ValueError as e:
print(e)
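A standalone illustration of what `ET.register_namespace` buys at serialization time — a registered prefix is reused instead of ElementTree's auto-generated `ns0`. The `demo` namespace below is made up; it is unrelated to the WordprocessingML namespaces handled above:

```python
import xml.etree.ElementTree as ET

ET.register_namespace("demo", "http://example.com/demo")
elem = ET.Element("{http://example.com/demo}root")
xml_bytes = ET.tostring(elem)
print(xml_bytes)  # the registered "demo" prefix appears instead of "ns0"
```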
if __name__ == '__main__':
with open(xml_original_filename, "r",encoding="utf-8") as f:
xml = f.read()
root = ElementTree.fromstring(xml)
register_all_namespaces(xml_original_filename)
    # Top-level element: tag and contents
print(root.tag,root.attrib)
# ET.register_namespace('Relationships',"http://schemas.openxmlformats.org/package/2006/relationships")
    # Child elements: tags and contents
list_content_types = []
for child in root:
print(child.tag, child.attrib)
filepath = "./out"+format(child.attrib['{http://schemas.microsoft.com/office/2006/xmlPackage}name'] )
xml_name = child.attrib['{http://schemas.microsoft.com/office/2006/xmlPackage}name']
xml_contentType = child.attrib['{http://schemas.microsoft.com/office/2006/xmlPackage}contentType']
list_content_types.append([xml_name,xml_contentType])
dirname, basename = os.path.split(filepath)
try:
os.makedirs(dirname)
except FileExistsError as e:
            pass  # the directory already exists; nothing to do
            # print(e.strerror)  # error message ('Cannot create a file when that file already exists')
            # print(e.errno)  # error number (17)
            # print(e.filename)  # name of the directory that could not be created ('foo')
print(xml_name + " CT:" + xml_contentType )
with open(filepath, 'wb') as f:
if xml_contentType != 'application/xml':
f.write("<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\n".encode('utf-8'))
else:
f.write("<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>\n".encode('utf-8'))
for child1 in child:
for child2 in child1:
a = ET.tostring(child2)
f.write(ET.tostring(child2, short_empty_elements=False))
with open("./out/[Content_Types].xml", 'w') as f:
f.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n")
f.write('<Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">')
f.write('<Default Extension="png" ContentType="image/png"/>')
f.write('<Default Extension="rels" ContentType="application/vnd.openxmlformats-package.relationships+xml"/>')
f.write('<Default Extension="xml" ContentType="application/xml"/>')
for content_type in list_content_types:
if ( content_type[1] != 'application/xml' and
content_type[1] != 'application/vnd.openxmlformats-package.relationships+xml'):
f.write('<Override PartName="' + content_type[0] +
'" ContentType="' + content_type[1] + '"/>')
f.write('</Types>')
# < Override PartName = "/word/document.xml" ContentType = "application/vnd.openxmlformats-officedocument.wordprocessingml.document.main+xml" / >
#
# for name in root.iter('w:t'):
# print(name.text)
# {http://schemas.microsoft.com/office/2006/xmlPackage}package {}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/_rels/.rels', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-package.relationships+xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}padding': '512'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/document.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.wordprocessingml.document.main+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/_rels/document.xml.rels', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-package.relationships+xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}padding': '256'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/footnotes.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.wordprocessingml.footnotes+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/endnotes.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.wordprocessingml.endnotes+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/header2.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.wordprocessingml.header+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/footer2.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.wordprocessingml.footer+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/header1.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.wordprocessingml.header+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/footer1.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.wordprocessingml.footer+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/_rels/header1.xml.rels', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-package.relationships+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/theme/theme1.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.theme+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/media/image1.png', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'image/png', '{http://schemas.microsoft.com/office/2006/xmlPackage}compression': 'store'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/word/settings.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.wordprocessingml.settings+xml'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/customXml/item1.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}padding': '32'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/customXml/itemProps1.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.customXmlProperties+xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}padding': '32'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/customXml/item2.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}padding': '32'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/customXml/itemProps2.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.customXmlProperties+xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}padding': '32'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/customXml/item3.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}padding': '32'}
# {http://schemas.microsoft.com/office/2006/xmlPackage}part {'{http://schemas.microsoft.com/office/2006/xmlPackage}name': '/customXml/itemProps3.xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}contentType': 'application/vnd.openxmlformats-officedocument.customXmlProperties+xml', '{http://schemas.microsoft.com/office/2006/xmlPackage}padding': '32'}
import os.path as osp
import time
import numpy as np
from rlkit.samplers.data_collector import VAEWrappedEnvPathCollector
from rlkit.visualization.video import VideoSaveFunction
from rlkit.torch.her.her import HERTrainer
from rlkit.torch.sac.policies import MakeDeterministic
from rlkit.torch.sac.sac import SACTrainer
from rlkit.torch.vae.online_vae_algorithm import OnlineVaeAlgorithm
def grill_tdm_td3_full_experiment(variant):
full_experiment_variant_preprocess(variant)
if not variant['grill_variant'].get('do_state_exp', False):
train_vae_and_update_variant(variant)
grill_tdm_td3_experiment(variant['grill_variant'])
def grill_tdm_twin_sac_full_experiment(variant):
full_experiment_variant_preprocess(variant)
train_vae_and_update_variant(variant)
grill_tdm_twin_sac_experiment(variant['grill_variant'])
def grill_her_twin_sac_full_experiment(variant):
full_experiment_variant_preprocess(variant)
if not variant['grill_variant'].get('do_state_exp', False):
train_vae_and_update_variant(variant)
grill_her_twin_sac_experiment(variant['grill_variant'])
def grill_her_td3_full_experiment(variant):
full_experiment_variant_preprocess(variant)
if not variant['grill_variant'].get('do_state_exp', False):
train_vae_and_update_variant(variant)
grill_her_td3_experiment(variant['grill_variant'])
def grill_her_td3_online_vae_full_experiment(variant):
variant['grill_variant']['save_vae_data'] = True
full_experiment_variant_preprocess(variant)
train_vae_and_update_variant(variant)
# variant['grill_variant']['vae_trainer_kwargs'] = \
# variant['train_vae_variant']['algo_kwargs']
# if variant['double_algo']:
# grill_her_td3_experiment_online_vae_exploring(variant['grill_variant'])
# else:
grill_her_td3_experiment_online_vae(variant['grill_variant'])
def grill_her_td3_offpolicy_online_vae_full_experiment(variant):
variant['grill_variant']['save_vae_data'] = True
full_experiment_variant_preprocess(variant)
train_vae_and_update_variant(variant)
# variant['grill_variant']['vae_trainer_kwargs'] = \
# variant['train_vae_variant']['algo_kwargs']
# if variant['double_algo']:
# grill_her_td3_experiment_online_vae_exploring(variant['grill_variant'])
# else:
grill_her_td3_experiment_offpolicy_online_vae(variant['grill_variant'])
def grill_her_twin_sac_online_vae_full_experiment(variant):
variant['grill_variant']['save_vae_data'] = True
full_experiment_variant_preprocess(variant)
train_vae_and_update_variant(variant)
grill_her_twin_sac_experiment_online_vae(variant['grill_variant'])
def arl_full_experiment(variant):
variant['grill_variant']['save_vae_data'] = True
full_experiment_variant_preprocess(variant)
active_representation_learning_experiment(variant['grill_variant'])
def grill_tdm_td3_online_vae_full_experiment(variant):
variant['grill_variant']['save_vae_data'] = True
variant['grill_variant']['vae_trainer_kwargs'] = \
variant['train_vae_variant']['algo_kwargs']
full_experiment_variant_preprocess(variant)
train_vae_and_update_variant(variant)
grill_tdm_td3_experiment_online_vae(variant['grill_variant'])
def HER_baseline_her_td3_full_experiment(variant):
full_experiment_variant_preprocess(variant)
HER_baseline_her_td3_experiment(variant['grill_variant'])
def HER_baseline_twin_sac_full_experiment(variant):
full_experiment_variant_preprocess(variant)
HER_baseline_twin_sac_experiment(variant['grill_variant'])
def full_experiment_variant_preprocess(variant):
train_vae_variant = variant['train_vae_variant']
grill_variant = variant['grill_variant']
if 'env_id' in variant:
assert 'env_class' not in variant
env_id = variant['env_id']
grill_variant['env_id'] = env_id
train_vae_variant['generate_vae_dataset_kwargs']['env_id'] = env_id
else:
env_class = variant['env_class']
env_kwargs = variant['env_kwargs']
train_vae_variant['generate_vae_dataset_kwargs']['env_class'] = (
env_class
)
train_vae_variant['generate_vae_dataset_kwargs']['env_kwargs'] = (
env_kwargs
)
grill_variant['env_class'] = env_class
grill_variant['env_kwargs'] = env_kwargs
init_camera = variant.get('init_camera', None)
imsize = variant.get('imsize', 84)
train_vae_variant['generate_vae_dataset_kwargs']['init_camera'] = (
init_camera
)
train_vae_variant['generate_vae_dataset_kwargs']['imsize'] = imsize
train_vae_variant['imsize'] = imsize
grill_variant['imsize'] = imsize
grill_variant['init_camera'] = init_camera
def train_vae_and_update_variant(variant):
from rlkit.core import logger
grill_variant = variant['grill_variant']
train_vae_variant = variant['train_vae_variant']
if grill_variant.get('vae_path', None) is None:
logger.remove_tabular_output(
'progress.csv', relative_to_snapshot_dir=True
)
logger.add_tabular_output(
'vae_progress.csv', relative_to_snapshot_dir=True
)
vae, vae_train_data, vae_test_data = train_vae(train_vae_variant,
return_data=True)
if grill_variant.get('save_vae_data', False):
grill_variant['vae_train_data'] = vae_train_data
grill_variant['vae_test_data'] = vae_test_data
logger.save_extra_data(vae, 'vae.pkl', mode='pickle')
logger.remove_tabular_output(
'vae_progress.csv',
relative_to_snapshot_dir=True,
)
logger.add_tabular_output(
'progress.csv',
relative_to_snapshot_dir=True,
)
grill_variant['vae_path'] = vae # just pass the VAE directly
else:
if grill_variant.get('save_vae_data', False):
vae_train_data, vae_test_data, info = generate_vae_dataset(
train_vae_variant['generate_vae_dataset_kwargs']
)
grill_variant['vae_train_data'] = vae_train_data
grill_variant['vae_test_data'] = vae_test_data
def train_vae(variant, return_data=False):
from rlkit.misc.ml_util import PiecewiseLinearSchedule
from rlkit.torch.vae.vae_trainer import ConvVAETrainer
from rlkit.core import logger
beta = variant["beta"]
use_linear_dynamics = variant.get('use_linear_dynamics', False)
generate_vae_dataset_fctn = variant.get('generate_vae_data_fctn',
generate_vae_dataset)
variant['generate_vae_dataset_kwargs']['use_linear_dynamics'] = use_linear_dynamics
train_dataset, test_dataset, info = generate_vae_dataset_fctn(
variant['generate_vae_dataset_kwargs'])
if use_linear_dynamics:
action_dim = train_dataset.data['actions'].shape[2]
else:
action_dim = 0
model = get_vae(variant, action_dim)
logger.save_extra_data(info)
logger.get_snapshot_dir()
if 'beta_schedule_kwargs' in variant:
beta_schedule = PiecewiseLinearSchedule(
**variant['beta_schedule_kwargs'])
else:
beta_schedule = None
vae_trainer_class = variant.get('vae_trainer_class', ConvVAETrainer)
trainer = vae_trainer_class(model, beta=beta,
beta_schedule=beta_schedule, **variant['algo_kwargs'])
save_period = variant['save_period']
dump_skew_debug_plots = variant.get('dump_skew_debug_plots', False)
for epoch in range(variant['num_epochs']):
should_save_imgs = (epoch % save_period == 0)
trainer.train_epoch(epoch, train_dataset)
trainer.test_epoch(epoch, test_dataset)
if should_save_imgs:
trainer.dump_reconstructions(epoch)
trainer.dump_samples(epoch)
if dump_skew_debug_plots:
trainer.dump_best_reconstruction(epoch)
trainer.dump_worst_reconstruction(epoch)
trainer.dump_sampling_histogram(epoch)
stats = trainer.get_diagnostics()
for k, v in stats.items():
logger.record_tabular(k, v)
logger.dump_tabular()
trainer.end_epoch(epoch)
if epoch % 50 == 0:
logger.save_itr_params(epoch, model)
logger.save_extra_data(model, 'vae.pkl', mode='pickle')
if return_data:
return model, train_dataset, test_dataset
return model
def get_vae(variant, action_dim):
from rlkit.torch.vae.conv_vae import (
ConvVAE,
SpatialAutoEncoder,
AutoEncoder,
)
import rlkit.torch.vae.conv_vae as conv_vae
import rlkit.torch.pytorch_util as ptu
from rlkit.pythonplusplus import identity
import torch
representation_size = variant["representation_size"]
use_linear_dynamics = variant.get('use_linear_dynamics', False)
if variant.get('decoder_activation', None) == 'sigmoid':
decoder_activation = torch.nn.Sigmoid()
else:
decoder_activation = identity
architecture = variant['vae_kwargs'].get('architecture', None)
if not architecture and variant.get('imsize') == 84:
architecture = conv_vae.imsize84_default_architecture
elif not architecture and variant.get('imsize') == 48:
architecture = conv_vae.imsize48_default_architecture
variant['vae_kwargs']['architecture'] = architecture
variant['vae_kwargs']['imsize'] = variant.get('imsize')
if variant['algo_kwargs'].get('is_auto_encoder', False):
model = AutoEncoder(representation_size, decoder_output_activation=decoder_activation,**variant['vae_kwargs'])
elif variant.get('use_spatial_auto_encoder', False):
model = SpatialAutoEncoder(representation_size, decoder_output_activation=decoder_activation,**variant['vae_kwargs'])
else:
vae_class = variant.get('vae_class', ConvVAE)
if use_linear_dynamics:
model = vae_class(representation_size, decoder_output_activation=decoder_activation, action_dim=action_dim,**variant['vae_kwargs'])
else:
model = vae_class(representation_size, decoder_output_activation=decoder_activation,**variant['vae_kwargs'])
model.to(ptu.device)
return model
def generate_vae_dataset(variant):
print(variant)
env_class = variant.get('env_class', None)
env_kwargs = variant.get('env_kwargs',None)
env_id = variant.get('env_id', None)
N = variant.get('N', 10000)
    test_p = variant.get('test_p', 0.9)  # despite the name, this is the fraction kept for *training*
use_cached = variant.get('use_cached', True)
imsize = variant.get('imsize', 84)
num_channels = variant.get('num_channels', 3)
show = variant.get('show', False)
init_camera = variant.get('init_camera', None)
dataset_path = variant.get('dataset_path', None)
oracle_dataset_using_set_to_goal = variant.get('oracle_dataset_using_set_to_goal', False)
random_rollout_data = variant.get('random_rollout_data', False)
random_rollout_data_set_to_goal = variant.get('random_rollout_data_set_to_goal', True)
random_and_oracle_policy_data=variant.get('random_and_oracle_policy_data', False)
random_and_oracle_policy_data_split=variant.get('random_and_oracle_policy_data_split', 0)
policy_file = variant.get('policy_file', None)
n_random_steps = variant.get('n_random_steps', 100)
vae_dataset_specific_env_kwargs = variant.get('vae_dataset_specific_env_kwargs', None)
save_file_prefix = variant.get('save_file_prefix', None)
non_presampled_goal_img_is_garbage = variant.get('non_presampled_goal_img_is_garbage', None)
conditional_vae_dataset = variant.get('conditional_vae_dataset', False)
use_env_labels = variant.get('use_env_labels', False)
use_linear_dynamics = variant.get('use_linear_dynamics', False)
    enviorment_dataset = variant.get('enviorment_dataset', False)  # key misspelled upstream; kept for config compatibility
save_trajectories = variant.get('save_trajectories', False)
save_trajectories = save_trajectories or use_linear_dynamics or conditional_vae_dataset
tag = variant.get('tag', '')
from multiworld.core.image_env import ImageEnv, unormalize_image
import rlkit.torch.pytorch_util as ptu
from rlkit.misc.asset_loader import load_local_or_remote_file
from rlkit.data_management.dataset import \
TrajectoryDataset, ImageObservationDataset, InitialObservationDataset, EnvironmentDataset, ConditionalDynamicsDataset
info = {}
if dataset_path is not None:
dataset = load_local_or_remote_file(dataset_path)
dataset = dataset.item()
N = dataset['observations'].shape[0] * dataset['observations'].shape[1]
n_random_steps = dataset['observations'].shape[1]
else:
if env_kwargs is None:
env_kwargs = {}
if save_file_prefix is None:
save_file_prefix = env_id
if save_file_prefix is None:
save_file_prefix = env_class.__name__
filename = "/tmp/{}_N{}_{}_imsize{}_random_oracle_split_{}{}.npy".format(
save_file_prefix,
str(N),
init_camera.__name__ if init_camera and hasattr(init_camera, '__name__') else '',
imsize,
random_and_oracle_policy_data_split,
tag,
)
if use_cached and osp.isfile(filename):
            dataset = np.load(filename, allow_pickle=True)
if conditional_vae_dataset:
dataset = dataset.item()
print("loaded data from saved file", filename)
else:
now = time.time()
if env_id is not None:
import gym
import multiworld
multiworld.register_all_envs()
env = gym.make(env_id)
else:
if vae_dataset_specific_env_kwargs is None:
vae_dataset_specific_env_kwargs = {}
for key, val in env_kwargs.items():
if key not in vae_dataset_specific_env_kwargs:
vae_dataset_specific_env_kwargs[key] = val
env = env_class(**vae_dataset_specific_env_kwargs)
if not isinstance(env, ImageEnv):
env = ImageEnv(
env,
imsize,
init_camera=init_camera,
transpose=True,
normalize=True,
non_presampled_goal_img_is_garbage=non_presampled_goal_img_is_garbage,
)
else:
imsize = env.imsize
env.non_presampled_goal_img_is_garbage = non_presampled_goal_img_is_garbage
env.reset()
info['env'] = env
if random_and_oracle_policy_data:
policy_file = load_local_or_remote_file(policy_file)
policy = policy_file['policy']
policy.to(ptu.device)
if random_rollout_data:
from rlkit.exploration_strategies.ou_strategy import OUStrategy
policy = OUStrategy(env.action_space)
if save_trajectories:
dataset = {
'observations': np.zeros((N // n_random_steps, n_random_steps, imsize * imsize * num_channels), dtype=np.uint8),
                    'actions': np.zeros((N // n_random_steps, n_random_steps, env.action_space.shape[0]), dtype=np.float64),
'env': np.zeros((N // n_random_steps, imsize * imsize * num_channels), dtype=np.uint8),
}
else:
dataset = np.zeros((N, imsize * imsize * num_channels), dtype=np.uint8)
labels = []
for i in range(N):
if random_and_oracle_policy_data:
num_random_steps = int(N*random_and_oracle_policy_data_split)
if i < num_random_steps:
env.reset()
for _ in range(n_random_steps):
obs = env.step(env.action_space.sample())[0]
else:
obs = env.reset()
policy.reset()
for _ in range(n_random_steps):
policy_obs = np.hstack((
obs['state_observation'],
obs['state_desired_goal'],
))
action, _ = policy.get_action(policy_obs)
obs, _, _, _ = env.step(action)
                elif random_rollout_data:  # add data where only the puck moves
if i % n_random_steps == 0:
env.reset()
policy.reset()
env_img = env._get_obs()['image_observation']
if random_rollout_data_set_to_goal:
env.set_to_goal(env.get_goal())
obs = env._get_obs()
u = policy.get_action_from_raw_action(env.action_space.sample())
env.step(u)
elif oracle_dataset_using_set_to_goal:
print(i)
goal = env.sample_goal()
env.set_to_goal(goal)
obs = env._get_obs()
else:
env.reset()
for _ in range(n_random_steps):
obs = env.step(env.action_space.sample())[0]
img = obs['image_observation']
if use_env_labels:
labels.append(obs['label'])
if save_trajectories:
dataset['observations'][i // n_random_steps, i % n_random_steps, :] = unormalize_image(img)
dataset['actions'][i // n_random_steps, i % n_random_steps, :] = u
dataset['env'][i // n_random_steps, :] = unormalize_image(env_img)
else:
dataset[i, :] = unormalize_image(img)
if show:
import cv2
img = img.reshape(3, imsize, imsize).transpose()
img = img[::-1, :, ::-1]
cv2.imshow('img', img)
cv2.waitKey(1)
# radius = input('waiting...')
print("done making training data", filename, time.time() - now)
np.save(filename, dataset)
# np.save(filename[:-4] + 'labels.npy', np.array(labels))
info['train_labels'] = []
info['test_labels'] = []
if use_linear_dynamics and conditional_vae_dataset:
num_trajectories = N // n_random_steps
n = int(num_trajectories * test_p)
indices = np.arange(num_trajectories)
np.random.shuffle(indices)
train_i, test_i = indices[:n], indices[n:]
try:
train_dataset = ConditionalDynamicsDataset({
'observations': dataset['observations'][train_i, :, :],
'actions': dataset['actions'][train_i, :, :],
'env': dataset['env'][train_i, :]
})
test_dataset = ConditionalDynamicsDataset({
'observations': dataset['observations'][test_i, :, :],
'actions': dataset['actions'][test_i, :, :],
'env': dataset['env'][test_i, :]
})
        except KeyError:  # dataset may not contain an 'env' key
train_dataset = ConditionalDynamicsDataset({
'observations': dataset['observations'][train_i, :, :],
'actions': dataset['actions'][train_i, :, :],
})
test_dataset = ConditionalDynamicsDataset({
'observations': dataset['observations'][test_i, :, :],
'actions': dataset['actions'][test_i, :, :],
})
elif use_linear_dynamics:
num_trajectories = N // n_random_steps
n = int(num_trajectories * test_p)
train_dataset = TrajectoryDataset({
'observations': dataset['observations'][:n, :, :],
'actions': dataset['actions'][:n, :, :]
})
test_dataset = TrajectoryDataset({
'observations': dataset['observations'][n:, :, :],
'actions': dataset['actions'][n:, :, :]
})
elif enviorment_dataset:
n = int(n_random_steps * test_p)
train_dataset = EnvironmentDataset({
'observations': dataset['observations'][:, :n, :],
})
test_dataset = EnvironmentDataset({
'observations': dataset['observations'][:, n:, :],
})
elif conditional_vae_dataset:
num_trajectories = N // n_random_steps
n = int(num_trajectories * test_p)
indices = np.arange(num_trajectories)
np.random.shuffle(indices)
train_i, test_i = indices[:n], indices[n:]
try:
train_dataset = InitialObservationDataset({
'observations': dataset['observations'][train_i, :, :],
'env': dataset['env'][train_i, :]
})
test_dataset = InitialObservationDataset({
'observations': dataset['observations'][test_i, :, :],
'env': dataset['env'][test_i, :]
})
        except KeyError:  # dataset may not contain an 'env' key
train_dataset = InitialObservationDataset({
'observations': dataset['observations'][train_i, :, :],
})
test_dataset = InitialObservationDataset({
'observations': dataset['observations'][test_i, :, :],
})
else:
n = int(N * test_p)
train_dataset = ImageObservationDataset(dataset[:n, :])
test_dataset = ImageObservationDataset(dataset[n:, :])
return train_dataset, test_dataset, info
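# The shuffled trajectory train/test split used above can be sketched in
# isolation. This is a minimal, self-contained sketch: the array shapes and
# the convention that `test_p` is the fraction kept for training follow the
# code above; `split_trajectories` is a hypothetical helper, not part of rlkit.

```python
import numpy as np

def split_trajectories(observations, test_p=0.9, seed=0):
    """Shuffle trajectories, then keep the first test_p fraction for
    training and the remainder for testing (mirroring the split above)."""
    num_trajectories = observations.shape[0]
    n = int(num_trajectories * test_p)
    rng = np.random.default_rng(seed)
    indices = rng.permutation(num_trajectories)
    train_i, test_i = indices[:n], indices[n:]
    return observations[train_i], observations[test_i]

# 10 trajectories, 5 steps each, 4-dim observations
data = np.arange(10 * 5 * 4).reshape(10, 5, 4)
train, test = split_trajectories(data, test_p=0.9)
```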
def get_presampled_goals_path(path=''):
"""
:param path: if relative, this will rpe
:param config: One of a few options:
- string: the path to the
- tuple of two strings: the first string specifies the 'mode' and the
second string specifies extra parameters to that mode
- None: return None
:return: Path to the presampled goals, or None.
"""
if not path:
return path
if path[0] == '/':
return path
else:
import multiworld.envs.mujoco as mwmj
return osp.join(osp.dirname(mwmj.__file__), path)
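# The resolution rule above can be illustrated without multiworld installed.
# A minimal sketch; `resolve_goals_path` and the anchor directory are
# hypothetical stand-ins for get_presampled_goals_path and the
# multiworld.envs.mujoco package directory.

```python
import os.path as osp

def resolve_goals_path(path, anchor_dir='/opt/multiworld/envs/mujoco'):
    """Falsy and absolute paths pass through unchanged; relative paths
    are joined onto the anchor directory."""
    if not path:
        return path
    if path[0] == '/':
        return path
    return osp.join(anchor_dir, path)
```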
def get_envs(variant):
from multiworld.core.image_env import ImageEnv
from rlkit.envs.vae_wrappers import VAEWrappedEnv, ConditionalVAEWrappedEnv
from rlkit.misc.asset_loader import load_local_or_remote_file
from rlkit.torch.vae.conditional_conv_vae import CVAE, CDVAE, ACE, CADVAE, DeltaCVAE
render = variant.get('render', False)
vae_path = variant.get("vae_path", None)
reward_params = variant.get("reward_params", dict())
init_camera = variant.get("init_camera", None)
do_state_exp = variant.get("do_state_exp", False)
presample_goals = variant.get('presample_goals', False)
presample_image_goals_only = variant.get('presample_image_goals_only', False)
presampled_goals_path = get_presampled_goals_path(
variant.get('presampled_goals_path', None))
    vae = load_local_or_remote_file(vae_path) if isinstance(vae_path, str) else vae_path
if 'env_id' in variant:
import gym
import multiworld
multiworld.register_all_envs()
env = gym.make(variant['env_id'])
else:
env = variant["env_class"](**variant['env_kwargs'])
if not do_state_exp:
if isinstance(env, ImageEnv):
image_env = env
else:
image_env = ImageEnv(
env,
variant.get('imsize'),
init_camera=init_camera,
transpose=True,
normalize=True,
)
if presample_goals:
"""
This will fail for online-parallel as presampled_goals will not be
serialized. Also don't use this for online-vae.
"""
if presampled_goals_path is None:
image_env.non_presampled_goal_img_is_garbage = True
vae_env = VAEWrappedEnv(
image_env,
vae,
imsize=image_env.imsize,
decode_goals=render,
render_goals=render,
render_rollouts=render,
reward_params=reward_params,
**variant.get('vae_wrapped_env_kwargs', {})
)
presampled_goals = variant['generate_goal_dataset_fctn'](
env=vae_env,
env_id=variant.get('env_id', None),
**variant['goal_generation_kwargs']
)
del vae_env
else:
presampled_goals = load_local_or_remote_file(
presampled_goals_path
).item()
del image_env
image_env = ImageEnv(
env,
variant.get('imsize'),
init_camera=init_camera,
transpose=True,
normalize=True,
presampled_goals=presampled_goals,
**variant.get('image_env_kwargs', {})
)
vae_env = VAEWrappedEnv(
image_env,
vae,
imsize=image_env.imsize,
decode_goals=render,
render_goals=render,
render_rollouts=render,
reward_params=reward_params,
                presampled_goals=presampled_goals,
**variant.get('vae_wrapped_env_kwargs', {})
)
print("Presampling all goals only")
else:
            if isinstance(vae, (CVAE, CDVAE, ACE, CADVAE, DeltaCVAE)):
vae_env = ConditionalVAEWrappedEnv(
image_env,
vae,
imsize=image_env.imsize,
decode_goals=render,
render_goals=render,
render_rollouts=render,
reward_params=reward_params,
**variant.get('vae_wrapped_env_kwargs', {})
)
else:
vae_env = VAEWrappedEnv(
image_env,
vae,
imsize=image_env.imsize,
decode_goals=render,
render_goals=render,
render_rollouts=render,
reward_params=reward_params,
**variant.get('vae_wrapped_env_kwargs', {})
)
if presample_image_goals_only:
presampled_goals = variant['generate_goal_dataset_fctn'](
image_env=vae_env.wrapped_env,
**variant['goal_generation_kwargs']
)
image_env.set_presampled_goals(presampled_goals)
print("Presampling image goals only")
else:
print("Not using presampled goals")
env = vae_env
return env
def get_exploration_strategy(variant, env):
from rlkit.exploration_strategies.epsilon_greedy import EpsilonGreedy
from rlkit.exploration_strategies.gaussian_strategy import GaussianStrategy
from rlkit.exploration_strategies.ou_strategy import OUStrategy
from rlkit.exploration_strategies.noop import NoopStrategy
exploration_type = variant['exploration_type']
exploration_noise = variant.get('exploration_noise', 0.1)
if exploration_type == 'ou':
es = OUStrategy(
action_space=env.action_space,
max_sigma=exploration_noise,
min_sigma=exploration_noise, # Constant sigma
)
elif exploration_type == 'gaussian':
es = GaussianStrategy(
action_space=env.action_space,
max_sigma=exploration_noise,
min_sigma=exploration_noise, # Constant sigma
)
elif exploration_type == 'epsilon':
es = EpsilonGreedy(
action_space=env.action_space,
prob_random_action=exploration_noise,
)
elif exploration_type == 'noop':
es = NoopStrategy(
action_space=env.action_space
)
else:
        raise ValueError("Invalid exploration type: " + exploration_type)
return es
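# For reference, a constant-sigma Gaussian strategy (max_sigma == min_sigma,
# as configured above) amounts to adding fixed-scale Gaussian noise to the
# policy action and clipping to the action bounds. A minimal stand-alone
# sketch; ConstantGaussianNoise only approximates rlkit's GaussianStrategy
# interface and is not part of rlkit.

```python
import numpy as np

class ConstantGaussianNoise:
    """Add N(0, sigma^2) noise to an action, then clip to [low, high]."""
    def __init__(self, low, high, sigma=0.1, seed=0):
        self.low = np.asarray(low, dtype=np.float64)
        self.high = np.asarray(high, dtype=np.float64)
        self.sigma = sigma
        self.rng = np.random.default_rng(seed)

    def get_action(self, action):
        noise = self.rng.normal(0.0, self.sigma, size=np.shape(action))
        return np.clip(action + noise, self.low, self.high)

es = ConstantGaussianNoise(low=[-1.0, -1.0], high=[1.0, 1.0], sigma=0.1)
noisy_action = es.get_action(np.array([0.9, -0.9]))
```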
def grill_preprocess_variant(variant):
if variant.get("do_state_exp", False):
variant['observation_key'] = 'state_observation'
variant['desired_goal_key'] = 'state_desired_goal'
        variant['achieved_goal_key'] = 'state_achieved_goal'
def grill_her_td3_experiment(variant):
import rlkit.samplers.rollout_functions as rf
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.obs_dict_replay_buffer import \
ObsDictRelabelingBuffer
from rlkit.exploration_strategies.base import (
PolicyWrappedWithExplorationStrategy
)
from rlkit.torch.her.her_td3 import HerTd3
from rlkit.torch.networks import ConcatMlp, TanhMlpPolicy
grill_preprocess_variant(variant)
env = get_envs(variant)
es = get_exploration_strategy(variant, env)
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
+ env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs']
)
qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs']
)
policy = TanhMlpPolicy(
input_size=obs_dim,
output_size=action_dim,
**variant['policy_kwargs']
)
target_qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs']
)
target_qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs']
)
target_policy = TanhMlpPolicy(
input_size=obs_dim,
output_size=action_dim,
**variant['policy_kwargs']
)
exploration_policy = PolicyWrappedWithExplorationStrategy(
exploration_strategy=es,
policy=policy,
)
replay_buffer = ObsDictRelabelingBuffer(
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
algo_kwargs = variant['algo_kwargs']
algo_kwargs['replay_buffer'] = replay_buffer
base_kwargs = algo_kwargs['base_kwargs']
base_kwargs['training_env'] = env
base_kwargs['render'] = variant["render"]
base_kwargs['render_during_eval'] = variant["render"]
her_kwargs = algo_kwargs['her_kwargs']
her_kwargs['observation_key'] = observation_key
her_kwargs['desired_goal_key'] = desired_goal_key
algorithm = HerTd3(
env,
qf1=qf1,
qf2=qf2,
policy=policy,
target_qf1=target_qf1,
target_qf2=target_qf2,
target_policy=target_policy,
exploration_policy=exploration_policy,
**variant['algo_kwargs']
)
if variant.get("save_video", True):
rollout_function = rf.create_rollout_function(
rf.multitask_rollout,
max_path_length=algorithm.max_path_length,
observation_key=algorithm.observation_key,
desired_goal_key=algorithm.desired_goal_key,
)
video_func = get_video_save_func(
rollout_function,
env,
algorithm.eval_policy,
variant,
)
algorithm.post_train_funcs.append(video_func)
algorithm.to(ptu.device)
if not variant.get("do_state_exp", False):
env.vae.to(ptu.device)
algorithm.train()
def grill_her_twin_sac_experiment(variant):
import rlkit.samplers.rollout_functions as rf
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.obs_dict_replay_buffer import \
ObsDictRelabelingBuffer
# from rlkit.torch.her.her_twin_sac import HerTwinSAC
from rlkit.torch.networks import ConcatMlp
from rlkit.torch.sac.policies import TanhGaussianPolicy
from rlkit.torch.torch_rl_algorithm import TorchOnlineRLAlgorithm
grill_preprocess_variant(variant)
env = get_envs(variant)
es = get_exploration_strategy(variant, env)
max_path_length = variant['max_path_length']
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
+ env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs']
)
qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs']
)
target_qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs']
)
target_qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs']
)
policy = TanhGaussianPolicy(
obs_dim=obs_dim,
action_dim=action_dim,
**variant['policy_kwargs']
)
replay_buffer = ObsDictRelabelingBuffer(
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
algo_kwargs = variant['algo_kwargs']
# algo_kwargs['replay_buffer'] = replay_buffer
# base_kwargs = algo_kwargs['base_kwargs']
# base_kwargs['training_env'] = env
# base_kwargs['render'] = variant["render"]
# base_kwargs['render_during_eval'] = variant["render"]
# her_kwargs = algo_kwargs['her_kwargs']
# her_kwargs['observation_key'] = observation_key
# her_kwargs['desired_goal_key'] = desired_goal_key
# algorithm = HerTwinSAC(
# env,
# qf1=qf1,
# qf2=qf2,
# vf=vf,
# target_vf=target_vf,
# policy=policy,
# exploration_policy=exploration_policy,
# **variant['algo_kwargs']
# )
trainer = SACTrainer(
env=env,
policy=policy,
qf1=qf1,
qf2=qf2,
target_qf1=target_qf1,
target_qf2=target_qf2,
**variant['twin_sac_trainer_kwargs']
)
trainer = HERTrainer(trainer)
eval_path_collector = VAEWrappedEnvPathCollector(
env,
MakeDeterministic(policy),
max_path_length,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
goal_sampling_mode=variant['evaluation_goal_sampling_mode'],
)
expl_path_collector = VAEWrappedEnvPathCollector(
env,
policy,
max_path_length,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
goal_sampling_mode=variant['exploration_goal_sampling_mode'],
)
algorithm = TorchOnlineRLAlgorithm(
trainer=trainer,
exploration_env=env,
evaluation_env=env,
exploration_data_collector=expl_path_collector,
evaluation_data_collector=eval_path_collector,
replay_buffer=replay_buffer,
max_path_length=max_path_length,
**variant['algo_kwargs']
)
if variant.get("save_video", True):
        rollout_function = rf.create_rollout_function(
            rf.multitask_rollout,
            max_path_length=max_path_length,
            observation_key=observation_key,
            desired_goal_key=desired_goal_key,
        )
        video_func = get_video_save_func(
            rollout_function,
            env,
            MakeDeterministic(policy),
            variant,
        )
algorithm.post_train_funcs.append(video_func)
algorithm.to(ptu.device)
vae.to(ptu.device)
algorithm.train()
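The `ObsDictRelabelingBuffer`/`HERTrainer` wiring above implements hindsight experience replay: stored transitions are later relabeled with alternative goals drawn from the same trajectory. A toy sketch of the "future"-goal relabeling idea (not rlkit's implementation; `relabel_with_future_goals` and `reward_fn` are made-up names):

```python
import numpy as np

def relabel_with_future_goals(path, reward_fn, k=4, rng=None):
    """Toy 'future'-strategy HER relabeling: for each transition, draw k
    goals from achieved goals later in the same trajectory and recompute
    the reward under each relabeled goal."""
    rng = rng or np.random.default_rng(0)
    T = len(path["achieved_goals"])
    relabeled = []
    for t in range(T):
        for _ in range(k):
            future_t = int(rng.integers(t, T))  # a step at or after t
            goal = path["achieved_goals"][future_t]
            reward = reward_fn(path["achieved_goals"][t], goal)
            relabeled.append((path["observations"][t], goal, reward))
    return relabeled
```

With a distance-based reward, relabeled transitions near the sampled goal get high (near-zero) reward, which is what makes sparse goal-reaching learnable.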
def grill_tdm_td3_experiment(variant):
    """Train TDM-TD3 on a goal-conditioned (optionally VAE-wrapped) env described by `variant`."""
import rlkit.samplers.rollout_functions as rf
import rlkit.torch.pytorch_util as ptu
from rlkit.core import logger
from rlkit.data_management.obs_dict_replay_buffer import \
ObsDictRelabelingBuffer
from rlkit.exploration_strategies.base import (
PolicyWrappedWithExplorationStrategy
)
from rlkit.state_distance.tdm_networks import TdmQf, TdmPolicy
from rlkit.state_distance.tdm_td3 import TdmTd3
grill_preprocess_variant(variant)
env = get_envs(variant)
es = get_exploration_strategy(variant, env)
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
)
goal_dim = (
env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
vectorized = 'vectorized' in env.reward_type
norm_order = env.norm_order
variant['algo_kwargs']['tdm_kwargs']['vectorized'] = vectorized
variant['qf_kwargs']['vectorized'] = vectorized
variant['qf_kwargs']['norm_order'] = norm_order
qf1 = TdmQf(
env=env,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['qf_kwargs']
)
qf2 = TdmQf(
env=env,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['qf_kwargs']
)
policy = TdmPolicy(
env=env,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['policy_kwargs']
)
exploration_policy = PolicyWrappedWithExplorationStrategy(
exploration_strategy=es,
policy=policy,
)
variant['replay_buffer_kwargs']['vectorized'] = vectorized
replay_buffer = ObsDictRelabelingBuffer(
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
algo_kwargs = variant['algo_kwargs']
algo_kwargs['replay_buffer'] = replay_buffer
base_kwargs = algo_kwargs['base_kwargs']
base_kwargs['training_env'] = env
base_kwargs['render'] = variant["render"]
base_kwargs['render_during_eval'] = variant["render"]
tdm_kwargs = algo_kwargs['tdm_kwargs']
tdm_kwargs['observation_key'] = observation_key
tdm_kwargs['desired_goal_key'] = desired_goal_key
algorithm = TdmTd3(
env,
qf1=qf1,
qf2=qf2,
policy=policy,
exploration_policy=exploration_policy,
**variant['algo_kwargs']
)
algorithm.to(ptu.device)
if not variant.get("do_state_exp", False):
env.vae.to(ptu.device)
if variant.get("save_video", True):
policy.train(False)
rollout_function = rf.create_rollout_function(
rf.tdm_rollout,
init_tau=algorithm.max_tau,
max_path_length=algorithm.max_path_length,
observation_key=algorithm.observation_key,
desired_goal_key=algorithm.desired_goal_key,
)
video_func = get_video_save_func(
rollout_function,
env,
policy,
variant,
)
algorithm.post_train_funcs.append(video_func)
algorithm.train()
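The `TdmQf` networks built above differ from ordinary Q-functions in that they also condition on the goal and on the remaining horizon tau. A minimal sketch of the flattened input such a network consumes (`tdm_qf_input` is a hypothetical helper, not rlkit API):

```python
import numpy as np

def tdm_qf_input(obs, action, goal, tau):
    # Toy illustration (not rlkit's TdmQf): a temporal-difference-model
    # Q-function scores (observation, action) pairs conditioned on both
    # the goal and the remaining horizon tau, so all four are flattened
    # into one input vector.
    return np.concatenate([obs, action, goal, np.array([float(tau)])])
```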
def grill_her_td3_experiment_online_vae_exploring(variant):
    """Jointly train a control TD3 agent and a VAE-exploration TD3 agent with HER while the VAE trains online."""
import rlkit.samplers.rollout_functions as rf
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.online_vae_replay_buffer import \
OnlineVaeRelabelingBuffer
from rlkit.exploration_strategies.base import (
PolicyWrappedWithExplorationStrategy
)
from rlkit.torch.her.online_vae_joint_algo import OnlineVaeHerJointAlgo
from rlkit.torch.networks import ConcatMlp, TanhMlpPolicy
from rlkit.torch.td3.td3 import TD3
from rlkit.torch.vae.vae_trainer import ConvVAETrainer
grill_preprocess_variant(variant)
env = get_envs(variant)
es = get_exploration_strategy(variant, env)
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
+ env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs'],
)
qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs'],
)
policy = TanhMlpPolicy(
input_size=obs_dim,
output_size=action_dim,
**variant['policy_kwargs'],
)
exploration_policy = PolicyWrappedWithExplorationStrategy(
exploration_strategy=es,
policy=policy,
)
exploring_qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs'],
)
exploring_qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
**variant['qf_kwargs'],
)
exploring_policy = TanhMlpPolicy(
input_size=obs_dim,
output_size=action_dim,
**variant['policy_kwargs'],
)
exploring_exploration_policy = PolicyWrappedWithExplorationStrategy(
exploration_strategy=es,
policy=exploring_policy,
)
vae = env.vae
replay_buffer = OnlineVaeRelabelingBuffer(
vae=vae,
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
variant["algo_kwargs"]["replay_buffer"] = replay_buffer
if variant.get('use_replay_buffer_goals', False):
env.replay_buffer = replay_buffer
env.use_replay_buffer_goals = True
vae_trainer_kwargs = variant.get('vae_trainer_kwargs')
    t = ConvVAETrainer(
        variant['vae_train_data'],
        variant['vae_test_data'],
        vae,
        beta=variant['online_vae_beta'],
        **vae_trainer_kwargs,
    )
control_algorithm = TD3(
env=env,
training_env=env,
qf1=qf1,
qf2=qf2,
policy=policy,
exploration_policy=exploration_policy,
**variant['algo_kwargs']
)
exploring_algorithm = TD3(
env=env,
training_env=env,
qf1=exploring_qf1,
qf2=exploring_qf2,
policy=exploring_policy,
exploration_policy=exploring_exploration_policy,
**variant['algo_kwargs']
)
assert 'vae_training_schedule' not in variant,\
"Just put it in joint_algo_kwargs"
algorithm = OnlineVaeHerJointAlgo(
vae=vae,
vae_trainer=t,
env=env,
training_env=env,
policy=policy,
exploration_policy=exploration_policy,
replay_buffer=replay_buffer,
algo1=control_algorithm,
algo2=exploring_algorithm,
algo1_prefix="Control_",
algo2_prefix="VAE_Exploration_",
observation_key=observation_key,
desired_goal_key=desired_goal_key,
**variant['joint_algo_kwargs']
)
algorithm.to(ptu.device)
vae.to(ptu.device)
if variant.get("save_video", True):
policy.train(False)
rollout_function = rf.create_rollout_function(
rf.multitask_rollout,
max_path_length=algorithm.max_path_length,
observation_key=algorithm.observation_key,
desired_goal_key=algorithm.desired_goal_key,
)
video_func = get_video_save_func(
rollout_function,
env,
algorithm.eval_policy,
variant,
)
algorithm.post_train_funcs.append(video_func)
algorithm.train()
def grill_her_twin_sac_experiment_online_vae(variant):
    """Train HER + twin SAC while the env's VAE is trained online on replay data."""
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.online_vae_replay_buffer import \
OnlineVaeRelabelingBuffer
from rlkit.torch.networks import ConcatMlp
from rlkit.torch.sac.policies import TanhGaussianPolicy
from rlkit.torch.vae.vae_trainer import ConvVAETrainer
grill_preprocess_variant(variant)
env = get_envs(variant)
uniform_dataset_fn = variant.get('generate_uniform_dataset_fn', None)
if uniform_dataset_fn:
        uniform_dataset = uniform_dataset_fn(
            **variant['generate_uniform_dataset_kwargs']
        )
    else:
        uniform_dataset = None
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
+ env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
hidden_sizes = variant.get('hidden_sizes', [400, 300])
qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
target_qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
target_qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
policy = TanhGaussianPolicy(
obs_dim=obs_dim,
action_dim=action_dim,
hidden_sizes=hidden_sizes,
)
vae = env.vae
replay_buffer_class = variant.get("replay_buffer_class", OnlineVaeRelabelingBuffer)
replay_buffer = replay_buffer_class(
vae=env.vae,
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
vae_trainer_class = variant.get("vae_trainer_class", ConvVAETrainer)
vae_trainer = vae_trainer_class(
env.vae,
**variant['online_vae_trainer_kwargs']
)
assert 'vae_training_schedule' not in variant, "Just put it in algo_kwargs"
max_path_length = variant['max_path_length']
trainer = SACTrainer(
env=env,
policy=policy,
qf1=qf1,
qf2=qf2,
target_qf1=target_qf1,
target_qf2=target_qf2,
**variant['twin_sac_trainer_kwargs']
)
trainer = HERTrainer(trainer)
eval_path_collector = VAEWrappedEnvPathCollector(
env,
MakeDeterministic(policy),
max_path_length,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
goal_sampling_mode=variant['evaluation_goal_sampling_mode'],
)
expl_path_collector = VAEWrappedEnvPathCollector(
env,
policy,
max_path_length,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
goal_sampling_mode=variant['exploration_goal_sampling_mode'],
)
algorithm = OnlineVaeAlgorithm(
trainer=trainer,
exploration_env=env,
evaluation_env=env,
exploration_data_collector=expl_path_collector,
evaluation_data_collector=eval_path_collector,
replay_buffer=replay_buffer,
vae=vae,
vae_trainer=vae_trainer,
uniform_dataset=uniform_dataset,
max_path_length=max_path_length,
**variant['algo_kwargs']
)
if variant.get("save_video", True):
video_func = VideoSaveFunction(
env,
variant,
)
algorithm.post_train_funcs.append(video_func)
    if variant.get('custom_goal_sampler') == 'replay_buffer':
env.custom_goal_sampler = replay_buffer.sample_buffer_goals
algorithm.to(ptu.device)
vae.to(ptu.device)
algorithm.train()
def grill_her_td3_experiment_online_vae(variant):
    """Train HER + TD3 while the env's VAE is trained online on replay data."""
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.online_vae_replay_buffer import \
OnlineVaeRelabelingBuffer
from rlkit.torch.networks import ConcatMlp, TanhMlpPolicy
from rlkit.torch.vae.vae_trainer import ConvVAETrainer
from rlkit.torch.td3.td3 import TD3
from rlkit.exploration_strategies.base import (
PolicyWrappedWithExplorationStrategy
)
from rlkit.exploration_strategies.gaussian_and_epislon import \
GaussianAndEpislonStrategy
grill_preprocess_variant(variant)
env = get_envs(variant)
uniform_dataset_fn = variant.get('generate_uniform_dataset_fn', None)
if uniform_dataset_fn:
        uniform_dataset = uniform_dataset_fn(
            **variant['generate_uniform_dataset_kwargs']
        )
    else:
        uniform_dataset = None
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
+ env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
hidden_sizes = variant.get('hidden_sizes', [400, 300])
qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
target_qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
target_qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
policy = TanhMlpPolicy(
input_size=obs_dim,
output_size=action_dim,
hidden_sizes=hidden_sizes,
)
target_policy = TanhMlpPolicy(
input_size=obs_dim,
output_size=action_dim,
hidden_sizes=hidden_sizes,
)
es = GaussianAndEpislonStrategy(
action_space=env.action_space,
max_sigma=.2,
min_sigma=.2, # constant sigma
epsilon=.3,
)
expl_policy = PolicyWrappedWithExplorationStrategy(
exploration_strategy=es,
policy=policy,
)
vae = env.vae
replay_buffer_class = variant.get("replay_buffer_class", OnlineVaeRelabelingBuffer)
replay_buffer = replay_buffer_class(
vae=env.vae,
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
vae_trainer_class = variant.get("vae_trainer_class", ConvVAETrainer)
vae_trainer = vae_trainer_class(
env.vae,
**variant['online_vae_trainer_kwargs']
)
assert 'vae_training_schedule' not in variant, "Just put it in algo_kwargs"
max_path_length = variant['max_path_length']
trainer = TD3(
policy=policy,
qf1=qf1,
qf2=qf2,
target_qf1=target_qf1,
target_qf2=target_qf2,
target_policy=target_policy,
**variant['td3_trainer_kwargs']
)
trainer = HERTrainer(trainer)
eval_path_collector = VAEWrappedEnvPathCollector(
env,
policy,
max_path_length,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
goal_sampling_mode=variant['evaluation_goal_sampling_mode'],
)
expl_path_collector = VAEWrappedEnvPathCollector(
env,
expl_policy,
max_path_length,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
goal_sampling_mode=variant['exploration_goal_sampling_mode'],
)
algorithm = OnlineVaeAlgorithm(
trainer=trainer,
exploration_env=env,
evaluation_env=env,
exploration_data_collector=expl_path_collector,
evaluation_data_collector=eval_path_collector,
replay_buffer=replay_buffer,
vae=vae,
vae_trainer=vae_trainer,
uniform_dataset=uniform_dataset,
max_path_length=max_path_length,
**variant['algo_kwargs']
)
if variant.get("save_video", True):
video_func = VideoSaveFunction(
env,
variant,
)
algorithm.post_train_funcs.append(video_func)
    if variant.get('custom_goal_sampler') == 'replay_buffer':
env.custom_goal_sampler = replay_buffer.sample_buffer_goals
algorithm.to(ptu.device)
vae.to(ptu.device)
algorithm.train()
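The `GaussianAndEpislonStrategy` constructed above (max_sigma == min_sigma == 0.2, epsilon == 0.3) mixes two kinds of exploration. A toy sketch of that action rule, assuming box-bounded actions (`gaussian_and_epsilon_action` is a made-up helper, not rlkit's class):

```python
import numpy as np

def gaussian_and_epsilon_action(policy_action, low, high,
                                max_sigma=0.2, epsilon=0.3, rng=None):
    """Toy version of the Gaussian-and-epsilon exploration used above:
    with probability epsilon take a uniform random action, otherwise add
    N(0, max_sigma^2) noise to the policy's action and clip to bounds."""
    rng = rng or np.random.default_rng(0)
    if rng.random() < epsilon:
        return rng.uniform(low, high, size=policy_action.shape)
    noisy = policy_action + rng.normal(0.0, max_sigma,
                                       size=policy_action.shape)
    return np.clip(noisy, low, high)
```

Because min_sigma equals max_sigma in the launcher, the Gaussian noise scale stays constant over training rather than annealing.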
def grill_her_td3_experiment_offpolicy_online_vae(variant):
    """HER + TD3 with an online VAE, using the off-policy algorithm (includes a pretrain phase)."""
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.online_vae_replay_buffer import \
OnlineVaeRelabelingBuffer
from rlkit.torch.networks import ConcatMlp, TanhMlpPolicy
from rlkit.torch.vae.vae_trainer import ConvVAETrainer
from rlkit.torch.td3.td3 import TD3
from rlkit.exploration_strategies.base import (
PolicyWrappedWithExplorationStrategy
)
from rlkit.exploration_strategies.gaussian_and_epislon import \
GaussianAndEpislonStrategy
from rlkit.torch.vae.online_vae_offpolicy_algorithm import OnlineVaeOffpolicyAlgorithm
grill_preprocess_variant(variant)
env = get_envs(variant)
uniform_dataset_fn = variant.get('generate_uniform_dataset_fn', None)
if uniform_dataset_fn:
        uniform_dataset = uniform_dataset_fn(
            **variant['generate_uniform_dataset_kwargs']
        )
    else:
        uniform_dataset = None
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
+ env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
hidden_sizes = variant.get('hidden_sizes', [400, 300])
qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
target_qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
target_qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
policy = TanhMlpPolicy(
input_size=obs_dim,
output_size=action_dim,
hidden_sizes=hidden_sizes,
)
target_policy = TanhMlpPolicy(
input_size=obs_dim,
output_size=action_dim,
hidden_sizes=hidden_sizes,
)
es = GaussianAndEpislonStrategy(
action_space=env.action_space,
max_sigma=.2,
min_sigma=.2, # constant sigma
epsilon=.3,
)
expl_policy = PolicyWrappedWithExplorationStrategy(
exploration_strategy=es,
policy=policy,
)
vae = env.vae
replay_buffer_class = variant.get("replay_buffer_class", OnlineVaeRelabelingBuffer)
replay_buffer = replay_buffer_class(
vae=env.vae,
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
replay_buffer.representation_size = vae.representation_size
vae_trainer_class = variant.get("vae_trainer_class", ConvVAETrainer)
vae_trainer = vae_trainer_class(
env.vae,
**variant['online_vae_trainer_kwargs']
)
assert 'vae_training_schedule' not in variant, "Just put it in algo_kwargs"
max_path_length = variant['max_path_length']
trainer = TD3(
policy=policy,
qf1=qf1,
qf2=qf2,
target_qf1=target_qf1,
target_qf2=target_qf2,
target_policy=target_policy,
**variant['td3_trainer_kwargs']
)
trainer = HERTrainer(trainer)
eval_path_collector = VAEWrappedEnvPathCollector(
env,
policy,
max_path_length,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
goal_sampling_mode=variant['evaluation_goal_sampling_mode'],
)
expl_path_collector = VAEWrappedEnvPathCollector(
env,
expl_policy,
max_path_length,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
goal_sampling_mode=variant['exploration_goal_sampling_mode'],
)
algorithm = OnlineVaeOffpolicyAlgorithm(
trainer=trainer,
exploration_env=env,
evaluation_env=env,
exploration_data_collector=expl_path_collector,
evaluation_data_collector=eval_path_collector,
replay_buffer=replay_buffer,
vae=vae,
vae_trainer=vae_trainer,
uniform_dataset=uniform_dataset,
max_path_length=max_path_length,
**variant['algo_kwargs']
)
if variant.get("save_video", True):
video_func = VideoSaveFunction(
env,
variant,
)
algorithm.post_train_funcs.append(video_func)
    if variant.get('custom_goal_sampler') == 'replay_buffer':
env.custom_goal_sampler = replay_buffer.sample_buffer_goals
algorithm.to(ptu.device)
vae.to(ptu.device)
algorithm.pretrain()
algorithm.train()
def active_representation_learning_experiment(variant):
    """Train SAC on an image env wrapped with a representation model that is learned jointly with the policy."""
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.obs_dict_replay_buffer import ObsDictReplayBuffer
from rlkit.torch.networks import ConcatMlp
from rlkit.torch.sac.policies import TanhGaussianPolicy
from rlkit.torch.arl.active_representation_learning_algorithm import \
ActiveRepresentationLearningAlgorithm
from rlkit.torch.arl.representation_wrappers import RepresentationWrappedEnv
from multiworld.core.image_env import ImageEnv
from rlkit.samplers.data_collector import MdpPathCollector
grill_preprocess_variant(variant)
model_class = variant.get('model_class')
model_kwargs = variant.get('model_kwargs')
model = model_class(**model_kwargs)
    model.representation_size = 4  # hard-coded latent dimension for this experiment
    model.imsize = 48  # hard-coded image size for this experiment
variant["vae_path"] = model
reward_params = variant.get("reward_params", dict())
init_camera = variant.get("init_camera", None)
env = variant["env_class"](**variant['env_kwargs'])
image_env = ImageEnv(
env,
variant.get('imsize'),
init_camera=init_camera,
transpose=True,
normalize=True,
)
env = RepresentationWrappedEnv(
image_env,
model,
)
uniform_dataset_fn = variant.get('generate_uniform_dataset_fn', None)
if uniform_dataset_fn:
        uniform_dataset = uniform_dataset_fn(
            **variant['generate_uniform_dataset_kwargs']
        )
    else:
        uniform_dataset = None
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = env.observation_space.spaces[observation_key].low.size
action_dim = env.action_space.low.size
hidden_sizes = variant.get('hidden_sizes', [400, 300])
qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
target_qf1 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
target_qf2 = ConcatMlp(
input_size=obs_dim + action_dim,
output_size=1,
hidden_sizes=hidden_sizes,
)
policy = TanhGaussianPolicy(
obs_dim=obs_dim,
action_dim=action_dim,
hidden_sizes=hidden_sizes,
)
vae = env.vae
replay_buffer = ObsDictReplayBuffer(
env=env,
**variant['replay_buffer_kwargs']
)
model_trainer_class = variant.get('model_trainer_class')
model_trainer_kwargs = variant.get('model_trainer_kwargs')
model_trainer = model_trainer_class(
model,
**model_trainer_kwargs,
)
assert 'vae_training_schedule' not in variant, "Just put it in algo_kwargs"
max_path_length = variant['max_path_length']
trainer = SACTrainer(
env=env,
policy=policy,
qf1=qf1,
qf2=qf2,
target_qf1=target_qf1,
target_qf2=target_qf2,
**variant['twin_sac_trainer_kwargs']
)
eval_path_collector = MdpPathCollector(
env,
MakeDeterministic(policy),
)
expl_path_collector = MdpPathCollector(
env,
policy,
)
algorithm = ActiveRepresentationLearningAlgorithm(
trainer=trainer,
exploration_env=env,
evaluation_env=env,
exploration_data_collector=expl_path_collector,
evaluation_data_collector=eval_path_collector,
replay_buffer=replay_buffer,
model=model,
model_trainer=model_trainer,
uniform_dataset=uniform_dataset,
max_path_length=max_path_length,
**variant['algo_kwargs']
)
algorithm.to(ptu.device)
vae.to(ptu.device)
algorithm.train()
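`RepresentationWrappedEnv` above hands the agent a learned latent instead of the raw image. A toy sketch of that wrapping, with a random projection standing in for the trained model (`ToyRepresentationWrapper` is hypothetical, not the rlkit class):

```python
import numpy as np

class ToyRepresentationWrapper:
    """Toy analogue of RepresentationWrappedEnv: raw (flattened) image
    observations are encoded into a latent vector before the agent sees
    them. A fixed random projection stands in for the trained model."""

    def __init__(self, encoder_weights):
        self.encoder_weights = encoder_weights  # (latent_dim, obs_dim)
        self.representation_size = encoder_weights.shape[0]

    def observation(self, flat_image_obs):
        # Encode the raw observation into the latent space.
        z = self.encoder_weights @ flat_image_obs
        assert z.shape == (self.representation_size,)
        return z
```

In the real algorithm the encoder is trained jointly (by `model_trainer`), so the latent space the policy sees keeps improving during training.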
def grill_tdm_td3_experiment_online_vae(variant):
    """Train TDM-TD3 while the env's VAE is trained online."""
import rlkit.samplers.rollout_functions as rf
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.online_vae_replay_buffer import \
OnlineVaeRelabelingBuffer
from rlkit.exploration_strategies.base import (
PolicyWrappedWithExplorationStrategy
)
from rlkit.state_distance.tdm_networks import TdmQf, TdmPolicy
from rlkit.torch.vae.vae_trainer import ConvVAETrainer
from rlkit.torch.online_vae.online_vae_tdm_td3 import OnlineVaeTdmTd3
grill_preprocess_variant(variant)
env = get_envs(variant)
es = get_exploration_strategy(variant, env)
vae_trainer_kwargs = variant.get('vae_trainer_kwargs')
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
)
goal_dim = (
env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
vectorized = 'vectorized' in env.reward_type
variant['algo_kwargs']['tdm_td3_kwargs']['tdm_kwargs'][
'vectorized'] = vectorized
norm_order = env.norm_order
qf1 = TdmQf(
env=env,
vectorized=vectorized,
norm_order=norm_order,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['qf_kwargs']
)
qf2 = TdmQf(
env=env,
vectorized=vectorized,
norm_order=norm_order,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['qf_kwargs']
)
policy = TdmPolicy(
env=env,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['policy_kwargs']
)
exploration_policy = PolicyWrappedWithExplorationStrategy(
exploration_strategy=es,
policy=policy,
)
vae = env.vae
replay_buffer = OnlineVaeRelabelingBuffer(
vae=vae,
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
algo_kwargs = variant['algo_kwargs']['tdm_td3_kwargs']
td3_kwargs = algo_kwargs['td3_kwargs']
td3_kwargs['training_env'] = env
tdm_kwargs = algo_kwargs['tdm_kwargs']
tdm_kwargs['observation_key'] = observation_key
tdm_kwargs['desired_goal_key'] = desired_goal_key
algo_kwargs["replay_buffer"] = replay_buffer
    t = ConvVAETrainer(
        variant['vae_train_data'],
        variant['vae_test_data'],
        vae,
        beta=variant['online_vae_beta'],
        **vae_trainer_kwargs,
    )
assert 'vae_training_schedule' not in variant, "Just put it in algo_kwargs"
algorithm = OnlineVaeTdmTd3(
online_vae_kwargs=dict(
vae=vae,
vae_trainer=t,
**variant['algo_kwargs']['online_vae_kwargs']
),
tdm_td3_kwargs=dict(
env=env,
qf1=qf1,
qf2=qf2,
policy=policy,
exploration_policy=exploration_policy,
**variant['algo_kwargs']['tdm_td3_kwargs']
),
)
algorithm.to(ptu.device)
vae.to(ptu.device)
if variant.get("save_video", True):
policy.train(False)
rollout_function = rf.create_rollout_function(
rf.tdm_rollout,
init_tau=algorithm._sample_max_tau_for_rollout(),
decrement_tau=algorithm.cycle_taus_for_rollout,
cycle_tau=algorithm.cycle_taus_for_rollout,
max_path_length=algorithm.max_path_length,
observation_key=algorithm.observation_key,
desired_goal_key=algorithm.desired_goal_key,
)
video_func = get_video_save_func(
rollout_function,
env,
algorithm.eval_policy,
variant,
)
algorithm.post_train_funcs.append(video_func)
algorithm.to(ptu.device)
if not variant.get("do_state_exp", False):
env.vae.to(ptu.device)
algorithm.train()
def grill_tdm_twin_sac_experiment(variant):
    """Train TDM twin-SAC on a goal-conditioned (optionally VAE-wrapped) env."""
import rlkit.samplers.rollout_functions as rf
import rlkit.torch.pytorch_util as ptu
from rlkit.data_management.obs_dict_replay_buffer import \
ObsDictRelabelingBuffer
from rlkit.state_distance.tdm_networks import (
TdmQf, TdmVf,
StochasticTdmPolicy,
)
from rlkit.state_distance.tdm_twin_sac import TdmTwinSAC
grill_preprocess_variant(variant)
env = get_envs(variant)
observation_key = variant.get('observation_key', 'latent_observation')
desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
achieved_goal_key = desired_goal_key.replace("desired", "achieved")
obs_dim = (
env.observation_space.spaces[observation_key].low.size
)
goal_dim = (
env.observation_space.spaces[desired_goal_key].low.size
)
action_dim = env.action_space.low.size
vectorized = 'vectorized' in env.reward_type
norm_order = env.norm_order
variant['algo_kwargs']['tdm_kwargs']['vectorized'] = vectorized
variant['qf_kwargs']['vectorized'] = vectorized
variant['vf_kwargs']['vectorized'] = vectorized
variant['qf_kwargs']['norm_order'] = norm_order
variant['vf_kwargs']['norm_order'] = norm_order
qf1 = TdmQf(
env=env,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['qf_kwargs']
)
qf2 = TdmQf(
env=env,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['qf_kwargs']
)
vf = TdmVf(
env=env,
observation_dim=obs_dim,
goal_dim=goal_dim,
**variant['vf_kwargs']
)
policy = StochasticTdmPolicy(
env=env,
observation_dim=obs_dim,
goal_dim=goal_dim,
action_dim=action_dim,
**variant['policy_kwargs']
)
variant['replay_buffer_kwargs']['vectorized'] = vectorized
replay_buffer = ObsDictRelabelingBuffer(
env=env,
observation_key=observation_key,
desired_goal_key=desired_goal_key,
achieved_goal_key=achieved_goal_key,
**variant['replay_buffer_kwargs']
)
algo_kwargs = variant['algo_kwargs']
algo_kwargs['replay_buffer'] = replay_buffer
base_kwargs = algo_kwargs['base_kwargs']
base_kwargs['training_env'] = env
base_kwargs['render'] = variant["render"]
base_kwargs['render_during_eval'] = variant["render"]
tdm_kwargs = algo_kwargs['tdm_kwargs']
tdm_kwargs['observation_key'] = observation_key
tdm_kwargs['desired_goal_key'] = desired_goal_key
algorithm = TdmTwinSAC(
env,
qf1=qf1,
qf2=qf2,
vf=vf,
policy=policy,
**variant['algo_kwargs']
)
if variant.get("save_video", True):
rollout_function = rf.create_rollout_function(
rf.tdm_rollout,
init_tau=algorithm._sample_max_tau_for_rollout(),
decrement_tau=algorithm.cycle_taus_for_rollout,
cycle_tau=algorithm.cycle_taus_for_rollout,
max_path_length=algorithm.max_path_length,
observation_key=algorithm.observation_key,
desired_goal_key=algorithm.desired_goal_key,
)
video_func = get_video_save_func(
rollout_function,
env,
algorithm.eval_policy,
variant,
)
algorithm.post_train_funcs.append(video_func)
algorithm.to(ptu.device)
if not variant.get("do_state_exp", False):
env.vae.to(ptu.device)
algorithm.train()


def grill_her_td3_experiment_online_vae_exploring(variant):
    import rlkit.samplers.rollout_functions as rf
    import rlkit.torch.pytorch_util as ptu
    from rlkit.data_management.online_vae_replay_buffer import \
        OnlineVaeRelabelingBuffer
    from rlkit.exploration_strategies.base import (
        PolicyWrappedWithExplorationStrategy
    )
    from rlkit.torch.her.online_vae_joint_algo import OnlineVaeHerJointAlgo
    from rlkit.torch.networks import ConcatMlp, TanhMlpPolicy
    from rlkit.torch.td3.td3 import TD3
    from rlkit.torch.vae.vae_trainer import ConvVAETrainer

    grill_preprocess_variant(variant)
    env = get_envs(variant)
    es = get_exploration_strategy(variant, env)
    observation_key = variant.get('observation_key', 'latent_observation')
    desired_goal_key = variant.get('desired_goal_key', 'latent_desired_goal')
    achieved_goal_key = desired_goal_key.replace("desired", "achieved")
    obs_dim = (
        env.observation_space.spaces[observation_key].low.size
        + env.observation_space.spaces[desired_goal_key].low.size
    )
    action_dim = env.action_space.low.size
    qf1 = ConcatMlp(
        input_size=obs_dim + action_dim,
        output_size=1,
        **variant['qf_kwargs'],
    )
    qf2 = ConcatMlp(
        input_size=obs_dim + action_dim,
        output_size=1,
        **variant['qf_kwargs'],
    )
    policy = TanhMlpPolicy(
        input_size=obs_dim,
        output_size=action_dim,
        **variant['policy_kwargs'],
    )
    exploration_policy = PolicyWrappedWithExplorationStrategy(
        exploration_strategy=es,
        policy=policy,
    )
    exploring_qf1 = ConcatMlp(
        input_size=obs_dim + action_dim,
        output_size=1,
        **variant['qf_kwargs'],
    )
    exploring_qf2 = ConcatMlp(
        input_size=obs_dim + action_dim,
        output_size=1,
        **variant['qf_kwargs'],
    )
    exploring_policy = TanhMlpPolicy(
        input_size=obs_dim,
        output_size=action_dim,
        **variant['policy_kwargs'],
    )
    exploring_exploration_policy = PolicyWrappedWithExplorationStrategy(
        exploration_strategy=es,
        policy=exploring_policy,
    )
    vae = env.vae
    replay_buffer = OnlineVaeRelabelingBuffer(
        vae=vae,
        env=env,
        observation_key=observation_key,
        desired_goal_key=desired_goal_key,
        achieved_goal_key=achieved_goal_key,
        **variant['replay_buffer_kwargs']
    )
    variant["algo_kwargs"]["replay_buffer"] = replay_buffer
    if variant.get('use_replay_buffer_goals', False):
        env.replay_buffer = replay_buffer
        env.use_replay_buffer_goals = True
    vae_trainer_kwargs = variant.get('vae_trainer_kwargs')
    t = ConvVAETrainer(variant['vae_train_data'],
                       variant['vae_test_data'],
                       vae,
                       beta=variant['online_vae_beta'],
                       **vae_trainer_kwargs)
    control_algorithm = TD3(
        env=env,
        training_env=env,
        qf1=qf1,
        qf2=qf2,
        policy=policy,
        exploration_policy=exploration_policy,
        **variant['algo_kwargs']
    )
    exploring_algorithm = TD3(
        env=env,
        training_env=env,
        qf1=exploring_qf1,
        qf2=exploring_qf2,
        policy=exploring_policy,
        exploration_policy=exploring_exploration_policy,
        **variant['algo_kwargs']
    )
    assert 'vae_training_schedule' not in variant, \
        "Just put it in joint_algo_kwargs"
    algorithm = OnlineVaeHerJointAlgo(
        vae=vae,
        vae_trainer=t,
        env=env,
        training_env=env,
        policy=policy,
        exploration_policy=exploration_policy,
        replay_buffer=replay_buffer,
        algo1=control_algorithm,
        algo2=exploring_algorithm,
        algo1_prefix="Control_",
        algo2_prefix="VAE_Exploration_",
        observation_key=observation_key,
        desired_goal_key=desired_goal_key,
        **variant['joint_algo_kwargs']
    )
    algorithm.to(ptu.device)
    vae.to(ptu.device)
    if variant.get("save_video", True):
        policy.train(False)
        rollout_function = rf.create_rollout_function(
            rf.multitask_rollout,
            max_path_length=algorithm.max_path_length,
            observation_key=algorithm.observation_key,
            desired_goal_key=algorithm.desired_goal_key,
        )
        video_func = get_video_save_func(
            rollout_function,
            env,
            algorithm.eval_policy,
            variant,
        )
        algorithm.post_train_funcs.append(video_func)
    algorithm.train()
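Launchers like the one above are driven entirely by a nested `variant` dict that is splatted into the network, buffer, and algorithm constructors. The sketch below illustrates the *shape* such a dict takes; every concrete value is an assumption for demonstration, not a tested configuration.

```python
# Illustrative sketch of a `variant` dict for the launcher above.
# All hyperparameter values here are hypothetical placeholders.
variant = {
    "observation_key": "latent_observation",
    "desired_goal_key": "latent_desired_goal",
    "qf_kwargs": {"hidden_sizes": [400, 300]},
    "policy_kwargs": {"hidden_sizes": [400, 300]},
    "replay_buffer_kwargs": {"max_size": int(1e5)},
    "vae_trainer_kwargs": {"lr": 1e-3},
    "online_vae_beta": 2.5,
    "algo_kwargs": {"num_epochs": 100},
    # Note: 'vae_training_schedule' must live here, not at the top level,
    # or the launcher's assert fires.
    "joint_algo_kwargs": {"vae_training_schedule": None},
    "save_video": False,
}

# The achieved-goal key is always derived from the desired-goal key:
achieved_goal_key = variant["desired_goal_key"].replace("desired", "achieved")
```

This mirrors the key-derivation and assert logic the launcher performs before constructing any objects.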


def HER_baseline_her_td3_experiment(variant):
    import rlkit.torch.pytorch_util as ptu
    from rlkit.data_management.obs_dict_replay_buffer import \
        ObsDictRelabelingBuffer
    from rlkit.exploration_strategies.base import (
        PolicyWrappedWithExplorationStrategy
    )
    from rlkit.torch.her.her_td3 import HerTd3
    from rlkit.torch.networks import MergedCNN, CNNPolicy
    import torch
    from multiworld.core.image_env import ImageEnv
    from rlkit.misc.asset_loader import load_local_or_remote_file

    init_camera = variant.get("init_camera", None)
    presample_goals = variant.get('presample_goals', False)
    presampled_goals_path = get_presampled_goals_path(
        variant.get('presampled_goals_path', None))
    if 'env_id' in variant:
        import gym
        import multiworld
        multiworld.register_all_envs()
        env = gym.make(variant['env_id'])
    else:
        env = variant["env_class"](**variant['env_kwargs'])
    image_env = ImageEnv(
        env,
        variant.get('imsize'),
        reward_type='image_sparse',
        init_camera=init_camera,
        transpose=True,
        normalize=True,
    )
    if presample_goals:
        if presampled_goals_path is None:
            image_env.non_presampled_goal_img_is_garbage = True
            presampled_goals = variant['generate_goal_dataset_fctn'](
                env=image_env,
                **variant['goal_generation_kwargs']
            )
        else:
            presampled_goals = load_local_or_remote_file(
                presampled_goals_path
            ).item()
        del image_env
        env = ImageEnv(
            env,
            variant.get('imsize'),
            reward_type='image_distance',
            init_camera=init_camera,
            transpose=True,
            normalize=True,
            presampled_goals=presampled_goals,
        )
    else:
        env = image_env
    es = get_exploration_strategy(variant, env)
    observation_key = variant.get('observation_key', 'image_observation')
    desired_goal_key = variant.get('desired_goal_key', 'image_desired_goal')
    achieved_goal_key = desired_goal_key.replace("desired", "achieved")
    imsize = variant['imsize']
    action_dim = env.action_space.low.size
    qf1 = MergedCNN(input_width=imsize,
                    input_height=imsize,
                    output_size=1,
                    input_channels=3 * 2,
                    added_fc_input_size=action_dim,
                    **variant['cnn_params']
                    )
    qf2 = MergedCNN(input_width=imsize,
                    input_height=imsize,
                    output_size=1,
                    input_channels=3 * 2,
                    added_fc_input_size=action_dim,
                    **variant['cnn_params']
                    )
    policy = CNNPolicy(input_width=imsize,
                       input_height=imsize,
                       added_fc_input_size=0,
                       output_size=action_dim,
                       input_channels=3 * 2,
                       output_activation=torch.tanh,
                       **variant['cnn_params'],
                       )
    target_qf1 = MergedCNN(input_width=imsize,
                           input_height=imsize,
                           output_size=1,
                           input_channels=3 * 2,
                           added_fc_input_size=action_dim,
                           **variant['cnn_params']
                           )
    target_qf2 = MergedCNN(input_width=imsize,
                           input_height=imsize,
                           output_size=1,
                           input_channels=3 * 2,
                           added_fc_input_size=action_dim,
                           **variant['cnn_params']
                           )
    target_policy = CNNPolicy(input_width=imsize,
                              input_height=imsize,
                              added_fc_input_size=0,
                              output_size=action_dim,
                              input_channels=3 * 2,
                              output_activation=torch.tanh,
                              **variant['cnn_params'],
                              )
    exploration_policy = PolicyWrappedWithExplorationStrategy(
        exploration_strategy=es,
        policy=policy,
    )
    replay_buffer = ObsDictRelabelingBuffer(
        env=env,
        observation_key=observation_key,
        desired_goal_key=desired_goal_key,
        achieved_goal_key=achieved_goal_key,
        **variant['replay_buffer_kwargs']
    )
    algo_kwargs = variant['algo_kwargs']
    algo_kwargs['replay_buffer'] = replay_buffer
    base_kwargs = algo_kwargs['base_kwargs']
    base_kwargs['training_env'] = env
    base_kwargs['render'] = variant["render"]
    base_kwargs['render_during_eval'] = variant["render"]
    her_kwargs = algo_kwargs['her_kwargs']
    her_kwargs['observation_key'] = observation_key
    her_kwargs['desired_goal_key'] = desired_goal_key
    algorithm = HerTd3(
        env,
        qf1=qf1,
        qf2=qf2,
        policy=policy,
        target_qf1=target_qf1,
        target_qf2=target_qf2,
        target_policy=target_policy,
        exploration_policy=exploration_policy,
        **variant['algo_kwargs']
    )
    algorithm.to(ptu.device)
    algorithm.train()
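The baseline above wires runtime objects (buffer, env, render flags, goal keys) into the algorithm by mutating nested sub-dicts of `variant['algo_kwargs']` in place before splatting them into the constructor. A minimal self-contained sketch of that pattern (the helper name and dict layout are assumptions mirroring the launcher, not part of rlkit):

```python
def wire_runtime_kwargs(algo_kwargs, replay_buffer, env, render=False):
    """Mutate a nested kwargs dict in place, as the launcher above does.

    Assumes `algo_kwargs` already holds 'base_kwargs' and 'her_kwargs'
    sub-dicts, matching the layout the launcher expects.
    """
    algo_kwargs['replay_buffer'] = replay_buffer
    base_kwargs = algo_kwargs['base_kwargs']
    base_kwargs['training_env'] = env
    base_kwargs['render'] = render
    base_kwargs['render_during_eval'] = render
    return algo_kwargs

# In-place mutation means the caller's dict is updated too:
kwargs = {'base_kwargs': {}, 'her_kwargs': {}}
wire_runtime_kwargs(kwargs, replay_buffer='buf', env='dummy-env', render=True)
```

Because the same dict object is later splatted with `**variant['algo_kwargs']`, these mutations become constructor arguments without touching the call site.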


def HER_baseline_twin_sac_experiment(variant):
    import rlkit.torch.pytorch_util as ptu
    from rlkit.data_management.obs_dict_replay_buffer import \
        ObsDictRelabelingBuffer
    from rlkit.exploration_strategies.base import (
        PolicyWrappedWithExplorationStrategy
    )
    from rlkit.torch.her.her_twin_sac import HerTwinSAC
    from rlkit.torch.sac.policies import TanhCNNGaussianPolicy
    from rlkit.torch.networks import MergedCNN, CNN
    import torch
    from multiworld.core.image_env import ImageEnv
    from rlkit.misc.asset_loader import load_local_or_remote_file

    init_camera = variant.get("init_camera", None)
    presample_goals = variant.get('presample_goals', False)
    presampled_goals_path = get_presampled_goals_path(
        variant.get('presampled_goals_path', None))
    if 'env_id' in variant:
        import gym
        import multiworld
        multiworld.register_all_envs()
        env = gym.make(variant['env_id'])
    else:
        env = variant["env_class"](**variant['env_kwargs'])
    image_env = ImageEnv(
        env,
        variant.get('imsize'),
        reward_type='image_sparse',
        init_camera=init_camera,
        transpose=True,
        normalize=True,
    )
    if presample_goals:
        if presampled_goals_path is None:
            image_env.non_presampled_goal_img_is_garbage = True
            presampled_goals = variant['generate_goal_dataset_fctn'](
                env=image_env,
                **variant['goal_generation_kwargs']
            )
        else:
            presampled_goals = load_local_or_remote_file(
                presampled_goals_path
            ).item()
        del image_env
        env = ImageEnv(
            env,
            variant.get('imsize'),
            reward_type='image_distance',
            init_camera=init_camera,
            transpose=True,
            normalize=True,
            presampled_goals=presampled_goals,
        )
    else:
        env = image_env
    es = get_exploration_strategy(variant, env)
    observation_key = variant.get('observation_key', 'image_observation')
    desired_goal_key = variant.get('desired_goal_key', 'image_desired_goal')
    achieved_goal_key = desired_goal_key.replace("desired", "achieved")
    imsize = variant['imsize']
    action_dim = env.action_space.low.size
    qf1 = MergedCNN(input_width=imsize,
                    input_height=imsize,
                    output_size=1,
                    input_channels=3 * 2,
                    added_fc_input_size=action_dim,
                    **variant['cnn_params']
                    )
    qf2 = MergedCNN(input_width=imsize,
                    input_height=imsize,
                    output_size=1,
                    input_channels=3 * 2,
                    added_fc_input_size=action_dim,
                    **variant['cnn_params']
                    )
    policy = TanhCNNGaussianPolicy(input_width=imsize,
                                   input_height=imsize,
                                   added_fc_input_size=0,
                                   output_size=action_dim,
                                   input_channels=3 * 2,
                                   output_activation=torch.tanh,
                                   **variant['cnn_params'],
                                   )
    vf = CNN(input_width=imsize,
             input_height=imsize,
             output_size=1,
             input_channels=3 * 2,
             **variant['cnn_params']
             )
    target_vf = CNN(input_width=imsize,
                    input_height=imsize,
                    output_size=1,
                    input_channels=3 * 2,
                    **variant['cnn_params']
                    )
    replay_buffer = ObsDictRelabelingBuffer(
        env=env,
        observation_key=observation_key,
        desired_goal_key=desired_goal_key,
        achieved_goal_key=achieved_goal_key,
        **variant['replay_buffer_kwargs']
    )
    exploration_policy = PolicyWrappedWithExplorationStrategy(
        exploration_strategy=es,
        policy=policy,
    )
    algo_kwargs = variant['algo_kwargs']
    algo_kwargs['replay_buffer'] = replay_buffer
    base_kwargs = algo_kwargs['base_kwargs']
    base_kwargs['training_env'] = env
    base_kwargs['render'] = variant["render"]
    base_kwargs['render_during_eval'] = variant["render"]
    her_kwargs = algo_kwargs['her_kwargs']
    her_kwargs['observation_key'] = observation_key
    her_kwargs['desired_goal_key'] = desired_goal_key
    algorithm = HerTwinSAC(
        env,
        qf1=qf1,
        qf2=qf2,
        vf=vf,
        target_vf=target_vf,
        policy=policy,
        exploration_policy=exploration_policy,
        **variant['algo_kwargs']
    )
    algorithm.to(ptu.device)
    algorithm.train()


def get_state_experiment_video_save_function(rollout_function, env, policy, variant):
    from multiworld.core.image_env import ImageEnv
    from rlkit.core import logger
    from rlkit.envs.vae_wrappers import temporary_mode
    from rlkit.visualization.video import dump_video

    logdir = logger.get_snapshot_dir()
    save_period = variant.get('save_video_period', 50)
    do_state_exp = variant.get("do_state_exp", False)
    dump_video_kwargs = variant.get("dump_video_kwargs", dict())
    if do_state_exp:
        imsize = variant.get('imsize')
        dump_video_kwargs['imsize'] = imsize
        image_env = ImageEnv(
            env,
            imsize,
            init_camera=variant.get('init_camera', None),
            transpose=True,
            normalize=True,
        )

        def save_video(algo, epoch):
            if epoch % save_period == 0 or epoch == algo.num_epochs:
                filename = osp.join(logdir,
                                    'video_{epoch}_env.mp4'.format(epoch=epoch))
                dump_video(image_env, policy, filename, rollout_function,
                           **dump_video_kwargs)
    else:
        image_env = env
        dump_video_kwargs['imsize'] = env.imsize

        def save_video(algo, epoch):
            if epoch % save_period == 0 or epoch == algo.num_epochs:
                filename = osp.join(logdir,
                                    'video_{epoch}_env.mp4'.format(epoch=epoch))
                temporary_mode(
                    image_env,
                    mode='video_env',
                    func=dump_video,
                    args=(image_env, policy, filename, rollout_function),
                    kwargs=dump_video_kwargs
                )
                filename = osp.join(logdir,
                                    'video_{epoch}_vae.mp4'.format(epoch=epoch))
                temporary_mode(
                    image_env,
                    mode='video_vae',
                    func=dump_video,
                    args=(image_env, policy, filename, rollout_function),
                    kwargs=dump_video_kwargs
                )
    return save_video
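The epoch gate used by both `save_video` closures above (dump every `save_period` epochs, and always on the final epoch) can be checked in isolation:

```python
def should_save_video(epoch, save_period, num_epochs):
    # Mirrors the gating in save_video above: a video is dumped every
    # `save_period` epochs and unconditionally on the final epoch.
    return epoch % save_period == 0 or epoch == num_epochs

# With save_period=50 and a 70-epoch run, videos land on epochs 0, 50, 70:
saved = [e for e in range(71) if should_save_video(e, 50, 70)]

# The filename template used above:
fname = 'video_{epoch}_env.mp4'.format(epoch=50)
```

Note that epoch 0 always triggers a save (0 is divisible by any period), which is why a video exists before training makes any progress.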
# scripts/field/merStandAlone.py
# Created by MechAviv
# ID :: [910150002]
# Frozen Fairy Forest : Path of the Glowcaves
# Unhandled Message [47] Packet: 2F 01 00 00 00 B0 83 08 00 00 00 00 00 2E 02 00 00 00 00 00 80 05 BB 46 E6 17 02 00 00
| 34.833333 | 119 | 0.69378 | 45 | 209 | 3.222222 | 0.666667 | 0.303448 | 0.289655 | 0.22069 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0.401274 | 0.248804 | 209 | 5 | 120 | 41.8 | 0.522293 | 0.952153 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# angr_platforms/tricore/rr2_instr.py
#!/usr/bin/env python3
""" rr2_instr.py
Implementation of RR2 format instructions.
"""
from pyvex.lifting.util import Type, Instruction
from .rtl import * # pylint: disable=[wildcard-import, unused-wildcard-import]
from .logger import log_this
class RR2_MUL_Inst(Instruction):
""" Multiply instruction.
op = 0x73
op2 = 0x0A
User Status Flags: V, SV, AV, SAV.
"""
name = 'RR2_MUL'
op = "{0}{1}".format(bin(7)[2:].zfill(4), bin(3)[2:].zfill(4))
op2_1 = "{0}".format(bin(0)[2:].zfill(4))
op2_2 = "{0}".format(bin(0xa)[2:].zfill(4))
bin_format = op + 'b'*4 + 'a'*4 + op2_1 + op2_2 + 'c'*4 + 'i'*4
def parse(self, bitstrm):
data = Instruction.parse(self, bitstrm)
data = {"a": int(data['a'], 2),
"b": int(data['b'], 2),
"c": int(data['c'], 2)}
log_this(self.name, data, hex(self.addr))
return data
def get_dst_reg(self):
return "d{0}".format(self.data['c'])
def get_psw(self):
return self.get("psw", Type.int_32)
def get_d_b(self):
return self.get("d{0}".format(self.data['b']), Type.int_32)
def get_d_a(self):
return self.get("d{0}".format(self.data['a']), Type.int_32)
def fetch_operands(self):
return self.get_d_a(), self.get_d_b()
def compute_result(self, *args):
d_a = args[0]
d_b = args[1]
result = d_a.cast_to(Type.int_64) * d_b.cast_to(Type.int_64)
result = result & 0xffffffff
# set flags
c = 0
v = overflow_64(result).cast_to(Type.int_32)
av = advanced_overflow_64(result).cast_to(Type.int_32)
psw = self.get_psw()
cond_sv = (v == 0)
cond_sav = (av == 0)
sv = ((psw & SV_MASK) & cond_sv) | (1 & (cond_sv^1))
sav = ((psw & ASV_MASK) & cond_sav) | (1 & (cond_sav^1))
psw = set_usb(psw, c, v, sv, av, sav)
self.put(psw, "psw")
return result
def commit_result(self, res):
self.put(res, self.get_dst_reg())
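The `bin_format` strings above are assembled from nibble-wide binary fields (each rendered with `bin(x)[2:].zfill(4)`) concatenated with single-letter placeholders for the operand nibbles. The construction can be checked in isolation:

```python
# Reproduces the bin_format construction used by RR2_MUL above.
# 0x73 is the primary opcode; 'a', 'b', 'c' mark operand-register
# nibbles and 'i' the ignored nibble in the 32-bit RR2 encoding.
op = "{0}{1}".format(bin(7)[2:].zfill(4), bin(3)[2:].zfill(4))   # 0x73
op2_1 = "{0}".format(bin(0)[2:].zfill(4))
op2_2 = "{0}".format(bin(0xa)[2:].zfill(4))                      # op2 = 0x0A
bin_format = op + 'b' * 4 + 'a' * 4 + op2_1 + op2_2 + 'c' * 4 + 'i' * 4
```

Every RR2 instruction class follows the same recipe, differing only in the two `op2` nibbles; the full pattern always spans the 32-bit instruction word.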


class RR2_MULS_Inst(Instruction):
    """ Multiply, Saturated instruction.
        op = 0x73
        op2 = 0x8A
        User Status Flags: V, SV, AV, SAV.
    """
    name = 'RR2_MULS'
    op = "{0}{1}".format(bin(7)[2:].zfill(4), bin(3)[2:].zfill(4))
    op2_1 = "{0}".format(bin(8)[2:].zfill(4))
    op2_2 = "{0}".format(bin(0xa)[2:].zfill(4))
    bin_format = op + 'b'*4 + 'a'*4 + op2_1 + op2_2 + 'c'*4 + 'i'*4

    def parse(self, bitstrm):
        data = Instruction.parse(self, bitstrm)
        data = {"a": int(data['a'], 2),
                "b": int(data['b'], 2),
                "c": int(data['c'], 2)}
        log_this(self.name, data, hex(self.addr))
        return data

    @property
    def max_pos(self):
        return self.constant(INT32_MAX_POS, Type.int_32).cast_to(Type.int_64, signed=True)

    @property
    def max_neg(self):
        return self.constant(INT32_MAX_NEG, Type.int_32).cast_to(Type.int_64, signed=True)

    def get_dst_reg(self):
        return "d{0}".format(self.data['c'])

    def get_psw(self):
        return self.get("psw", Type.int_32)

    def get_d_b(self):
        return self.get("d{0}".format(self.data['b']), Type.int_32)

    def get_d_a(self):
        return self.get("d{0}".format(self.data['a']), Type.int_32)

    def fetch_operands(self):
        return self.get_d_a(), self.get_d_b()

    def compute_result(self, *args):
        d_a = args[0]
        d_b = args[1]
        result1 = d_a.cast_to(Type.int_64, signed=True) * d_b.cast_to(Type.int_64, signed=True)
        result = ssov32(result1, self.max_pos, self.max_neg)

        # set flags
        c = 0
        v = overflow_64(result).cast_to(Type.int_32)
        av = advanced_overflow_64(result).cast_to(Type.int_32)
        psw = self.get_psw()
        cond_sv = (v == 0)
        cond_sav = (av == 0)
        sv = ((psw & SV_MASK) & cond_sv) | (1 & (cond_sv^1))
        sav = ((psw & ASV_MASK) & cond_sav) | (1 & (cond_sav^1))
        psw = set_usb(psw, c, v, sv, av, sav)
        self.put(psw, "psw")

        return result

    def commit_result(self, res):
        self.put(res, self.get_dst_reg())
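`RR2_MULS` clamps the 64-bit signed product into 32-bit range via `ssov32` from `rtl.py` (not shown in this chunk, and operating on VEX IR expressions rather than Python ints). A plain-Python sketch of the *assumed* signed-saturation semantics:

```python
INT32_MAX_POS = 2**31 - 1
INT32_MAX_NEG = -2**31

def ssov32_sketch(value):
    """Signed saturate an arbitrary-precision product into int32 range.

    Hypothetical stand-in for rtl.ssov32, which implements the same idea
    symbolically over IR expressions; this version is only for intuition.
    """
    if value > INT32_MAX_POS:
        return INT32_MAX_POS
    if value < INT32_MAX_NEG:
        return INT32_MAX_NEG
    return value
```

When saturation fires, the TriCore semantics also set the sticky overflow flags (SV/SAV), which is what the `set_usb` call above records in the PSW.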


class RR2_MUL_6A_Inst(Instruction):
    """ Multiply instruction.
        op = 0x73
        op2 = 0x6A
        User Status Flags: V, SV, AV, SAV.
    """
    name = 'RR2_MUL_6A.U'
    op = "{0}{1}".format(bin(7)[2:].zfill(4), bin(3)[2:].zfill(4))
    op2_1 = "{0}".format(bin(6)[2:].zfill(4))
    op2_2 = "{0}".format(bin(0xa)[2:].zfill(4))
    bin_format = op + 'b'*4 + 'a'*4 + op2_1 + op2_2 + 'c'*4 + 'i'*4

    def parse(self, bitstrm):
        data = Instruction.parse(self, bitstrm)
        data = {"a": int(data['a'], 2),
                "b": int(data['b'], 2),
                "c": int(data['c'], 2)}
        log_this(self.name, data, hex(self.addr))
        return data

    def get_psw(self):
        return self.get("psw", Type.int_32)

    def get_d_b(self):
        return self.get("d{0}".format(self.data['b']), Type.int_32)

    def get_d_a(self):
        return self.get("d{0}".format(self.data['a']), Type.int_32)

    def fetch_operands(self):
        return self.get_d_a(), self.get_d_b()

    def compute_result(self, *args):
        d_a = args[0]
        d_b = args[1]
        result = d_a.cast_to(Type.int_64) * d_b.cast_to(Type.int_64)
        result_1 = result.cast_to(Type.int_32) & 0xffffffff
        result_2 = (result >> 32).cast_to(Type.int_32)
        self.put(result_1, "d{0}".format(self.data['c']))
        self.put(result_2, "d{0}".format(self.data['c']+1))

        # set flags
        c = 0
        v = overflow_64(result).cast_to(Type.int_32)
        av = advanced_overflow_64(result).cast_to(Type.int_32)
        psw = self.get_psw()
        cond_sv = (v == 0)
        cond_sav = (av == 0)
        sv = ((psw & SV_MASK) & cond_sv) | (1 & (cond_sv^1))
        sav = ((psw & ASV_MASK) & cond_sav) | (1 & (cond_sav^1))
        psw = set_usb(psw, c, v, sv, av, sav)
        self.put(psw, "psw")


class RR2_MUL_U_Inst(Instruction):
    """ Multiply Unsigned instruction.
        op = 0x73
        op2 = 0x68
        User Status Flags: V, SV, AV, SAV.
    """
    name = 'RR2_MUL.U'
    op = "{0}{1}".format(bin(7)[2:].zfill(4), bin(3)[2:].zfill(4))
    op2_1 = "{0}".format(bin(6)[2:].zfill(4))
    op2_2 = "{0}".format(bin(8)[2:].zfill(4))
    bin_format = op + 'b'*4 + 'a'*4 + op2_1 + op2_2 + 'c'*4 + 'i'*4

    def parse(self, bitstrm):
        data = Instruction.parse(self, bitstrm)
        data = {"a": int(data['a'], 2),
                "b": int(data['b'], 2),
                "c": int(data['c'], 2)}
        log_this(self.name, data, hex(self.addr))
        return data

    def get_psw(self):
        return self.get("psw", Type.int_32)

    def get_d_b(self):
        return self.get("d{0}".format(self.data['b']), Type.int_32)

    def get_d_a(self):
        return self.get("d{0}".format(self.data['a']), Type.int_32)

    def fetch_operands(self):
        return self.get_d_a(), self.get_d_b()

    def compute_result(self, *args):
        d_a = args[0]
        d_b = args[1]
        result = d_a.cast_to(Type.int_64) * d_b.cast_to(Type.int_64)
        result_1 = result.cast_to(Type.int_32) & 0xffffffff
        result_2 = (result >> 32).cast_to(Type.int_32)
        self.put(result_1, "d{0}".format(self.data['c']))
        self.put(result_2, "d{0}".format(self.data['c']+1))

        # set flags
        c = 0
        v = overflow_64(result).cast_to(Type.int_32)
        av = advanced_overflow_64(result).cast_to(Type.int_32)
        psw = self.get_psw()
        cond_sv = (v == 0)
        cond_sav = (av == 0)
        sv = ((psw & SV_MASK) & cond_sv) | (1 & (cond_sv^1))
        sav = ((psw & ASV_MASK) & cond_sav) | (1 & (cond_sav^1))
        psw = set_usb(psw, c, v, sv, av, sav)
        self.put(psw, "psw")


class RR2_MULS_U_Inst(Instruction):
    """ Multiply Unsigned, Saturated instruction.
        op = 0x73
        op2 = 0x88
        User Status Flags: V, SV, AV, SAV.
    """
    name = 'RR2_MULS.U'
    op = "{0}{1}".format(bin(7)[2:].zfill(4), bin(3)[2:].zfill(4))
    op2_1 = "{0}".format(bin(8)[2:].zfill(4))
    op2_2 = "{0}".format(bin(8)[2:].zfill(4))
    bin_format = op + 'b'*4 + 'a'*4 + op2_1 + op2_2 + 'c'*4 + 'i'*4

    def parse(self, bitstrm):
        data = Instruction.parse(self, bitstrm)
        data = {"a": int(data['a'], 2),
                "b": int(data['b'], 2),
                "c": int(data['c'], 2)}
        log_this(self.name, data, hex(self.addr))
        return data

    def get_dst_reg(self):
        return "d{0}".format(self.data['c'])

    def get_psw(self):
        return self.get("psw", Type.int_32)

    def get_d_b(self):
        return self.get("d{0}".format(self.data['b']), Type.int_32)

    def get_d_a(self):
        return self.get("d{0}".format(self.data['a']), Type.int_32)

    def fetch_operands(self):
        return self.get_d_a(), self.get_d_b()

    def compute_result(self, *args):
        d_a = args[0]
        d_b = args[1]
        result = d_a.cast_to(Type.int_64) * d_b.cast_to(Type.int_64)  # Unsigned
        result = suov32(result)

        # set flags
        c = 0
        v = overflow_64(result).cast_to(Type.int_32)
        av = advanced_overflow_64(result).cast_to(Type.int_32)
        psw = self.get_psw()
        cond_sv = (v == 0)
        cond_sav = (av == 0)
        sv = ((psw & SV_MASK) & cond_sv) | (1 & (cond_sv^1))
        sav = ((psw & ASV_MASK) & cond_sav) | (1 & (cond_sav^1))
        psw = set_usb(psw, c, v, sv, av, sav)
        self.put(psw, "psw")

        return result

    def commit_result(self, res):
        self.put(res, self.get_dst_reg())
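`RR2_MUL_6A` and `RR2_MUL.U` above keep the full 64-bit product by writing the low word to `d[c]` and the high word to `d[c+1]`. The splitting arithmetic can be sketched with plain integers (the IR version does the same with `cast_to` and `>>`):

```python
def split_u64_product(a, b):
    """Multiply two 32-bit values and split the 64-bit product into
    (low word, high word), mirroring the register pair write above."""
    result = (a & 0xFFFFFFFF) * (b & 0xFFFFFFFF)
    low = result & 0xFFFFFFFF
    high = (result >> 32) & 0xFFFFFFFF
    return low, high
```

A product that overflows 32 bits shows up entirely in the high word, which is exactly the case the `overflow_64` flag computation above is checking for.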
# monitor/collectors/__init__.py
from monitor.collectors.collector import Collector
from monitor.collectors.rpc_collector import RpcCollector
from monitor.collectors.ws_collector import WsCollector
| 41.25 | 57 | 0.890909 | 20 | 165 | 7.25 | 0.45 | 0.227586 | 0.434483 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072727 | 165 | 3 | 58 | 55 | 0.947712 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f6d5fe7565c0563bc240aaf9354b7f84685027cd | 17,022 | py | Python | tests/test_inotify.py | tatsuya4649/fly | f958d8655c01664b319fc6bcf52aaea0d33197b0 | [
"MIT"
] | 16 | 2021-11-05T08:38:20.000Z | 2021-12-07T07:25:17.000Z | tests/test_inotify.py | tatsuya4649/fly | f958d8655c01664b319fc6bcf52aaea0d33197b0 | [
"MIT"
] | null | null | null | tests/test_inotify.py | tatsuya4649/fly | f958d8655c01664b319fc6bcf52aaea0d33197b0 | [
"MIT"
] | 1 | 2021-12-02T03:07:46.000Z | 2021-12-02T03:07:46.000Z | import pytest
import httpx
import os
import shutil
from conftest import *

_HOST = "localhost"
_PORT = "1234"
_HTTP = "http"
_INOF = "test_inotify_file.txt"
_INOD = "ino/test_inotify_file.txt"
_TEST_MOUNT_POINT = "./tests/mnt"
_INOF_PATH = f"{_TEST_MOUNT_POINT}/{_INOF}"
_INOD_PATH = f"{_TEST_MOUNT_POINT}/{_INOD}"
_INOD2_PATH = f"{_TEST_MOUNT_POINT}/ino"


@pytest.fixture(scope="function", autouse=False)
def inotify_file_remove():
    if os.path.isfile(_INOF_PATH):
        os.remove(_INOF_PATH)
    yield
    if os.path.isfile(_INOF_PATH):
        os.remove(_INOF_PATH)


@pytest.fixture(scope="function", autouse=False)
def inotify_dir_remove():
    if os.path.isdir(os.path.dirname(_INOD_PATH)):
        shutil.rmtree(os.path.dirname(_INOD_PATH))
    yield
    if os.path.isdir(os.path.dirname(_INOD_PATH)):
        shutil.rmtree(os.path.dirname(_INOD_PATH))


@pytest.mark.asyncio
async def test_create(inotify_file_remove, newlog, fly_servers, emerge_log_size_check):
    _tmp_notice_size = fly_notice_log_size()
    _tmp_access_size = fly_access_log_size()
    async with httpx.AsyncClient(http1=True, timeout=1) as client:
        res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOF}")
        assert(res.status_code == 404)
    _tmp_access_size = await fly_access_increment_check(_tmp_access_size)

    # create file
    with open(_INOF_PATH, "w") as f:
        f.write("Hello new file!")
    await fly_notice_increment_check(_tmp_notice_size)

    async with httpx.AsyncClient(http1=True, timeout=1) as client:
        res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOF}")
        assert(res.status_code == 200)
    await fly_access_increment_check(_tmp_access_size)
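These tests verify inotify behavior indirectly, by polling the server's notice/access log files through helpers such as `fly_access_increment_check` (defined in `conftest.py`, not shown here). A self-contained sketch of that "wait for a file to grow past a recorded size" pattern, assuming the conftest helpers work roughly like this:

```python
import asyncio
import os

async def wait_for_growth(path, last_size, timeout=1.0, interval=0.05):
    """Poll `path` until its size exceeds `last_size`; return the new size.

    Hypothetical stand-in for the conftest increment-check helpers; the
    real helpers may differ in naming, timeout, and retry policy.
    """
    elapsed = 0.0
    while elapsed < timeout:
        size = os.path.getsize(path) if os.path.isfile(path) else 0
        if size > last_size:
            return size
        await asyncio.sleep(interval)
        elapsed += interval
    raise AssertionError(f"{path} did not grow beyond {last_size}")
```

Returning the new size lets each test thread the value forward (`_tmp_access_size = await ...`) so the next check measures only the growth caused by its own request.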
@pytest.mark.asyncio
async def test_delete(inotify_file_remove, newlog, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
_tmp_access_size = fly_access_log_size()
# create file
with open(_INOF_PATH, "w") as f:
f.write("Hello new file!")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOF}")
assert(res.status_code == 200)
assert(os.path.isfile(_INOF_PATH))
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
os.remove(_INOF_PATH)
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOF}")
assert(res.status_code == 404)
await fly_access_increment_check(_tmp_access_size)
@pytest.mark.asyncio
async def test_move(inotify_file_remove, newlog, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
_tmp_access_size = fly_access_log_size()
# create file
with open(_INOF_PATH, "w") as f:
f.write("Hello new file!")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOF}")
assert(res.status_code == 200)
assert(os.path.isfile(_INOF_PATH))
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# move file different directory
shutil.move(_INOF_PATH, '../' + _INOF)
await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOF}")
assert(res.status_code == 404)
await fly_access_increment_check(_tmp_access_size)
# move to mount directory
shutil.move('../' + _INOF, _INOF_PATH)
await fly_notice_increment_check(_tmp_notice_size)
@pytest.mark.asyncio
async def test_create_directory(inotify_dir_remove, newlog, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
_tmp_access_size = fly_access_log_size()
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# create directory
os.mkdir(os.path.dirname(_INOD_PATH))
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
with open(_INOD_PATH, "w") as f:
f.write("Hello new file!")
assert(os.path.isfile(_INOD_PATH))
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 200)
await fly_access_increment_check(_tmp_access_size)
@pytest.mark.asyncio
async def test_delete_directory(inotify_dir_remove, newlog, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
# create directory
os.mkdir(os.path.dirname(_INOD_PATH))
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
assert not os.path.isfile(_INOD_PATH)
with open(_INOD_PATH, "w") as f:
f.write("Hello new file!")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 200)
# delete directory
assert os.path.isdir(os.path.dirname(_INOD_PATH))
shutil.rmtree(os.path.dirname(_INOD_PATH))
assert not os.path.isdir(os.path.dirname(_INOD_PATH))
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 404)
@pytest.mark.asyncio
async def test_move_directory(inotify_dir_remove, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
_tmp_access_size = fly_access_log_size()
# create directory
os.mkdir(os.path.dirname(_INOD_PATH))
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
with open(_INOD_PATH, "w") as f:
f.write("Hello new file!")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
assert(os.path.isfile(_INOD_PATH))
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 200)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# move directory
shutil.move(_INOD_PATH, '../' + _INOF)
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 404)
await fly_access_increment_check(_tmp_access_size)
# move to mount directory
shutil.move('../' + _INOF, _INOD_PATH)
await fly_notice_increment_check(_tmp_notice_size)
@pytest.mark.asyncio
async def test_move_directory2(inotify_dir_remove, newlog, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
_tmp_access_size = fly_access_log_size()
# create directory
os.mkdir(os.path.dirname(_INOD_PATH))
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
with open(_INOD_PATH, "w") as f:
f.write("Hello new file!")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
assert(os.path.isfile(_INOD_PATH))
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 200)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# move directory
shutil.move(_INOD2_PATH, '../ino')
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 404)
await fly_access_increment_check(_tmp_access_size)
    # move the file back into the mount directory
shutil.move('../ino', _INOD2_PATH)
await fly_notice_increment_check(_tmp_notice_size)
@pytest.fixture(scope="function", autouse=False)
def make_mnt2():
if not os.path.isdir("tests/mnt2"):
os.mkdir("tests/mnt2")
if not os.path.isfile("tests/mnt2/hello"):
with open("tests/mnt2/hello", "w") as f:
f.write("Hello test!")
yield
@pytest.mark.asyncio
async def test_unmount_dir(inotify_dir_remove, make_mnt2, newlog, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
_tmp_access_size = fly_access_log_size()
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/hello")
assert(res.status_code == 200)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# move mount point
assert(os.path.isdir('tests/mnt2'))
if os.path.isdir("../mnt2"):
shutil.rmtree("../mnt2")
shutil.move("tests/mnt2", '../mnt2')
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/hello")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
    # move the mount point back
shutil.move("../mnt2", 'tests/mnt2')
    # verify the notice log did not increment
await fly_notice_noincrement_check(_tmp_notice_size)
_INODIR = "tests/mnt2/inod"
_INODIR_FILE = "tests/mnt2/inod/hello"
@pytest.mark.asyncio
async def test_mount_unmount(inotify_dir_remove, newlog, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
_tmp_access_size = fly_access_log_size()
# create directory
os.mkdir(os.path.dirname(_INOD_PATH))
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
with open(_INOD_PATH, "w") as f:
f.write("Hello new file!")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
assert(os.path.isfile(_INOD_PATH))
if os.path.isdir(_INODIR):
shutil.rmtree(_INODIR)
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
assert not os.path.isdir(_INODIR)
assert not os.path.isfile(_INODIR_FILE)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/inod/hello")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# make directory
os.mkdir(_INODIR)
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
    # create file
assert not os.path.isfile(_INODIR_FILE)
with open(_INODIR_FILE, "w") as f:
f.write("Hello test")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/inod/hello")
assert(res.status_code == 200)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
assert os.path.isdir(_INODIR)
assert os.path.isfile(_INODIR_FILE)
    # remove directory (by moving it away)
shutil.move(_INODIR, "./inod")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
assert not os.path.isdir(_INODIR)
assert not os.path.isfile(_INODIR_FILE)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/inod/hello")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
shutil.rmtree("./inod")
await fly_notice_noincrement_check(_tmp_notice_size)
# test HTTP
for i in range(1000):
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/{_INOD}")
assert(res.status_code == 200)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
@pytest.mark.asyncio
async def test_mount_move(make_mnt2, newlog, fly_servers, emerge_log_size_check):
_tmp_notice_size = fly_notice_log_size()
_tmp_access_size = fly_access_log_size()
assert os.path.isdir("tests/mnt2")
if os.path.isfile("tests/mnt2/test1"):
os.remove("tests/mnt2/test1")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/test1")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# create test file on mount point
if not os.path.isfile("tests/mnt2/test1"):
with open("tests/mnt2/test1", "w") as f:
f.write("Hello World")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
# test HTTP
for i in range(1000):
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/test1")
assert(res.status_code == 200)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# move mount point to different directory
shutil.move("tests/mnt2", "../mnt2")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/test1")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
    # move the mount point back
shutil.move("../mnt2", "tests/mnt2")
_tmp_notice_size = await fly_notice_noincrement_check(_tmp_notice_size)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/test1")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
assert(os.path.isfile("tests/mnt2/test1"))
os.remove("tests/mnt2/test1")
await fly_notice_noincrement_check(_tmp_notice_size)
@pytest.mark.asyncio
async def test_mount_delete(make_mnt2, newlog, fly_servers, emerge_log_size_check):
_tmp_access_size = fly_access_log_size()
_tmp_notice_size = fly_notice_log_size()
assert os.path.isdir("tests/mnt2")
if os.path.isfile("tests/mnt2/test1"):
os.remove("tests/mnt2/test1")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/test1")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
# create test file on mount point
if not os.path.isfile("tests/mnt2/test1"):
with open("tests/mnt2/test1", "w") as f:
f.write("Hello World")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
# test HTTP
for i in range(1000):
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/test1")
assert(res.status_code == 200)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
    # delete the mount point
shutil.rmtree("tests/mnt2")
_tmp_notice_size = await fly_notice_increment_check(_tmp_notice_size)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/test1")
assert(res.status_code == 404)
_tmp_access_size = await fly_access_increment_check(_tmp_access_size)
    # recreate the mount point
assert not os.path.isdir("tests/mnt2")
os.mkdir("tests/mnt2")
_tmp_notice_size = await fly_notice_noincrement_check(_tmp_notice_size)
# create test file on mount point
assert not os.path.isfile("tests/mnt2/test1")
with open("tests/mnt2/test1", "w") as f:
f.write("Hello World")
_tmp_notice_size = await fly_notice_noincrement_check(_tmp_notice_size)
# test HTTP
async with httpx.AsyncClient(http1=True, timeout=1) as client:
res = await client.get(f"{_HTTP}://{_HOST}:{_PORT}/test1")
assert(res.status_code == 404)
await fly_access_increment_check(_tmp_access_size)
assert(os.path.isfile("tests/mnt2/test1"))
os.remove("tests/mnt2/test1")
await fly_notice_noincrement_check(_tmp_notice_size)
# cst_data/migrations/0001_initial.py (iPelino/cst-research-api, MIT)
# Generated by Django 3.2.3 on 2021-06-07 15:41
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Architecture_departments',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('department_code', models.CharField(max_length=100)),
('department_name', models.CharField(max_length=100)),
],
),
migrations.CreateModel(
name='College_schools',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('school_code', models.CharField(max_length=100)),
('school_name', models.CharField(max_length=100)),
],
),
migrations.CreateModel(
name='engineering_departments',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('department_code', models.CharField(max_length=100)),
('department_name', models.CharField(max_length=100)),
('school', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.college_schools')),
],
),
migrations.CreateModel(
name='ict_departments',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('department_code', models.CharField(max_length=100)),
('department_name', models.CharField(max_length=100)),
('school', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.college_schools')),
],
),
migrations.CreateModel(
name='mining_and_geology_departments',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('department_code', models.CharField(max_length=100)),
('department_name', models.CharField(max_length=100)),
('school', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.college_schools')),
],
),
migrations.CreateModel(
name='science_departments',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('department_code', models.CharField(max_length=100)),
('department_name', models.CharField(max_length=100)),
('school', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.college_schools')),
],
),
migrations.CreateModel(
name='Physics_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.science_departments')),
],
),
migrations.CreateModel(
name='Mining_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.mining_and_geology_departments')),
],
),
migrations.CreateModel(
name='MEE_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.engineering_departments')),
],
),
migrations.CreateModel(
name='Mathematics_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.science_departments')),
],
),
migrations.CreateModel(
name='Information_Technology_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.ict_departments')),
],
),
migrations.CreateModel(
name='Information_systems_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.ict_departments')),
],
),
migrations.CreateModel(
name='Geology_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.mining_and_geology_departments')),
],
),
migrations.CreateModel(
name='Geography_and_urban_planning_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.architecture_departments')),
],
),
migrations.CreateModel(
name='Estate_management_and_valuation_models',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.architecture_departments')),
],
),
migrations.CreateModel(
name='Electrical_and_Electronics_engineering_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.engineering_departments')),
],
),
migrations.CreateModel(
name='Construction_and_management_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.architecture_departments')),
],
),
migrations.CreateModel(
name='Computer_sceince_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.ict_departments')),
],
),
migrations.CreateModel(
name='Computer_and_software_engineering_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.ict_departments')),
],
),
migrations.CreateModel(
name='Civil_environmental_and_geomatic_engineering_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.engineering_departments')),
],
),
migrations.CreateModel(
name='Chemistry_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.science_departments')),
],
),
migrations.CreateModel(
name='Biology_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.science_departments')),
],
),
migrations.CreateModel(
name='Archtecture_modules',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('module_code', models.CharField(max_length=100)),
('module_name', models.CharField(max_length=100)),
('department', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.architecture_departments')),
],
),
migrations.AddField(
model_name='architecture_departments',
name='school',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='cst_data.college_schools'),
),
]
# loldib/getratings/models/NA/na_shen/__init__.py (koliupy/loldib, Apache-2.0)
from .na_shen_top import *
from .na_shen_jng import *
from .na_shen_mid import *
from .na_shen_bot import *
from .na_shen_sup import *
| 23.333333 | 27 | 0.75 | 25 | 140 | 3.8 | 0.36 | 0.315789 | 0.526316 | 0.673684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178571 | 140 | 5 | 28 | 28 | 0.826087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
# test/pyaz/dls/account/__init__.py (bigdatamoore/py-az-cli, MIT)
import json, subprocess
from ...pyaz_utils import get_cli_name, get_params
def create(account, resource_group=None, location=None, default_group=None, tags=None, encryption_type=None, key_vault_id=None, key_name=None, key_version=None, disable_encryption=None, tier=None):
params = get_params(locals())
command = "az dls account create " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        print(stdout)
        return json.loads(stdout)
    else:
        print(stderr)
        raise Exception(stderr)
def update(account, resource_group=None, tags=None, default_group=None, firewall_state=None, allow_azure_ips=None, trusted_id_provider_state=None, tier=None, key_version=None):
params = get_params(locals())
command = "az dls account update " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        print(stdout)
        return json.loads(stdout)
    else:
        print(stderr)
        raise Exception(stderr)
def list(resource_group=None):
params = get_params(locals())
command = "az dls account list " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        print(stdout)
        return json.loads(stdout)
    else:
        print(stderr)
        raise Exception(stderr)
def delete(account, resource_group=None):
params = get_params(locals())
command = "az dls account delete " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        print(stdout)
        return json.loads(stdout)
    else:
        print(stderr)
        raise Exception(stderr)
def show(account, resource_group=None):
params = get_params(locals())
command = "az dls account show " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        print(stdout)
        return json.loads(stdout)
    else:
        print(stderr)
        raise Exception(stderr)
def enable_key_vault(account, resource_group=None):
params = get_params(locals())
command = "az dls account enable-key-vault " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        print(stdout)
        return json.loads(stdout)
    else:
        print(stderr)
        raise Exception(stderr)
# config/js_routes.py (girardinsamuel/masonite-js-routes, MIT)
""" Masonite JS Routes Settings """
"""
|--------------------------------------------------------------------------
| Filters
|--------------------------------------------------------------------------
|
| Filtering routes is completely optional. If you want to pass all of your routes to your
| JavaScript by default, leave the default values unchanged.
| You can either define 'only' or 'except' with a list of route name patterns.
| You can also optionally define multiple groups of included routes using 'groups' key.
|
|
"""
FILTERS = {"only": [], "except": [], "groups": {}}
# FILTERS = {
# "only": ["home", "api.*"],
# "except": ["debugbar.*", "horizon.*", "admin.*"],
# "groups": {
# "admin": ["admin.*", "posts.*"],
# "author": ["posts.*"]
# }
# }
"""
|--------------------------------------------------------------------------
| Skip JS route() function
|--------------------------------------------------------------------------
|
| If you only want to use the @routes directive to make your app's routes available in JavaScript,
| but don't need the route() helper function, you can set this parameter to True.
|
"""
SKIP_ROUTE_FUNCTION = False
""" Masonite JS Routes Settings """
"""
|--------------------------------------------------------------------------
| Filters
|--------------------------------------------------------------------------
|
| Filtering routes is completely optional. If you want to pass all of your routes to your
| JavaScript by default, let the default values.
| You can either define 'only' or 'except' with a list of route name patterns.
| You can also optionally define multiple groups of included routes using 'groups' key.
|
|
"""
FILTERS = {"only": [], "except": [], "groups": {}}
# FILTERS = {
# "only": ["home", "api.*"],
# "except": ["debugbar.*", "horizon.*", "admin.*"],
# "groups": {
# "admin": ["admin.*", "posts.*"],
# "author": ["posts.*"]
# }
# }
"""
|--------------------------------------------------------------------------
| Skip JS route() function
|--------------------------------------------------------------------------
|
| If you only want to use the @routes directive to make your app's routes available in JavaScript,
| but don't need the route() helper function, you can set this parameter to True.
|
"""
SKIP_ROUTE_FUNCTION = False
| 34.173913 | 98 | 0.464801 | 228 | 2,358 | 4.789474 | 0.298246 | 0.032967 | 0.029304 | 0.043956 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0.156913 | 2,358 | 68 | 99 | 34.676471 | 0.549296 | 0.175148 | 0 | 1 | 0 | 0 | 0.173913 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# post_optimization_studies/mad_analyses/four_cuts_eff_flow_chart/Output/Histos/MadAnalysis5job_0/selection_14.py (sheride/axion_pheno, MIT)
def selection_14():
# Library import
import numpy
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
# Library version
matplotlib_version = matplotlib.__version__
numpy_version = numpy.__version__
# Histo binning
xBinning = numpy.linspace(0.0,8000.0,81,endpoint=True)
# Creating data sequence: middle of each bin
xData = numpy.array([50.0,150.0,250.0,350.0,450.0,550.0,650.0,750.0,850.0,950.0,1050.0,1150.0,1250.0,1350.0,1450.0,1550.0,1650.0,1750.0,1850.0,1950.0,2050.0,2150.0,2250.0,2350.0,2450.0,2550.0,2650.0,2750.0,2850.0,2950.0,3050.0,3150.0,3250.0,3350.0,3450.0,3550.0,3650.0,3750.0,3850.0,3950.0,4050.0,4150.0,4250.0,4350.0,4450.0,4550.0,4650.0,4750.0,4850.0,4950.0,5050.0,5150.0,5250.0,5350.0,5450.0,5550.0,5650.0,5750.0,5850.0,5950.0,6050.0,6150.0,6250.0,6350.0,6450.0,6550.0,6650.0,6750.0,6850.0,6950.0,7050.0,7150.0,7250.0,7350.0,7450.0,7550.0,7650.0,7750.0,7850.0,7950.0])
# Creating weights for histo: y15_TET_0
y15_TET_0_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,6.93947462676,23.6474296898,38.9838698148,47.2293634303,52.8955590429,54.6601176766,53.9190782504,51.2456403205,49.0389220291,44.9653251833,40.9203683153,37.1374392445,33.0106024399,29.9564168047,26.2635516641,23.56964175,20.9084878105,17.9034301374,15.9055156843,13.5391375166,11.8523748227,10.1492361414,8.97423305124,7.84016992935,6.51778295328,5.53110771726,5.07666406913,4.22918872534,3.59460601669,3.11969238442,2.59974358702,2.28859302794,1.85462016397,1.72770346224,1.46977646195,1.14224951556,1.011238817,0.867945727949,0.75740541354,0.655053492792,0.442161257634,0.466725638614,0.433972863974,0.278397744436,0.212892395157,0.225174625647,0.184233777348,0.147387045878,0.114634351239,0.0695994261091,0.102352120749,0.0982580039187,0.0736935029391,0.0491290019594,0.0450349251294,0.0409408482995,0.0327526746396,0.0122822544898,0.0122822544898,0.00409408482995,0.00409408482995,0.00409408482995,0.0122822544898,0.0,0.00818816965989,0.0,0.0,0.00409408482995,0.00409408482995,0.00409408482995,0.00818816965989,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_1
y15_TET_1_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0121704927804,0.0242994195771,0.0,0.0,0.0121313836425,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_2
y15_TET_2_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.220879096585,0.441761829362,0.230809248523,0.120544402743,0.0502051137366,0.0501796190701,0.0100546543364,0.0301461580631,0.0100696701578,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_3
y15_TET_3_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.511511761888,1.38040517552,1.56756688076,1.17169068658,0.676676643777,0.330056636725,0.192540760371,0.0659724728581,0.0770204073513,0.0274921778746,0.00550370717489,0.0109833557686,0.00549442824894,0.0,0.010972805256,0.0,0.00549598015337,0.0,0.0,0.0,0.0,0.00551421706168,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_4
y15_TET_4_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0917632079944,0.633559011637,0.989790504888,1.05102433758,0.744094801675,0.434199933413,0.213154808046,0.115462739059,0.0631728425952,0.0365086204957,0.0157838573059,0.0197451252806,0.0078951112346,0.00493792772495,0.00493571915759,0.000984219383707,0.00197349803006,0.000983908741293,0.00098817275936,0.0,0.00197004649222,0.000985428684564,0.000988894652248,0.0,0.0,0.0,0.000986794308699,0.0,0.0,0.000986290466745,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_5
y15_TET_5_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0151205420752,0.111653684494,0.261657168041,0.343094295266,0.331480637049,0.238964801687,0.16257873154,0.0804142402529,0.0380674438393,0.022441684025,0.0138683267656,0.00731159527149,0.00352862875846,0.0030259033186,0.0022693930793,0.000756678686561,0.000755416732519,0.00100723257957,0.000756493034476,0.0,0.000504272673936,0.00025201786373,0.0,0.0,0.000251958166982,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_6
y15_TET_6_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00372534281898,0.0183210275661,0.0360822271564,0.0684229371749,0.0853200768397,0.0930575474769,0.0821456755683,0.0710181160836,0.0538331497211,0.0357861528456,0.0266283032915,0.0197457052388,0.0137463515593,0.00743379452655,0.00229388610822,0.00200115782904,0.0014346567846,0.000573726553247,0.000572941480461,0.0,0.000572185899139,0.0,0.000286183773284,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_7
y15_TET_7_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000151233502217,0.00110138422607,0.00220321730341,0.00414651744629,0.00401822686611,0.00382183410569,0.00369232395761,0.00379994810059,0.00488093704425,0.00507276962655,0.00556959057563,0.00427551587877,0.00410439541507,0.00282801073919,0.00200744564252,0.00125294843654,0.000840630970017,0.00049662690786,0.000216079142796,0.000151277549154,0.000108084058385,2.15933968115e-05,2.16292043309e-05,2.16220838984e-05,2.16751874889e-05,0.0,4.31049358904e-05,0.0,0.0,0.0,0.0,0.0,0.0,2.15827559768e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_8
y15_TET_8_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,5.6858529886e-05,8.51343800459e-05,0.000226838278621,8.52280444334e-05,0.000283801015335,0.000312017486312,0.000227213054959,0.000113571381062,0.000141943835613,0.000198131509915,0.000283985582288,0.000396740102387,0.000510742936541,0.000312537184087,0.000566361289355,0.000284116249157,0.000340548894141,0.00028235298885,0.000170249130364,0.000310712005502,0.000140637552984,5.67977549434e-05,8.48283374507e-05,2.84489095189e-05,5.67791349145e-05,0.0,5.67987794904e-05,0.000113523850988,5.67183451234e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,2.79727327207e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_9
y15_TET_9_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_10
y15_TET_10_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,2.10674221742,0.0,1.0529581672,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_11
y15_TET_11_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.691829993515,5.2978182267,5.06681435217,5.06814008347,1.38077450405,0.690627997143,0.461033240486,0.23012854603,0.0,0.0,0.230673171817,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_12
y15_TET_12_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.332435765982,1.60580251853,2.13237520421,2.51968650854,2.63087350198,1.38477604996,0.719744800729,0.332138681087,0.249261343098,0.138402554805,0.110813435051,0.0277263950555,0.0276696169506,0.0554236624816,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_13
y15_TET_13_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0604818175633,0.151143961499,0.403213391549,0.453828545279,0.746194426083,0.645139387342,0.534350744552,0.231751792407,0.171464208753,0.0705768526515,0.0706294696799,0.0201750558134,0.0100953567379,0.0100429581886,0.0,0.0,0.0100805669227,0.0,0.0,0.0,0.0,0.0100975172526,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_14
y15_TET_14_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.011317582905,0.0509394803948,0.0622634731433,0.0989753324578,0.101844295309,0.152781313339,0.198036424467,0.251867683062,0.141452626708,0.118830765363,0.107487496914,0.0622151877075,0.0452703470099,0.0424225451063,0.0113235810719,0.00848724455472,0.0,0.00283014097637,0.00282347950995,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_15
y15_TET_15_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00305829576001,0.00304355754346,0.0,0.00612419346139,0.00457451903961,0.00304894934329,0.0,0.00611716001098,0.00304400548124,0.0075950710189,0.0106366087057,0.0182773386486,0.00606803813798,0.0106546916819,0.00455394344924,0.00306899308143,0.00304183197841,0.00610962543756,0.00151115650058,0.00153821477883,0.0,0.0,0.00151265396011,0.0,0.0,0.00152449653668,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y15_TET_16
y15_TET_16_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000180553030001,0.0,0.000721759378433,0.000180626138525,0.000180734319121,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000180547139741,0.00252864058845,0.00126248642138,0.000180657129763,0.000361173239767,0.00108292126116,0.00054161443182,0.000360506985889,0.000361704556638,0.000361583017085,0.00036167183297,0.000361395760708,0.0,0.0,0.00018020300225,0.0,0.0,0.000180766965792,0.0,0.0,0.0,0.0,0.000180755031278,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating a new Canvas
fig = plt.figure(figsize=(12,6),dpi=80)
frame = gridspec.GridSpec(1,1,right=0.7)
pad = fig.add_subplot(frame[0])
# Creating a new Stack
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights,\
label=r"$bg\_dip\_1600\_inf$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#e5e5e5", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights,\
label=r"$bg\_dip\_1200\_1600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#f2f2f2", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights,\
label=r"$bg\_dip\_800\_1200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#ccc6aa", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights,\
label=r"$bg\_dip\_600\_800$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#ccc6aa", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights,\
label=r"$bg\_dip\_400\_600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#c1bfa8", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights,\
label=r"$bg\_dip\_200\_400$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#bab5a3", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights,\
label=r"$bg\_dip\_100\_200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#b2a596", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights,\
label=r"$bg\_dip\_0\_100$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#b7a39b", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights,\
label=r"$bg\_vbf\_1600\_inf$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#ad998c", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights,\
label=r"$bg\_vbf\_1200\_1600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#9b8e82", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights,\
label=r"$bg\_vbf\_800\_1200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#876656", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights,\
label=r"$bg\_vbf\_600\_800$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#afcec6", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights,\
label=r"$bg\_vbf\_400\_600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#84c1a3", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights,\
label=r"$bg\_vbf\_200\_400$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#89a8a0", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights,\
label=r"$bg\_vbf\_100\_200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#829e8c", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights+y15_TET_1_weights,\
label=r"$bg\_vbf\_0\_100$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#adbcc6", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y15_TET_0_weights,\
label="$signal$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#7a8e99", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
# Axis
plt.rc('text',usetex=False)
plt.xlabel(r"TET",\
fontsize=16,color="black")
plt.ylabel(r"$\mathrm{Events}$ $(\mathcal{L}_{\mathrm{int}} = 40.0\ \mathrm{fb}^{-1})$ ",\
fontsize=16,color="black")
# Boundary of y-axis
ymax=(y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights).max()*1.1
ymin=0 # linear scale
#ymin=min([x for x in (y15_TET_0_weights+y15_TET_1_weights+y15_TET_2_weights+y15_TET_3_weights+y15_TET_4_weights+y15_TET_5_weights+y15_TET_6_weights+y15_TET_7_weights+y15_TET_8_weights+y15_TET_9_weights+y15_TET_10_weights+y15_TET_11_weights+y15_TET_12_weights+y15_TET_13_weights+y15_TET_14_weights+y15_TET_15_weights+y15_TET_16_weights) if x])/100. # log scale
plt.gca().set_ylim(ymin,ymax)
# Log/Linear scale for X-axis
plt.gca().set_xscale("linear")
#plt.gca().set_xscale("log",nonpositive="clip")
# Log/Linear scale for Y-axis
plt.gca().set_yscale("linear")
#plt.gca().set_yscale("log",nonpositive="clip")
# Legend
plt.legend(bbox_to_anchor=(1.05,1), loc=2, borderaxespad=0.)
# Saving the image
plt.savefig('../../HTML/MadAnalysis5job_0/selection_14.png')
plt.savefig('../../PDF/MadAnalysis5job_0/selection_14.png')
plt.savefig('../../DVI/MadAnalysis5job_0/selection_14.eps')
# Running!
if __name__ == '__main__':
selection_14()
| 109.659794 | 1,090 | 0.717872 | 4,681 | 21,274 | 3.106815 | 0.121982 | 0.283573 | 0.411951 | 0.53249 | 0.586468 | 0.583373 | 0.583373 | 0.566733 | 0.56412 | 0.562264 | 0 | 0.401213 | 0.077371 | 21,274 | 193 | 1,091 | 110.227979 | 0.339719 | 0.063505 | 0 | 0.185841 | 0 | 0.00885 | 0.05124 | 0.010208 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00885 | false | 0 | 0.035398 | 0 | 0.044248 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
61b822aaa55e5ba1d9a9f950857a5b3082a3952c | 89 | py | Python | engine/station_control/test_plans/instruments/generic/__init__.py | geeklevi/PythonElectron | 0a01b8842a56f91338de6c341bb1c2037aaae359 | [
"CC0-1.0"
] | null | null | null | engine/station_control/test_plans/instruments/generic/__init__.py | geeklevi/PythonElectron | 0a01b8842a56f91338de6c341bb1c2037aaae359 | [
"CC0-1.0"
] | null | null | null | engine/station_control/test_plans/instruments/generic/__init__.py | geeklevi/PythonElectron | 0a01b8842a56f91338de6c341bb1c2037aaae359 | [
"CC0-1.0"
] | null | null | null | from .generic_instrument import *
from .generic_osa import *
from .generic_smu import *
| 17.8 | 33 | 0.786517 | 12 | 89 | 5.583333 | 0.5 | 0.492537 | 0.507463 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146067 | 89 | 4 | 34 | 22.25 | 0.881579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
4ee3cd7f7d59a674b97c628363ee1ca3e8487e75 | 2,965 | py | Python | tests/test_cli.py | deathwebo/diffino | 3e5fa73bde57a40a347d9f5a869e5b66ec440511 | [
"MIT"
] | 4 | 2017-10-21T08:20:40.000Z | 2019-02-22T11:02:11.000Z | tests/test_cli.py | deathwebo/diffino | 3e5fa73bde57a40a347d9f5a869e5b66ec440511 | [
"MIT"
] | null | null | null | tests/test_cli.py | deathwebo/diffino | 3e5fa73bde57a40a347d9f5a869e5b66ec440511 | [
"MIT"
] | null | null | null | from subprocess import Popen, PIPE
def test_md5_csv():
p = Popen(['diffino', 'before_dataset.csv', 'after_dataset.csv', '--mode', 'md5'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_md5_zip():
p = Popen(['diffino', 'before_dataset.zip', 'after_dataset.zip', '--mode', 'md5'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_s3_csv():
p = Popen(['diffino', 's3://bucket/before_dataset.csv', 's3://bucket/after_dataset.csv', '--mode', 'md5'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_s3_bucket_md5():
p = Popen(['diffino', 's3://bucket/before_dataset', 's3://bucket/after_dataset', '--mode', 'md5'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_csv():
p = Popen(['diffino', 'before_dataset.csv', 'after_dataset.csv', '--mode', 'pandas'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_csv_numeric_false():
p = Popen(['diffino', 'before_dataset.csv', 'after_dataset.csv', '--mode', 'pandas', '--convert-numeric', 'false'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_csv_cols():
    p = Popen(['diffino', 'before_dataset.csv', 'after_dataset.csv', '--mode', 'pandas', '--cols', 'id', 'name'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_output_csv_local():
p = Popen(['diffino', 'file_1.csv', 'file_2.csv', '--output', 'diff.csv'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_output_xlsx_local():
p = Popen(['diffino', 'file_1.csv', 'file_2.csv', '--output', 'diff.xlsx'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_output_json_local():
p = Popen(['diffino', 'file_1.csv', 'file_2.csv', '--output', 'diff.json'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_output_csv_s3():
p = Popen(['diffino', 'file_1.csv', 'file_2.csv', '--output', 's3://bucket/diff.csv'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_output_xlsx_s3():
p = Popen(['diffino', 'file_1.csv', 'file_2.csv', '--output', 's3://bucket/diff.xlsx'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
def test_pandas_output_json_s3():
p = Popen(['diffino', 'file_1.csv', 'file_2.csv', '--output', 's3://bucket/diff.json'], stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate()
raise Exception('Finish test!')
| 31.88172 | 145 | 0.660708 | 396 | 2,965 | 4.775253 | 0.098485 | 0.048123 | 0.089371 | 0.137493 | 0.920148 | 0.906399 | 0.906399 | 0.870439 | 0.870439 | 0.870439 | 0 | 0.012272 | 0.148061 | 2,965 | 92 | 146 | 32.228261 | 0.736342 | 0 | 0 | 0.490566 | 0 | 0 | 0.302192 | 0.051265 | 0 | 0 | 0 | 0 | 0 | 1 | 0.245283 | false | 0 | 0.018868 | 0 | 0.264151 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4ef5ffa7f8c36f72de90540c46e83166fb194312 | 210 | py | Python | pytextrank/__init__.py | nocluebutalotofit/pytextrank | 76e07ff4cb768571ec3c7ff6633110545c38c392 | [
"Apache-2.0"
] | null | null | null | pytextrank/__init__.py | nocluebutalotofit/pytextrank | 76e07ff4cb768571ec3c7ff6633110545c38c392 | [
"Apache-2.0"
] | null | null | null | pytextrank/__init__.py | nocluebutalotofit/pytextrank | 76e07ff4cb768571ec3c7ff6633110545c38c392 | [
"Apache-2.0"
] | null | null | null | from .pytextrank import json_iter, limit_keyphrases, limit_sentences, make_sentence, normalize_key_phrases, parse_doc, pretty_print, rank_kernel, render_ranks, text_rank, top_sentences, top_keywords_sentences
| 70 | 208 | 0.866667 | 29 | 210 | 5.793103 | 0.827586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07619 | 210 | 2 | 209 | 105 | 0.865979 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
f60d5bfb8868b31e9521950e239094acdc81a1de | 2,730 | py | Python | Projeto/LMS/curriculo/migrations/0001_initial.py | groupope/Projeto_LMS-oficial- | 4c0311c04ce05e5c36d45a5145315e7ebc51fa9c | [
"Apache-2.0"
] | null | null | null | Projeto/LMS/curriculo/migrations/0001_initial.py | groupope/Projeto_LMS-oficial- | 4c0311c04ce05e5c36d45a5145315e7ebc51fa9c | [
"Apache-2.0"
] | null | null | null | Projeto/LMS/curriculo/migrations/0001_initial.py | groupope/Projeto_LMS-oficial- | 4c0311c04ce05e5c36d45a5145315e7ebc51fa9c | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.0.2 on 2018-05-02 14:28
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Alunos',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('login', models.CharField(max_length=50, verbose_name='Login')),
('senha', models.CharField(max_length=50, verbose_name='Senha')),
('nome', models.CharField(max_length=100, verbose_name='Nome')),
('email', models.CharField(max_length=100, verbose_name='Email')),
('celular', models.CharField(max_length=20, verbose_name='Celular')),
('dt_expiracao', models.DateField(max_length=10, verbose_name='Data Expiracao')),
('ra', models.CharField(max_length=7, verbose_name='RA')),
('foto', models.CharField(blank=True, max_length=100, null=True, verbose_name='Foto')),
],
),
migrations.CreateModel(
name='Coordenador',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('login', models.CharField(max_length=50, verbose_name='Login')),
('senha', models.CharField(max_length=50, verbose_name='Senha')),
('nome', models.CharField(max_length=100, verbose_name='Nome')),
('email', models.CharField(max_length=100, verbose_name='Email')),
('celular', models.CharField(max_length=20, verbose_name='Celular')),
('dt_expiracao', models.DateField(max_length=10, verbose_name='Data Expiracao')),
],
),
migrations.CreateModel(
name='Professor',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('login', models.CharField(max_length=50, verbose_name='Login')),
('senha', models.CharField(max_length=50, verbose_name='Senha')),
('nome', models.CharField(max_length=100, verbose_name='Nome')),
('email', models.CharField(max_length=100, verbose_name='Email')),
('celular', models.CharField(max_length=20, verbose_name='Celular')),
('dt_expiracao', models.DateField(max_length=10, verbose_name='Data Expiracao')),
('apelido', models.CharField(max_length=25, verbose_name='Apelido')),
],
),
]
| 50.555556 | 115 | 0.578755 | 282 | 2,730 | 5.411348 | 0.212766 | 0.173001 | 0.200524 | 0.267366 | 0.745085 | 0.745085 | 0.745085 | 0.745085 | 0.745085 | 0.745085 | 0 | 0.031706 | 0.272161 | 2,730 | 53 | 116 | 51.509434 | 0.736286 | 0.016484 | 0 | 0.717391 | 1 | 0 | 0.113308 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.021739 | 0 | 0.108696 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
f64c82d26a1c6570f6e05e8e0bec886ae8a9930c | 722 | py | Python | tests/test_data.py | umberto10/mcl-python | 0f486efbab425be3ce66ed8c53e44a4aa3f95b7d | [
"MIT"
] | 1 | 2020-10-29T02:32:50.000Z | 2020-10-29T02:32:50.000Z | tests/test_data.py | umberto10/mcl-python | 0f486efbab425be3ce66ed8c53e44a4aa3f95b7d | [
"MIT"
] | null | null | null | tests/test_data.py | umberto10/mcl-python | 0f486efbab425be3ce66ed8c53e44a4aa3f95b7d | [
"MIT"
] | 3 | 2020-10-16T05:55:02.000Z | 2022-02-13T21:27:38.000Z | G1_STR = b"1 3685416753713387016781088315183077757961620795782546409894578378688607592378376318836054947676345821548104185464507 1339506544944476473020471379941921221584933875938349620426543736416511423956333506472724655353366534992391756441569"
G2_STR = b"1 352701069587466618187139116011060144890029952792775240219908644239793785735715026873347600343865175952761926303160 3059144344244213709971259814753781636986470325476647558659373206291635324768958432433509563104347017837885763365758 1985150602287291935568054521177171638300868978215655730859378665066344726373823718423869104263333984641494340347905 927553665492332455747201965776037880757740193453592970025027978793976877002675564980949289727957565575433344219582"
| 240.666667 | 475 | 0.975069 | 14 | 722 | 50.142857 | 0.785714 | 0.011396 | 0.014245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.974648 | 0.016621 | 722 | 2 | 476 | 361 | 0.014085 | 0 | 0 | 0 | 0 | 0 | 0.963989 | 0.952909 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
9ce4510bddc7f51a3ae368bf1ccd9285c641147e | 401,312 | py | Python | bnz.py | 5754515/ins | def2387551913fbf9c49554ffc42c95588fdf568 | [
"Apache-2.0"
] | null | null | null | bnz.py | 5754515/ins | def2387551913fbf9c49554ffc42c95588fdf568 | [
"Apache-2.0"
] | null | null | null | bnz.py | 5754515/ins | def2387551913fbf9c49554ffc42c95588fdf568 | [
"Apache-2.0"
] | null | null | null | # Creator : Hassan Zaid Sultani
# Facebook : https://fb.com/Hassanzaidsultanii
# THE CHAMPION
import base64
exec(base64.b16decode('2320436F6D70696C6564204279203A2042696E79616D696E0A2320476974487562203A2068747470733A2F2F6769746875622E636F6D2F42696E79616D696E2D62696E6E690A2320596F7554756265204368616E6E656C203A20547269636B2050726F6F660A696D706F7274206D61727368616C0A65786563286D61727368616C2E6C6F6164732822635C7830305C7830305C7830305C7830305C7830305C7830305C7830305C7830305C7830335C7830305C7830305C783030405C7830305C7830305C78303073215C7830305C7830305C783030645C7830305C783030645C7830315C7830306C5C7830305C7830305A5C7830305C783030655C7830305C7830306A5C7830315C783030645C7830325C7830305C7838335C7830315C783030645C7830315C7830305C78303455645C7830315C78303053285C7830335C7830305C7830305C783030695C7866665C7866665C7866665C7866664E73375C7830325C7830305C783030785C7839635C786264555C7862626E5C786462305C7831345C7839645C7863625C7861665C786238555C7830375C78643968245C78623941275C7830335C7831655C7839635C786336485C7830375C786137366C5C783037415C786137425C783932695C7838395C786138445C6E245C7830355C7863375C7866645C7838325C7838655C783131505C786135405C7830315C7861665C786639305C7837664928594F5C7864626D5C7864325C7830655C7862645C7865305C7863305C786662385C7866375C7831635224455C7863325C783838715C744C5C7839635C7838395C783864385C783933245C7863345C7838385C74535C786364255C7830653B5C7839615C783162605C7839626B5D5C7838345C786465645C7830365C7865335C7863395C7864355C7830345C786632295C6E5C7839385C7863375C7830365C7862615C786165235C7830345C7862395C7865645C7831655C7862655C7831665C7838655C7866345C7865375C7865652139725C7839335C7838367B5C7830345C7863395C7865335C7865645C7838655C7865395C7838665D7A5C7839665C7838645C7839335C78623026475C7839615C7865635C7864325F45715C7830625C7839325C786663415C7865335C7865332E5C7864645C7864364C5C7830375C783163695C7864325C7839325C5C5C7861625C7864395C7862654C5C7865335E515C7838622F395C783834645C7866345C7831395C7863375C783062355C786536725C7839614D5C7864335C7865345C7830345C7861345C7862355C7838665C7830375C78396252715C7831365C7866315C7838325C786266515C7862363D5C7830355C786139575C7838645C7838635C78626636754
A5C7838365C7862315C78663431575C7866303E5C5C5C7831305C7862615C78623143425C7831315C5C5C7831315C78663931765C7866325C7861302F65245C7866615C7839365C7865355C7831315C7865395C7863375C7838655C7865395C7862325C786430725C7838613A434D28415C7866305C7839395C7863355C7838625C7864385C786331595C7866355C7838325C7831335C7866372B4C39632B5C7830345C7831375C786561305C7838615C7838385C786339465C7831375C7839395C7865355C7861332C6D3A45366B5C7838395C786665453B3C635C7866392D5C7866385C786330385C7863375C7861655C7862635C7831315C783938533B5C786334305C7830305C7861645C7839345C7861665C7839355C7863395C7861392D5C7863345C7839615C786631655C7839655C7861345C786466345C7861346E5C7831315C7838625C7839345C7861374B5C783165635C7831645C7861647D5C783132605C7865385C7865635C783833655C7862345C7864625C7863665C783035445C783963505C745C7864395C7861645C7863625C7864645C7862385C7865365C7865315C7866365C7866615C7830625C786131512C3B5C7831612C265C783933315C7864635C786363475C7862334F5C7863335C786562515C7831665C7864345C7863645C7863645C7838615C7863395C6E3A35605C7830305C7830375A5C7830625C78383657512D5C7865665C7861385C78653774385C7839665C7864664E665C783937555C7863665C7862326F5C725C7861615C7866615C7839365C7863622C5C7866615C7839365C7862365F5C7838315C783036635C78653679785C745C7838345C7838325C7838385D5C7831375C7830625C7862315C7838615C783833605C7830335C786236505C7862395C7862375C7864355C7863325A5C7864305C7865635932455C783830715C786434795C7864376D655C7863615C725C5C5C7864395C783831503B585C7838365C7862315C7866324E5C7862335C78646672463D5C7861385C7862655C7863346B5C78616455553F7D5C7866615C7864645C78643233585C7838345C78653973475C7863613A5C7865665C7839645C7866375C7861635C7864657B4B625C7831655C786336775C7838365C6E5C783835365D5C6E635C7863355C786238415C7861385C7839305C7862365C7863375C7865645C7864305C786634655C7831385C7865385C7864645C7864665C783930555C7865665C7865635C7862315C7866365C7862365C7865655C7865617835745C786666375C7863645C7861375C7866655C7830625C7838645C783938635C7830625C7831665C7863635C7864396878793D325C7863335C7861355C7864367D5C7830326E5C7866657
0302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F7830303
22F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7
830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F78303
22F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F7
83030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F783
0302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382
F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F783
0302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F78303
02F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F78303237353537383530343435343533383530343632393235353133373732303230303935323031373535363436383435383731303536373437353136393
53034333839383832343733313437353138353031323038363136393538333030343231383432393133373838333939383934383937343136353537383132323030333333363034313739303335323733363334363130323833343138303835363134353737373437373837343232363132313438353835343238373337333939333135303133383631343934323837313133303835343134353030383433313130393431333137353836313437343334363034383333363434373231303433303738333833373036373439353235373232393932353131363834303539313932383730333738323132333536323537313737373539333535323835353336333331303235343637353439393839303036323236313832303330363039333436333238373930343631363931303438393430343031303235333735323135373639343439333936363831363731323137393630353630323836363731333631343132393733323237383537323634313339363133393439333230313434303535303332393831343631353134333834393430303937303338363335353331333635313036343032323434333739343632383231333937343936383434383330303037363530303836343836353136343937333937313730333036323831383031323839373739323935363936383730353031353939343738303134373233333235343435343132323938353134393533303430373734343532343332343039373739303132383438353531303738313731353230333136313834323232343132363234343239343136353035373530323735393835363931333632313235373136313032303538363834343634383933323134353734303438363839393532313138333738353930383832393939343536343237323730303636373133373730363935353033393439303837363530353338323230333334333134353437303036383730343135313139333230323034363537333534383139323037353130363031373734313539313837383132343031303434363435303638303130353433343732323430383137333731373632343138303739303332373639313031383332333331353532383731343133363233393439363630323539303035393436313339353735353537373035333138383837333739353839303739343637393636373031333432313539353737363632303839333536333034323839303932303939343233363233363339353331353233373935363533333636373437323339373236353438333631363634313432373638323230333538313733393930303635383335303133343932383737393834303936323934313135333733343937353236333731313339333
43137353239393431333033303938393939363332393536323730333734363237333934323233373936323033363738363637363735313239323731313932353130303231323138393939373033373539393139383237333135313533303538333032353437373131303032383136393630313232323835353931333237303434353131383736393631383237313537393234353837323230323337373236323639363830363635333832363234303334343138323135373834323532323734353033373534353538393737393335383934333838333532303834323934343030333532303031383836363730323331383931353930313634353636333735353431373139373632303138343439363930323039343739343830343038393739353231383231323032313836353939333735393037303832333439393436343735333635323937343836383938333339343135303833383130333632323031323536303630343938393234333235303834353434393031303036383631353239383430303437323538363134333030313130373833393030333038393938353233343139343234333135353537373937383536343733313437333930303638323231323134393337353737383231313537343838333733353133383530393331333932353338353939313031373038323237393235373832323336343237383131373834323232353538383032323034353938343337363137373838333231363936333439313432353330393839373634333135363133333639333231323035383235383033303430383239343637333536343333313837393734363538313436343730323138363734393237313532393732323838333337313335393237313939393035333334363436343434383039363037333233343731313539383631323238353739363930323231363830353439373135353936353934353039393137353039383433303439343139393836383130383938313133313335343538353530303330383533343235373937353033383433313536363231343034313038393234303935373334353039353233333334303837323130333732373439393139393934373233353137383136393334373534303132303734333032323936343637353339333930383738313239373132363831343837333834393333333039343835323034313333353239333531353934313831393430303638373233373432313233373533373736373235313132333835383837393336313239393235373435373435373139343931323731373938323935313133313630323738333638393131373133393334373936303139313835373335333134333635343436343836333534393836323
9323535303231343930363631303734303938333131353635323039353433353739343835393137363136323134363737383830333031343335313132363238383131343633333538313234373739383736323833373430363833383931303431303939313439393033353437383638313633373439363336343634383733363535323832323132363936363233363937313032343030393838353533363431333934373839363234323530353833353139353031393539313830393732313233383836303538303435383536313737343039323632373034353033363939363939353036353832323833333137393133393234353537383934313136363430303530393632383036363432353835303437333930303136363634393436303035343032313639343730303432383239343734333739373731313833393133373231333730333134393332333834323239363436303432343437383539383134393537343139393533333938353839383836363235333836373832363430363835383131343830373831383235373530323837373631363437393032333236363133313932313239383032353839303939343238373138313038393634363130313130303530313036363930303133323131383930353237373138333634313136313839373337313036383637383434373832343931383137333434333830383931303033333836303634373535363630313731363634383237343437353332303830353832363236313036323039323034383830393431303432333737303639393131303437333438323538333137343937323134363034393038383838303932303439303639343239333233393834303235353435323336353132333931393030383130303839303531383431353337383331323731353330343932363931373337353035313834393630323039383637363239373139353830363931383834393734313030363330363336313631353936363735363235313733333530303431303437343937313936363638303438353331383839323638383830313138343632353633393735313635343838323139363334323830353632343235363931343836333835363038383636343132343637373234303434303238303139313537383230333337363234313834353935333438303331363632353632333333333834333738313538323035373636353639393238353934343439323239353332363531393332373536333938303836343530313539333036353735363133353832303032363830343039333936363334303630313635363336353331313434393237335C7831625B31313B31326D3639393433373435303135313231313136323835303839343
43735373831353339343439393132343530333037313138333136393136313532333130393538303235313739333533343132323937383536313833373933333138353834343134373233373134303634313637313434303934373833363236303036323933323036373432323539333639363734333939333135323737303234323730363438343830313733323836313737343537383834373538343639383736323531373232333539323831313233323137363139303834363537363530353930393439323435323431363138363634303836393030393737353237303936333833343132363335393038333934303731343432323132313732343633343034323032303736373931373532353830303232343030343732393932313639323135373733323839383730313336313939303637393636383135393039353839323937323935343235323834343437323530313830373532343430353832333330353133373732323537333239373237383430363739363139333336393339353933343233343637323338363339393638373937383839333437383033373838373731313435383130363032313733313730363136333237363139393730323738363035323637313933353030393234393436393232383139313239323935383936353034383131303134313431323235343235333835313238373833353737323431383232383530363736363334363535303930343832313538333537363238383430393633323133363830313130303832353631353834393430393039383738393137303538333039393132373836353136383434333833303734363431373437333935383039333638313430323737383730373036373635313532333135353035333730383031353433353936303338333639323936343135313236393131373830363938303537313433323531313134373830343934343432393439343936373932343238333532373231313934393737343636343331393531363734323733373732383936343835353138363239313233353539363232343534363439333839313939353934303330343735333238363731393936383038333936343435363830373633373932323833323438303938373534323934363332353730343432353737353137373037323937353833373033353135383230383938323438343134303437393237393533353134313131303535343736353838373737303139323739343738383437323432363131323939343637383237353632393230383332313135363036373237353138363030393231313738353533313532333636343734353338343437383239323635353039323236303733353838303631323238343039333
53436343832383938363731363638333831373632363138333430323032323835363830393338373037363030373537393630313130363636333136353038383532383332383034373530393936393038323532353730343634383938393537353832323339353439313738303832373835353834303133303131383036343835363534373031353835363934343132393134313238303435393739313837303431383837343731393436313238303737303335333432333532383834343134363733303631313436303731383835393230303733393636323736323037393931353434363930313937393235373634363033383433313132323130373038343839323431373834303631343635313138303633363537333130393732363531373935383338383936353335373931313831373030323937303135313836363732383234363539383733303731323638343931353537333934353738333336303430343137313738313635383734333833333836373639333334323132383736393631313130363031363832363835313933333339363932383635393430313135313033363133323333343832323134343333343933343536333534353032343933373637363631393930383931343433393836303032383039353836383134353830343838313138303731303030393837313333303331303633393835323739303636323932323438323633383239393534343736313032383237373931393931343830363535393034313034323835373932383130303631333838353435373736363737363834303839313538393433373339343532373137353835393231303134343537323139343233343331343537393932313239333133383033303334363835303932323137363533323837333339323138343833313231393137343938343437383038383233383339383233303237343737393034303931313035383031323136303131333536343130363935383531313636393435303936333330393034313132313831363639303534353136393534393730333335343136373737323330353933323030333035313634353335343536353031383135343230323733383531333530383739343834353432313135353732383431393736323731363539393730393832313531323434303239383036353438393638353030333331373933353239363939383434323737383537363837363036343030383831303533393639373938313731393839383939313535313136303030313136313539323831303839343434303233303930393635353033393032303636393831353239313137363935383437373139323732323335343337323230333533383538393538383032373
830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F78303
02F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7
830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F78303
0322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F7830303265786563286261736536342E6231366465636F6465283233323034333646364437303639364336353634323034323739323033413230343236393645373936313644363936453041323332303437363937343438373536323230334132303638373437343730373333413246324636373639373436383735363232453633364636443246343236393645373936313644363936453244363236393645364536393041323332303539364637353534373536323635323034333638363136453645363536433230334132303534373236393633364232303530373236463646363630413639364437303646373237343230364436313732373336383631364330413635373836353633323836443631373237333638363136433245364336463631363437333238323736333543373833303330354337383330333035433738333033303543373833303330354337383330333035433738333033303543373833303330354337383330333035433738333033333543373833303330354337383330333035433738333033303430354337383330333035433738333033303543373833303330373332313543373833303330354337383330333035433738333033303634354337383330333035433738333033303634354337383330333135433738333033303643354337383330333035433738333033303541354337383330333035433738333033303635354337383330333035433738333033303641354337383330333135433738333033303634354337383330333235433738333033303543373833383333354337383330333135433738333033303634354337383330333135433738333033303543373833303334353536343543373833303331354337383330333035333238354337383330333335433738333033303543373833303330354337383330333036393543373836363636354337383636363635433738363636363543373836363636344537333639354337383330363335433738333033303543373833303330373835433738333936333543373836343634354235433738363336343645354337383331363235433738363333393543373833313331334535433738333933333446354337383634333134423635333536343438354337383636333135375C7831625B31313B31326D35433738363636333543373833393331354
33738333833333543373836333331354337383633333235433738363433323541354337383632333636303543373836313636354337383631333435433738363433383332354337383339363335433738333836343232333035433738363633333543373836343333354337383331333435433738363333373543373833313631344534463543373836313336354337383338333732423543373836333632354337383631333735433738333033343543373836333338354337383330333335433738333033343543373836333338323635433738363133373230354337383339333735433738363436333543373836363332354337383330333235393630354337383330363637393543373833383331374435433645323335433738363633373543373836323633343235433738363136313543373836323631354337383631333735433738363533373543373833383636323335433738333833393543373836313334363433423530354337383635363435413543373833393633354337383635363535433738363136353543373836313635354337383631363535433738363136353543373836313635354337383636363135433738363136313543373833393631354337383331333836453743354337383634333135433738333936313543373836363333354337383631333036353341354535433738363336323543373836323636354336453543373836313337354337383633363335433738363536323543373833393336333732433636334235433738363436353543373836323339334535433738333036363543323735423543373836313333373235433738363433393543373833393339354337383636333932433543373833303338354337343543373836353333354337323745354337383633333535433738333136323543373836313331333335433738363133333543373235433738363436323543373833303338354337383631333937383543373833303338354337383330363335433738363336363636354337383632333335433738363333363543373836343334354337383635333035333543373836343337333135433738333136323543373833303331364435433738333833343543373836343333354337383338333035433738333136313338354337383632363435433738363633313543373833393336333335433738363136363731344534333543373836343636354337383635333035433738363236333331354337383330363635433543363435433738363233303543373833313338354337383632363237303238334335433738333933353543323735433738333033313543373833393632354337383339333135433738363433393543373
83634363335433732354337383331363433463630354337383331333635433738363533353543373833313633354337383631333633343744354337383633333635433543354337383331333235433738363136343737323235433738333033343543373833313634343334463339354337383330363335433738363136353543373833313635354337383339333535433734353033343543373833313334354337383634333035433738363436363543373836333634323935433738333036363739354337383339333935433738363236353543373836323333354337383631333835433738333136363543373833393332354337383330333335433738363433313543373836363634323435433738333033383538323035433738363633393543373833313338364635433738333833323543373836313635323135433738333936343535333535433738363436363543373836363331354337383632363235433738363333343543373836363331373836383543373836323338364533433531354337383631363232443543373836323330354337383633333935433738363236343543373833313333354337383636333335433738333936343543373836343333354337383636333435433738363136363630354335433241354337383631333832363334354337383635333534413543373833303635354337383636333335433738363233383532363235433738333836363739354337383331363535433738363233353543373836323330344232433543354332453646354337383330333035433738333133313543373836353631343935433738363533333543373833313331364335433738333933353543373833303333354337383635363133323543373836333333354337383631363535433738363333323232354337383632333533323543373836363633363937323543373833313631354337383634363137343632354337383633333035433738363136353543373833313335354635353543373833303333324235433738333836363730334433313543373836343634363232453543373833393632354337383330333735433543354337383633363535433738363436353543373836343335354337383632333535433738363436363543373836323334374235433738363236343543373836343333354337383633363532463736354337383632333636373541354337383636333936353543373836343332354337383635363334303543373836363333363935433738363433323543373836353633343235433738363633333444354337383634333235433738333136333432354337383636333335353543373836343332354337383331363334303543373
83636333333383639354337383636333635433738363133313543373833313339324435433738363533363532323335433738333933303442354337383338333135363543373836333334333235433738363136323543373836323335354337383330333535433738363633333543373435433738333936353538334433433733354337383633333235443441354337383636363436343541354337383631333836363543373836353331363035333543373833303633353633423335333935433738363433343239354337383331333836423337354337383634363235433738333136343235354337383633363635433738333036363543373833313633324632343343354337383632333435433738333133333731334533373543373836313632354337383635363636313536363935433738363333323543373833303332343235433738363533313543373836353334354337383633333837423532354332373543373836343631364633433543373235433738363636313441354337383633333235433738333833383543373836313331354337383633363435433738363533363631354337383636333333323730343235413543373836313335354337383632333536433543373836353636354337383633333435433738333936343543373836363333363935433738333133353342373137313543373836313335333635433738363236323543373836313330354337383331363535433738333936333543373836353335354337383338363337443437353335433738333836313543373833303337343535433738333136323543373833303635363636343242354337383339333835433738363333383239354337383633363435433738363633303544354337383631333835433738363633363543373836323635343135453543373833313633334433443543373833313332354337383632333333373543373836333631324533423637354337383632363135433738363133363639354337383635333535313434354337383631333634313543373836333338354337383633333832343543373833313331354337383634333932333638354337383633333135433738363236363332354337383636363534313543373836343332373035433738363536333543373836343632353135433738333033333543373833393636354337383331363235433738363133333633354337383339333235433738333136323543373833393336354435433738363236363435323637423543373836313334354337383339333136343738354337383338363335433738363233343335334135433738333133363744323034303543373836363330323535433738363333333542354
33738333833323436363632333541353536393543373833393330354337383331333132453543373836333334354337383633363132353232354337323332354337383633333335433738333033363543373833303635343435433738363136323441354337383634333636383538354337383331333232453543373836313634353634353543373833303331354337383634333137303739364236353241354337383339333733463543373836363633354337383636363435433738333136363543373836353334354337383636333133433543373833393633354337383634333235433738333033303434334432323543373836323632354337383338363537373635354337383633363335433738333136333446354337383636333433463735354337383633333236373733353335433738363633343446354337383633333335433738363433303543373836353337354337383338363635413543373836313634373335433237354337383339363335433738363336353543373836333634354337383631333635433738363333353636354337383338333834453543373833383332373535433738333036323543373833313635334334373443354337383636333835433738333933363543373836333634344635433738363533363236354337383633333535433734354332373543373833383331363335443543373833393330354337383635333335433738333833303543373836323331354337383338333935433738333133383543373836343631354337383338333535433738363533333543373836353331334535433738333036323533354337383632333234323634354337383636333135433738333933313543373836313333363934363543373836313333323837383543373836313634374435433738333933303542354337383330333835433738363636343543373836333332333435433738363136333543373833303632354337383632363435443543373836353336373335433738333036323543373836313331364433323737354435433738363636343543373836363334354337383631363336433444323936383543373836333331354337383633333035433738363533373543373836323331354337383633333532453543373836323338354337383635333835433738363633343543373836333335354337383338333736333543373836353333354337383634363635433738333933393543373836353331373835433738363633383239354337383331363335433738333036353543373833313334354337383330333535433738363633373543373833383636354337383636363435353334354337383634333136333442354337383331333
83543373836313339354337383635333535433738333933323430343535433738363633303733353336373345354337383636333535433738363136313541354337383635333235433738363233323543373235433738363133323631354337383633363335433738333933363441333335433738363536313543373836333634354337383631363233353532324534353430353937443445354337383631363635433738333033343541333535433738333036353543373833383635354337383633333435433237344135433738333836323543373836333334354337383339333536343330353635433738363433313543373836323337353334443244354337383634333935433738363433383543373833383337354337383632363635433738363636343543373833313333354337383338333335433738363533303543373836353330354337383339333035433738363236333339333837393436354337383636333635433738333136363543373836353636334435433738363433393344334137413543373833383635354337383636363435413343354337383330333333453631324235433738333833313731333937363343354337383337363635433738333136353536354337383632333535433738363433333543373836363631354337383331333933393543373833393633354337383633363634433543373833313631354337383632333435433738333036353543373836323635374534343234354337383635333735433738363133353543373836313634343735433738333933393543373836313331354337383331333937443441354337383633333636333738354337383632363136343543373833383331344435433738333133343233354337383631333035433738333833313543373833313338373835433738333833313543373833393632334635433237354337383330333735433738333136353339354337383330333634343543373836353332354337383339333435433738363236333331354337383339363335433738333933303334354337383636333133463543373836333331354337383330333835433738333133393543373836333339323033413543373833393331363935433738363133363543373435433738333136363543373836333332323635433738333133353735354337383636363535433738363533363543373833393336354337383635333133423543373836333634354337383338333936313531354337383331333333323543373833393330354337383636333035433738363133393543373833313339354337383330333535433738363333343543373836323337354235433738333033363338363435333543373
83339333835433738363633363242343335433738333136333543373836343336353835433738363433383531354337383635363635433738363633363543373833383336354337383633333335433738363636353543373836333635344537423543373836313337354337383632363635433738363433333543373833313339354337383636333435433738363636323546373635433738363636323543373836343634354337383636363537303543373836313636334435433738363533393643354337383632333735433732354337383633333335433738363133343543373836363336354337383633333435433738333136333543373836363334354337323543373836313632334233343543373833383336354337383632363435433738333136343641373735433738333836333645373735433738363433303333334235433738333936323543373833383330354133333233354337383634333433313543373836363337364437323543373836363632363235433738363636333543373833313634354337323338354337383631333435433738333033383543373836323634354337383632333334393543373836353331364335443543373836323634353235433738363136363341373635433738363136343545354337383634333937343543373833393339363535433738363233383534354337383631333735433738363436353543373836363338354337383636333535433738363136323444334635433738363436313336354337383330363535433738333833333644373035433738333136333236354337383635363235433738333036353543373836353333354337383339363235433738363533373543373836343334354337383631333335433738333033313543373836313334354337383634363233313543373836333337354337383331363335433738363333393543373836323633354337383632333134433543373836313331354337383331363332343732354337383635333735433543354337383635363634443543373836363631354337383636363435433738363636353634363735433738333033373543373836343334354337383635333934433243374236383543373833313338364436423742374235433738363433323543373833313636344435433738363636313543373836343634324535433738333936343543373833303633324136383543373833393633354337383636333734323543373833393636354337383631333635433738363333383435363832383543373836353633373332363434344236463538354337383331333335303543373833383632354337383635363535433738333833303437354337383633363
43433333035433738363133363334363135433738363536333536354337383339333535433738333033363543373836313339354335433543364537393543373833393332323334323543373836353330354337383636333735433738363133373543373833393335354337383632333435433738333936333543373836333631353935433738333836313543373836333333373235433738333133393543373836313337353535433738363433353543373836313331334335433645354337383633363535333543373833303333344335433738333736363543373833383631354337383632333832453543373833303636354337383339333635433738363133343543373833303332354337383330333535433738363436333543373836353339354337383633363635413543373836363634354337383338363435433738333936313543373833313334354337383633333635433738363433333533323035433738363636343543373836333635334535433738363436363432354337383635363435433738363533323543373836363338354337383632363535433738363236333543373836323633364335453543373833303331354337383636363535433738333033333232354337383338333835433738363333333543373836323335354337383631333635433738333033363634363735433738363233373543373836363335374135433738363536363630354337383631363635433738363436343739373137353730354337383634333436373543373833313330354337383636363435433738333936343546354337383632333235433738363333313543373836323635354337383632333335433738363636343543373836333634354337383631333535433738363133363543373833383334343635433738363236353543373833383636354337383338363633323538354337383631333034313544344535433738333933353231354337383633333034433543373833313335354337383331333435433738333936353736354337383339363435433738333836313330354337383634333536393543373833383335363235433738333033343543373833383634363735433738363636333543373836323633373235433738363133363543373836343338354337383634333335433738363436323743364335393643354337383330363534463530354337383331363335433738333133313543373833383333323435433738363233313446354337383632363535433738333833383537354337383633363636453543373833313330323935433645354337383636333235433738363133383543323735433738363136343439354135433738363233343543323
73543373836353333363935433738363336353543373833303331344135433738333136343543373836323632343535433738333933343543373836363636354337383331333035433738333833373543373833393333354337383633363235433738333833303431354337383631333035433738363533343543373836343336354337383330333835433738363336333543373836353631354337383632333236423232354337383633333935433738333836333543373836323337354337383636333135433738363336343543373833393333354337383633333335433738363433373543373836363332354337343543373836323631354337383331333035433738333833323543373836313334354337383338333934303543373836313333373335433738333933373543373833393339354337383338333634423432354337383636333334363543373833313338343235433738333133343641363235433738333133353543373833383337354337383633333235353543373836313634373635433738333133343543373836343335363935433734354337383633333235433738333936333543373833383338354337383636333437433543373836353330374436373543373836323338354337383338363535433738333136343639354337383631363533343543373833393336354337383339363135433738363333363543373833313631343635433738333836323543373836323331354337383636333035323543373833383636344233333543373833303330354337383339333035433738363136313543373833313336354332373543373833383634354337383633333035433738363633303543373836313337354337383636333934303543373836333636343635433738363233363536354337383330363634443543373833313334364132343531354337383633333135433738363136323230354337383632333135433738333133393543373836343332373736313543373836313332354337383631333835433738333833323543373836333631354337383632333435433738363133363543373836323337354337383331363335433738363533393335344134373543373833303332354337383330363236413543373836333339363635433738363136353543373833383338344335433738363136363735354337383633333835433738363333303543373836333634343235433738333136313738333435433738333833343543373836323332353035433738363233313543373836313339354337383339333535433738363333343543364533323232343035433738333936353233354337383634363135433738363433323543373833393337354
33738363233333330354337383339363535433738333836313339373235433738363633303335333933343636354337383331333435433738363133303543373836323335344535433738333836333533354337383633363435433738333833333637354337383635363432433543373836333337354337383636333335433738333136313544344135433738363533303734354337383331363435383543373833313633334236313543373833383338354337383636333935433738363636313543373836353634354337383339363636423542354337383634363134323543373836323337373635433738363436313339323337423543373833303331323433453543373836363332354337383330333633323337354337383634333935433738333036363543373833313633354337383635363135433738363433393543373836353334354337383338333535433738363333333433324435433738363236343438353737313543373836353634363335433738333833313744334333373544354337383633333732323543373833393331363133333543373833383633354337383632363435433738333036333543373836353333354337383632363535433738363533333543373836343332354337383633363335433738363633303336323835433738333836323543373836363337354337383331333035433738363233323543373836363632354337383631363235433738333033333732354337383330333235433738363433373543373833383334354337383633363336383543373836363636354337383330363335433738333036323543373836363334354337383634333035433738363233303432373035413543373833303332373635433738363436323543373833383636354335433232354337383633333333353543373833313330354337383338333935433738333833353543373833313633354337383633363435433738363233333242354337383632333735433738363336363543373836333338354337383331333335433738363233303543373836353339354337383635333235433738333136353543373836363333354235433738333936363543373833393339354337383631363135433738363333363543373833313334354337383632333134323543373833313635354337383635363635433738333936343543373833313633354337383331363335433738333136343543373836333631354337383339333035433738333933313543373833303331323334413543373836333633354337383339333233393442363734343543373836363332354337383635333135433738363136363543373833373636354337383636363335433738363
536363543373833303636334635433738363636353543373836363334354337383631363635433738333936363543373836363635354337383636363435433738363533313543373836363632354337383331363635433738363533313543373836333636354635433738363636353543373836363334354337383635333135433738363636323346354337383636363335433738363533373543373833383337354337383634363636333543373836363331354337383330333035433738363333323543373833303331373335433738363336333543373833313339354337383634333135433738363133313543373836333330344335433738363636323543373836333333324235433738363536315C7831625B31313B31326D35433738333833323543373833303632354337383338333035433738363436323543373836333339333036313745354337383633363135433738333136343543373833383334354337383331363535433738333833303542354337383636333135433738363336333543373833383635373637443632354337383330363635363543373435433738333936343543373836343336343435433738333136333233324635433738333036353543373836313331354337383636333735433738363533393543373833313330354336453534354337383330333035433738333833393543373836343332354337383339363434423543373833393330354337383634333433303543373836343636363835433738333033323543373833313636354337383631333132453638354337383331363133453630354337383330333435433543354337383338333836323543373836343337344135433738363533393543373836343633354337383632363434313637354337383633333735433738333036353543373836363333333535433738363333353543373836363331334335343543373836343635354337383331333333393644344237433436324535453543373836323632353635363543373833383334323135433738363133353543373836323337354337383635363336323739344236383735353035433738363133313543373836313635354337383331333535433738333936323543373836313331354337383633333435433738363636333543373836323334323134303730364335433738333836313543373836313334354337383338333235313543373836343338354337383331333135433738363333373230354337383636333335333331353835433738333833343341343935433734354337383330333635433738333933313543373836353632343133443342354337383330333135433738333033313543364532303442363935433
833333339333133393336333133343332333033373330333833363339333533343336333433333338333333393331333233393337333333353335333933303331333233393330333033373333333533373334333033383335333733383339333533313331333333353333333233313339333133343336333333343333333433333333333333303330333033323331333233383331333633313336333333353331333633373334333633333335333933353331333733323335333333383331333233393337333933373336333433393335333033313338333733353330333533303335333333383336333233333332333633363335333133383333333633313334333933373335333433313337333733363337333833393339333433353336333833373333333433363336333233363333333933363335333933353333333433353336333633353339333333333332333633323331333833333339333933363335333133363331333033333339333533353331333933373331333633373339333333353339333633383336333333353337333433343332333633313338333733393332333033393337333533333331333833323335333933343336333433373334333233393331333033343338333633303334333133393336333433363332333633393339333533393333333533393332333333363333333333323330333233303338333233313332333233323335333933343334333033315C7831625B31313B31326D33303333333133323332333533343337333933343332333133373337333233303332333533383331333833363332333533363335333233323331333533303332333333353338333133363332333333393330333533393333333433363333333033383337333733373334333533333334333633393335333933303336333933323339333033333339333433323330333433393334333033343331333833393331333933343333333633313331333333323330333533333334333433353334333133313335333633303331333433303334333733373331333233343334333433343337333833393338333833383332333233383338333633373334333033303339333533363338333233363333333033303337333633363331333133313331333833353333333733333336333733393338333533363337333633353333333733333333333733383336333733333332333433353339333233393331333433313336333233303330333433363338333533383333333733323332333533373332333333313331333633343333333833353339333233383334333933373335333833363338333933323339333333303338333933363334333133333339333233343339333933383
33033353338333533393330333833353339333033313332333133383331333833313337333433303337333333363338333133363336333333313332333533363330333433323339333933363337333933303330333333333335333333363336333233373339333433313338333133333336333133393333333833343339333833363339333733373338333033323330333133363337333933393334333533323337333333373336333433393339333733363332333133343336333033303333333933373339333333313333333333343338333433353332333133333338333033383333333533393337333233393334333533303330333633303334333433313334333933373333333733323335333833313333333533373336333633313336333533393330333433333336333333313330333533393334333033363333333433313335333333303334333733393336333833353330333233383332333033343335333633313339333833323333333733393334333133313335333533303337333633353330333133333334333833323339333133393336333233303337333733363335333633393337333133353338333433303337333733393331333233333337333533333330333133343331333633383331333633303335333333313336333233373338333133353334333933323336333133343335333133303339333833383337333633383336333533323334333233333332333633333333333033393334333133303339333833343338333633313335333433343336333933303332333233343336333933323333333833383331333533343335333433333339333433323333333733393335333633363331333733343330333233323333333133383337333133383330333833393335333733383333333833373332333333393338333633313330333033333335333533333337333033313333333533313330333833393336333133303336333333373333333133313334333933373336333533343336333133333337333333303333333833313337333033323333333033323336333533393334333433383334333033383338333333313331333333383337333833353336333133303330333233323334333733313332333433343338333333353335333233363333333633333332333033333331333333393338333433393336333633393335333433343334333033323334333733313339333233343337333633333336333433313331333533323338333833353334333533343337333333343337333433323338333733303332333733363334333233343331333233303335333833313336333833343331333433393330333233383336333333303337333733303336333733343
330333233343335333733323337333133373337333233343334333733393336333233343336333333343330333233333337333633353332333733373338333433303335333633303333333633383334333633302F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7
830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F783
0302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830303735353738353034343534353338353034363239323535313337373230323030393532303137353536343638343538373130353637343735313639353034333839383832343733313437353138353031323038363136393538333030343231383432393133373838333939383934383937343136353537383132323030333333363034313739303335323733363334363130323833343138303835363134353737373437373837343232363132313438353835343238373337333939333135303133383631343934323837313133303835343134353030383433313130393431333137353836313437343334363034383333363434373231303433303738333833373036373439353235373232393932353131363834303539313932383730333738323132333536323537313737373539333535323835353336333331303235343637353439393839303036323236313832303330363039333436333238373930343631363931303438393430343031303235333735323135373639343439333936363831363731323137393630353630323836363731333631343132393733323237383537323634313339363133393439333230313434303535303332393831343631353134333834393430303937303338363335353331333635313036343032323434333739343632383231333937343936383434383330303037363530303836343836353136343937333937313730333036323831383031323839373739323935363936383730353031353939343738303134373233333235343435343132323938353134393533303430373734343532343332343039373739303132383438353531303738313731353230333136313834323232343132363234343239343136353035373530323735393835363931333632313235373136313032303538363834343634383933323134353734303438363839393532313138333738353930383832393939343536343237323730303636373133373730363935353033393439303837363530353338323230333334333134353437303036383730343135313139333230323034363537333534383139323037353130363031373734313539313837383132343031303434363435303638303130353433343732323430383137333731373632343138303739303332373639313031383332333331353532383731343133363233393439363630323539303035393436313339353735353537373035333138383837333739353839303739343637393636373031333432313539353737363632303839333536333034323839303932303939343
23336323336333935333135323337393536353333363637343732333937323635343833363136363431343237363832323033353831373339393030363538333530313334393238373739383430393632393431313533373334393735323633373131333933343137353239393431333033303938393939363332393536323730333734363237333934323233373936323033363738363637363735313239323731313932353130303231323138393939373033373539393139383237333135313533303538333032353437373131303032383136393630313232323835353931333237303434353131383736393631383237313537393234353837323230323337373236323639363830363635333832363234303334343138323135373834323532323734353033373534353538393737393335383934333838333532303834323934343030333532303031383836363730323331383931353930313634353636333735353431373139373632303138343439363930323039343739343830343038393739353231383231323032313836353939333735393037303832333439393436343735333635323937343836383938333339343135303833383130333632323031323536303630343938393234333235303834353434393031303036383631353239383430303437323538363134333030313130373833393030333038393938353233343139343234333135353537373937383536343733313437333930303638323231323134393337353737383231313537343838333733353133383530393331333932353338353939313031373038323237393235373832323336343237383131373834323232353538383032323034353938343337363137373838333231363936333439313432353330393839373634333135363133333639333231323035383235383033303430383239343637333536343333313837393734363538313436343730323138363734393237313532393732323838333337313335393237313939393035333334363436343434383039363037333233343731313539383631323238353739363930323231363830353439373135353936353934353039393137353039383433303439343139393836383130383938313133313335343538353530303330383533343235373937353033383433313536363231343034313038393234303935373334353039353233333334303837323130333732373439393139393934373233353137383136393334373534303132303734333032323936343637353339333930383738313239373132363831343837333834393333333039343835323034313333353239333531353934313831393430303638373233373432313
23337353337373637323531313233383538383739333631323939323537343537343537313934393132373137393832393531313331363032373833363839313137313339333437393630313931383537333533313433363534343634383633353439383632393235353032313439303636313037343039383331313536353230393534333537393438353931373631363231343637373838303330313433353131323632383831313436333335383132343737393837363238333734303638333839313034313039393134393930333534373836383136333734393633363436343837333635353238323231323639363632333639373130323430303938383535333634313339343738393632343235303538333531393530313935393138303937323132333838363035383034353835363137373430393236323730343530333639393639393530363538323238333331373931333932343535373839343131363634303035303936323830363634323538353034373339303031363636343934363030353430323136393437303034323832393437343337393737313138333931333732313337303331343933323338343232393634363034323434373835393831343935373431393935333339383538393838363632353338363738323634303638353831313438303738313832353735303238373736313634373930323332363631333139323132393830323538393039393432383731383130383936343631303131303035303130363639303031333231313839303532373731383336343131363138393733373130363836373834343738323439313831373334343338303839313030333338363036343735353636303137313636343832373434373533323038303538323632363130363230393230343838303934313034323337373036393931313034373334383235383331373439373231343630343930383838383039323034393036393432393332333938343032353534353233363531323339313930303831303038393035313834313533373833313237313533303439323639313733373530353138343936303230393836373632393731393538303639313838343937343130303633303633363136313539363637353632353137333335303034313034373439373139363636383034383533313838393236383838303131383436323536333937353136353438383231393633343238303536323432353639313438363338353630383836363431323436373732343034343032383031393135373832303333373632343138343539353334383033313636323536323333333338343337383135383230353736363536393932383539343434393232393533323
63531393332373536333938303836343530313539333036353735363133353832303032363830343039333936363334303630313635363336353331313434393237333639393433373435303135313231313136323835303839343437353738313533393434393931323435303330373131383331363931363135323331303935383032353137393335333431323239373835363138333739333331383538343431343732333731343036343136373134343039343738333632363030363239333230363734323235393336393637343339393331353237373032343237303634383438303137333238363137373435373838343735383436393837363235313732323335393238313132333231373631393038343635373635303539303934393234353234313631383636343038363930303937373532373039363338333431323633353930383339343037313434323231323137323436333430343230323037363739313735323538303032323430303437323939323136393231353737333238393837303133363139393036373936363831353930393538393239373239353432353238343434373235303138303735323434303538323333303531333737323235373332393732373834303637393631393333363933393539333432333436373233383633393936383739373838393334373830333738383737313134353831303630323137333137303631363332373631393937303237383630353236373139333530303932343934363932323831393132393239353839363530343831313031343134313232353432353338353132383738333537373234313832323835303637363633343635353039303438323135383335373632383834303936333231333638303131303038323536313538343934303930393837383931373035383330393931323738363531363834343338333037343634313734373339353830393336383134303237373837303730363736353135323331353530353337303830313534333539363033383336393239363431353132363931313738303639383035373134333235313131343738303439343434323934393439363739323432383335323732313139343937373436363433313935313637343237333737323839363438353531383632393132333535393632323435343634393338393139393539343033303437353332383637313939363830383339363434353638303736333739323238333234383039383735343239343633323537303434323537373531373730373239373538333730333531353832303839383234383431343034373932373935333531343131313035353437363538383737373031393237393437383834373
234323631313239393436373832373536323932303833323131353630363732373531383630303932313137383535333135323336363437343533383434373832393236353530393232363037333538383036313232383430393335343634383238393836373136363833383137363236313833343032303232383536383039333837303736303037353739363031313036363633313635303838353238333238303437353039393639303832353235373034363438393839353735383232333935343931373830383237383535383430313330313138303634383536353437303135383536393434313239313431323830343539373931383730343138383734373139343631323830373730333533343233353238383434313436373330363131343630373138383539323030373339363632373632303739393135343436393031393739323537363436303338343331313232313037303834383932343137383430363134363531313830363336353733313039373236353137393538333838393635333537393131383137303032393730313531383636373238323436353938373330373132363834393135353733393435373833333630343034313731373831363538373433383333383637363933333432313238373639363131313036303136383236383531393333333936393238363539343031313531303336313332333334383232313434335C7831625B31313B31326D33343933343536333534353032343933373637363631393930383931343433393836303032383039353836383134353830343838313138303731303030393837313333303331303633393835323739303636323932323438323633383239393534343736313032383237373931393931343830363535393034313034323835373932383130303631333838353435373736363737363834303839313538393433373339343532373137353835393231303134343537323139343233343331343537393932313239333133383033303334363835303932323137363533323837333339323138343833313231393137343938343437383038383233383339383233303237343737393034303931313035383031323136303131333536343130363935383531313636393435303936333330393034313132313831363639303534353136393534393730333335343136373737323330353933323030333035313634353335343536353031383135343230323733383531333530383739343834353432313135353732383431393736323731363539393730393832313531323434303239383036353438393638353030333331373933353239363939383434323737383537363837363036343030383831303
53339363937393831373139383938393931353531313630303031313631353932383130383934343430323330393039363535303339303230363639383135323931313736393538343737313932373232333534333732323033353338353839353838303237323232343834363735363731323039323530353432333735323437363831333137343539393935373635383437353935343038353530333132333637313332303838353137353930343333303830353632363434393738323038303139373438353938353531393634303230383638363533303633353032383136373439373933303331373437393236393330343037323639323638373432363434303436393639303430353232343332353233323936313836383532333037363539343536383337323436363037363031393931333137373638303337373630303538383432343136363436393936313036373433313633373138363830323239303539373634383739353735333534323535363232383536373831343335393034323330313631353634343737383639373237303436323931353634383633353730343833303439323531393331373639343635363839393434323339393332303831343633323535393531343736363434313832373439383535333433343231313335313132313030343434363439303330343133383934383334343238313231393236363032353732333137363435363735373034353330383435373638303630353738373935343932353039353335353732323030383732363235353838353235383532393732373734313438333430373337373731333833353438393830343037353638333037333336323339323633343531383233323934363232393130363134343038393532373333323938373831353530353335353431323334393430373735303534313538363831343935343835393035343435353535303038333032383431373835373233323735303539383036303836323333383133303535393134313432323336313433393731373639303130363730383238333638393133353535373130343033333834333436383739343334343834383434393239303233373930353330393732353839363335333836343933373238333739353030343439313034373930353336333732333935353839383835343931393730393937323538303737333737393638383937313339333137303232333938343237303635323838373134383434343931393933343934373138333336363336323935383633383838363233363334313535343534393133393730363834393730383832373036373336363432343439333030363337393139323239333235383537393432333
63437383431363434393935373330373938363932383432303534333433383433333838363532333330303030383137343331333630383632333939313239353737353134343039383635393231303133323738383736303035393034303136363335363232363633343437353339353938323531343434313538323032353337393538373435373432303032383131373939363731323032303738373531323337313031383033383330343738343931333932363836323632333635303931383837393230313831383239323936393432383332343338333736333132343534353534343331383131373136393433353030363138383634303435353636393634383433323132303536313330383438313233333836303632313735363539343034363536383335303535313837303831343337383338383039343531383838363330393232393036333534313934353636333133333235343835393131383130383937383537323831383431363433383230393938343436353337303233373632313330353834333136393238393130323339323231393832313531393633313033363033303730363033323535323230373335303339323532393133393936373739303038373137383932363838373835303335393934393538383635363139383037333433383637373639363030323434383635363133313534303431373232363732393739303531363537353932333132353833393739353034363234393131383237333830353233323938383731373238383033363438343239383334343339323337333130383134323232383930343135383231323538303734313739353330313133373136303030363235353232313936363936363137363936363833303433303638333130383936363134343535343538383833343938343236343030363334333239353132313535363638383834303637333633323138363638313030393837343432313238343833383637303738383533373639363035333833343739393830313135313030333538323833313938343736383439343339363732363134323639323533383236353732383736383633383033353935313531393339333436333735373438333635343935373838383937313939303039343930323539353738343234373133383832353433393535343039393839333834323037313832303236353232393932333438393336303837353936303631313035373234343431343830393133363938353032323833303731323237363038373635383636343234383930313338393434313438333931323532393536333631303033393433363535353434353139383236323332323137313636383436353937313035313
33630393631323132383034313630313239313530333638393738303031393939303133363330343336333736303934343937323033323633363837393633353434343836333435393034363331383330363931333035363038343533383330333436333736383837383333343538393339303935303032323737353538323835383739353830393138333330333732353231303236333238383238333236393836313937333535383537303533393331363235363536323532373338303935353934343138313338323037373930323431363639373336333534313633343035323635343732333334383438303539323736323538383136353133393737303131373536343934353334383631343238333231383538343930353839323831383233393030383134333931313037303930353431363935343632393739363334343632333830363337373334313733323938343430353935333232353535363030383533303536303233373135323838323132373333343436343035383533333131313432313734393235383232373137373938313331363739393134323536353736373830323038363935323230383433383233303633393438333437383832303733393635373734393838323839313136313632323039303332313235383330323432363532363739303332393836323633343335383933373636353433353135353332363536393437313332333633373833303538363332343839353938303534363738373933303936393530393636323431343635353033303233333533363636393533333437323534333634383238313032373638323432303432323032353436353836383830303430393430353134313539313934363634353635363234383138333930383537363434313734313836313835373637383834303834313032383238303930343738333432363731383434323334313236303136393134333037393238363831313239333135313330363239393734373033343238313437333430343933313633373436323032383635323035333334323739383235343130383131323138313739303935383532363130313438303439343434383633393535333638393230343139393838323033343339373334363535393931383333343330333339333432333736333535353638313533333938303633373132353738353935303837353235373435383136393739363832353133393435323139373035303339343337373934303531383839313530343333393733333837333633383934343434363038313436303032393634303232333133343836313635383734383637383537323038363636383338303533323734323130383435323632323430323
03038373436303833333031313038343430333032333739363036383037353732313237323736313838373332343135363337303134333437383035353234383331393432343235353738393337333633383036303033383733393136353333393038343232343338313733383338333939353539373031313534393830383636303033343531333537353634303230393133383331383438393934393931313239393431343335333533313737323331393836373735323437313932393536323731373730323133303133373532363736393437323539343239313933383834313331343038313532323630343035343135333430343935333537373930363133373036343433353131333832313936363532313335363630303838363838343037353738363338363233383633313837363430373932323835393536303131313334333230393832383336343239343034323837373234313031333735393532343434303933303030333635333836383138353635313239363432303335343231303931393631363437383730323830303336303335373338393037363738333737323936303136353432323734393336333839393536323032323930383035303839373536333634373231343437373631323532323133393037383734373239323537303438363739313430393734353532313338303238333536323137363930323131333736353933363337323633363536333438333836313637393639373539343337353737373038353237393439393834343934313730373738343435363434343939333633323839303039313136363234323137303835343236383133313830383037383435323539323131303432393636363238343834333939333230303736363237333630393739373338313434373234333432373237353537383834383630383331333732323930383432333533323330313930363636383835333434393336313032303036323132393139363939383332373836383638333534343231313839393533393536343432303934333538313836333435303536303736363831303330343232343234393237323139383939323935353735343837343233373136363732393933383137373433353632313632383939323338313436303037353233373735383335363531343730303837333837363039323336333730373237323732333937383038303935333334343430353232373632393031313234303737373632373933323434363936363434313330313534323434323738363333383637303631373832333231363739323534393834363930353431363637323634383937323638303831333830373738383533313839373635353133393130353
03330383939363433383538333430333735343433373839383836313538363733313830313239363230313330313131363537323438303631393933353332363431313337313531333632353731343131353838383836393035393631343037363937313833313031343433303533383337323234383432363131373135343435363739303434373639343331363630313734333238303932333236363430373635323930303932393033373833353937313239353736343836363435313134303037353736343830313734323137313931373335343536333936313630353132363436353939323334353531353937353234323634353633303538383734323531303436373636333438313938313430393338323634383037353137333931363130303934333736343739323433393132363936353334323933393837313831373038393537323130383534363331303931323239383435333037353231353530353834303437323732313336373837313131313838353031303631313934333730313933323134333438303636303735383637303735373138343739343537383931323734383439343231353239383431363431333837323537393831383731393232373635303231373436343239303433373938393131313633373838383037313337333431303234343530393933343232303238303633323034393738353933383733363630353138303031303034393231363635363830313138363838303032323831323237333632313039363437393232333036343538353138373631323131313930383335353131353135353039353137323634373032333539393136353531363835333839313234313038313033353434343538353536303732343435313132373632363837343338313739353332363737313130353336373431363934323735303133353231343334313132313739383633363330343937373539343637373635373836303033393537363639363739363038313235333138383337333732363332363831323636323431323933383531363339393330373537353739353433373132383538333933313531343035353938363932383636303633353230353632313931373436313538393735303236373335373730373231363037363432353334373238343135393536393637343139303934323333363630303233353931303538313630343834373133313737303432333039333031393334383837333331333937383433383431313537353332353336393835343237343135333532303936353638363630333630353631373232373035393036363531383536363731313239303234313136353931343739333435393631303837303530323031383
93639373938313731393839383939313535313136303030313136313539323831303839343434303233303930393635353033393032303636393831353239313137363935383437373139323732323335343337323230333533383538393538383032373232323438343637353637313230393235303534323337353234373638313331373435393939353736353834373539353430383535303331323336373133323038383531373539303433333038303536323634343937383230383031393734383539383535313936343032303836383635333036333530323831363734393739333033313734373932363933303430373236393236383734323634343034363936393034303532323433323532333239363138363835323330373635393435363833373234363630373630313939313331373736383033373736303035383834323431363634363939363130363734333136333731383638303232393035393736343837393537353335343235353632323835363738313433353930343233303136313536343437373836393732373034363239313536343836333537303438333034393235313933313736393436353638393934343233393933323038313436333235353935313437363634343138323734393835353334333432313133353131323130303434343634393033303431333839343833343432383132313932363630323537323331373634353637353730343533303834353736383036303537383739353439323530393533353537323230303837323632353538383532353835323937323737343134383334303733373737313338333534383938303430373536383330373333363233393236333435313832333239343632323931303631343430383935323733333239383738313535303533353534313233343934303737353035343135383638313439353438353930353434353535353030383330323834313738353732333237353035393830363038363233333831333035353931343134323233363134333937313736393031303637303832383336383931333535353731303430333338343334363837393433343438343834343932393032333739303533303937323538393633353338363439333732383337393530303434393130343739303533363337323339353538393838353439313937303939373235383037373337373936383839373133393331373032323339383432373036353238383731343834343439313939333439343731383333363633363239353836333838383632333633343135353435343931333937303638343937303838323730363733363634323434393330303633373931393232393332353835373934323336343
73834313634343939353733303739383639323834323035343334333834333338383635323333303030303831373433313336303836323339393132393537373531343430393836353932313031333237383837363030353930343031363633353632323636333434373533393539383235313434343135383230323533373935383734353734323030323831313739393637313230323037383735313233373130313830333833303437383439313339323638363236323336353039313838373932303138313832393239363934323833323433383337363331323435343535343433313831313731363934333530303631383836343034353536363936343834333231323035363133303834383132333338363036323137353635393430343635363833353035353138373038313433373833383830393435313838383633303932323930363335343139343536363331333332353438353931313831303839373835373238313834313634333832303939383434363533373032333736323133303538343331363932383931303233393232313938323135313936333130333630333037303630333235353232303733353033393235323931333939363737393030383731373839323638383738353033353939343935383836353631393830373334333836373736393630303234343836353631333135343034313732323637323937393035313635373539323331323538333937393530343632343931313832373338303532333239383837313732383830333634383432393833343433393233373331303831343232323839303431353832313235383037343137393533303131333731363030303632353532323139363639363631373639363638333034333036383331303839363631343435353435383838333439383432363430303633343332393531323135353636383838343036373336333231383636383130303938373434323132383438333836373037383835333736393630353338333437393938303131353130303335383238333139383437363834393433393637323631343236393235333832363537323837363836333830333539353135313933393334363337353734383336353439353738383839373139393030393439303235393537383432343731333838323534333935353430393938393338343230373138323032363532323939323334383933363038373539363036313130353732343434313438303931333639383530323238333037313232373630383736353836363432343839303133383934343134383339313235323935363336313030333934333635353534343531393832363233323231373136363834363539373130353133363
0393631323132383034313630313239313530333638393738303031393939303133363330343336333736303934343937323033323633363837393633353434343836333435393034363331383330363931333035363038343533383330333436333736383837383333343538393339303935303032323737353538323835383739353830393138333330333732353231303236333238383238333236393836313937333535383537303533393331363235363536323532373338303935353934343138313338323037373930323431363639373336333534313633343035323635343732333334383438303539323736323538383136353133393737303131373536343934353334383631343238333231383538343930353839323831383233393030383134333931313037303930353431363935343632393739363334343632333830363337373334313733323938343430353935333232353535363030383533303536303233373135323838323132373333343436343035383533333131313432313734393235383232373137373938313331363739393134323536353736373830323038363935323230383433383233303633393438333437383832303733393635373734395C7831625B31313B31326D3838323839313136313632323039303332313235383330323432363532363739303332393836323633343335383933373636353433353135353332363536393437313332333633373833303538363332343839353938303534363738373933303936393530393636323431343635353033303233333533363636393533333437323534333634383238313032373638323432303432323032353436353836383830303430393430353134313539313934363634353635363234383138333930383537363434313734313836313835373637383834303834313032383238303930343738333432363731383434323334313236303136393134333037393238363831313239333135313330363239393734373033343238313437333430343933313633373436323032383635323035333334323739383235343130383131323138313739303935383532363130313438303439343434383633393535333638393230343139393838323033343339373334363535393931383333343330333339333432333736333535353638313533333938303633373132353738353935303837353235373435383136393739363832353133393435323139373035303339343337373934303531383839313530343333393733333837333633383934343434363038313436303032393634303232333133343836313635383734383637383537323038363636383338303533323734323130383
43532363232343032303038373436303833333031313038343430333032333739363036383037353732313237323736313838373332343135363337303134333437383035353234383331393432343235353738393337333633383036303033383733393136353333393038343232343338313733383338333939353539373031313534393830383636303033343531333537353634303230393133383331383438393934393931313239393431343335333533313737323331393836373735323437313932393536323731373730323133303133373532363736393437323539343239313933383834313331343038313532323630343035343135333430343935333537373930363133373036343433353131333832313936363532313335363630303838363838343037353738363338363233383633313837363430373932323835393536303131313334333230393832383336343239343034323837373234313031333735393532343434303933303030333635333836383138353635313239363432303335343231303931393631363437383730323830303336303335373338393037363738333737323936303136353432323734393336333839393536323032323930383035303839373536333634373231343437373631323532323133393037383734373239323537303438363739313430393734353532313338303238333536323137363930323131333736353933363337323633363536333438333836313637393639373539343337353737373038353237393439393834343934313730373738343435363434343939333633323839303039313136363234323137303835343236383133313830383037383435323539323131303432393636363238343834333939333230303736363237333630393739373338313434373234333432373237353537383834383630383331333732323930383432333533323330313930363636383835333434393336313032303036323132393139363939383332373836383638333534343231313839393533393536343432303934333538313836333435303536303736363831303330343232343234393237323139383939323935353735343837343233373136363732393933383137373433353632313632383939323338313436303037353233373735383335363531343730303837333837363039323336333730373237323732333937383038303935333334343430353232373632393031313234303737373632373933323434363936363434313330313534323434323738363333383637303631373832333231363739323534393834363930353431363637323634383937323638303831333830373738383533313839373
63535313339313035303330383939363433383538333430333735343433373839383836313538363733313830313239363230313330313131363537323438303631393933353332363431313337313531333632353731343131353838383836393035393631343037363937313833313031343433303533383337323234383432363131373135343435363739303434373639343331363630313734333238303932333236363430373635323930303932393033373833353937313239353736343836363435313134303037353736343830313734323137313931373335343536333936313630353132363436353939323334353531353937353234323634353633303538383734323531303436373636333438313938313430393338323634383037353137333931363130303934333736343739323433393132363936353334323933393837313831373038393537323130383534363331303931323239383435333037353231353530353834303437323732313336373837313131313838353031303631313934333730313933323134333438303636303735383637303735373138343739343537383931323734383439343231353239383431363431333837323537393831383731393232373635303231373436343239303433373938393131313633373838383037313337333431303234343530393933343232303238303633323034393738353933383733363630353138303031303034393231363635363830313138363838303032323831323237333632313039363437393232333036343538353138373631323131313930383335353131353135353039353137323634373032333539393136353531363835333839313234313038313033353434343538353536303732343435313132373632363837343338313739353332363737313130353336373431363934323735303133353231343334313132313739383633363330343937373539343637373635373836303033393537363639363739363038313235333138383337333732363332363831323636323431323933383531363339393330373537353739353433373132383538333933313531343035353938363932383636303633353230353632313931373436313538393735303236373335373730373231363037363432353334373238343135393536393637343139303934323333363630303233353931303538313630343834373133313737303432333039333031393334383837333331333937383433383431313537353332353336393835343237343135333532303936353638363630333630353631373232373035393036363531383536363731313239303234313136353931343739333435393631303
8373035303230313834393139353733373933343638373236383634363232333534343434393434373637343033353130353538383835343739343736357830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F78303
02F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322
F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F5C7831625B31313B31326D7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F7830303
22F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7
830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F78303
22F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F7
83030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F783
0302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F37353537383530343435343533383530343632393235353133373732303230303935323031373535363436383435383731303536373437353136393530343338393838323437333134373531383530313230383631363935383330303432313834323931333738383339393839343839373431363535373831323230303333333630343137393033353237333633343631303238333431383038353631343537373734373738373432323631323134383538353432383733373339393331353031333836313439343238373131333038353431343530303834333131303934313331373538363134373433343630343833333634343732313034333037383338333730363734393532353732323939323531313638343035393139323837303337383231323335363235373137373735393335353238353533363333313032353436373534393938393030363232363138323033303630393334363332383739303436313639313034383934303430313032353337353231353736393434393339363638313637313231373936303536303238363637313336313431323937333232373835373236343133393631333934393332303134343
0353530333239383134363135313433383439343030393730333836333535333133363531303634303232343433373934363238323133393734393638343438333030303736353030383634383635313634393733393731373033303632383138303132383937373932393536393638373035303135393934373830313437323333323534343534313232393835313439353330343037373434353234333234303937373930313238343835353130373831373135323033313631383432323234313236323434323934313635303537353032373539383536393133363231323537313631303230353836383434363438393332313435373430343836383939353231313833373835393038383239393934353634323732373030363637313337373036393535303339343930383736353035333832323033333433313435343730305C7831625B31313B31326D3638373034313531313933323032303436353733353438313932303735313036303137373431353931383738313234303130343436343530363830313035343334373232343038313733373137363234313830373930333237363931303138333233333135353238373134313336323339343936363032353930303539343631333935373535353737303533313838383733373935383930373934363739363637303133343231353935373736363230383933353633303432383930393230393934323336323336333935333135323337393536353333363637343732333937323635343833363136363431343237363832323033353831373339393030363538333530313334393238373739383430393632393431313533373334393735323633373131333933343137353239393431333033303938393939363332393536323730333734363237333934323233373936323033363738363637363735313239323731313932353130303231323138393939373033373539393139383237333135313533303538333032353437373131303032383136393630313232323835353931333237303434353131383736393631383237313537393234353837323230323337373236323639363830363635333832363234303334343138323135373834323532323734353033373534353538393737393335383934333838333532303834323934343030333532303031383836363730323331383931353930313634353636333735353431373139373632303138343439363930323039343739343830343038393739353231383231323032313836353939333735393037303832333439393436343735333635323937343836383938333339343135303833383130333632323031323536303630343938393234333235303834353
22F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F78305C7831625B31313B31326D302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F783
0322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F78303032333933353337333133383330333533393334333933303336333833333332333833393333333433373336333533383331333333333331333533313339333533323337333733383331333033343332333133363335333233343331333133363337333433383331333033373330333333393330333033313333333833303334333933373331333933323336333933363339333933353332333433303331333933303334333633313332333933373337333233393337333333373334333933333338333733393338333833363332333833363331333233303335333033303338333833353337333933303336333133303332333133373334333933353338333933323339333433323339333533353338333533343336333933333336333633303336333933333335333833363335333533383336333133343331333033333339333233303338333333313336333233343335333633323339333633303332333533313332333333353339333033373333333033363335333333383330333033333335333833333339333233353334333433393337333333373334333533333338333533373332333533383332333533393335333633313330333033373330333233303332333433313330333533303330333133353335333933343338333133373333333233343332333533353338333833383334333633373336333433333339333733393335333333393338333033393330333933323331333733323335333133373333333733303335333833373330333433373331333833363339333033313334333333383337333933393337333433383337333833323337333633353337333033383334333133363337333633323334333233303336333433323330333433373330333233373335333633383337333833333338333033363336333633363330333133363330333433313337333333393338333133393332333433313339333833393334333933353332333133333338333233303333333433313331333533323334333733313337333233313337333633323335333333313330333033393332333633323332333233343330333633333333333833353331333433333336333633383333333633333338333933393337333633343335333733383331333433303339333833313332333233343333333433383332333
43336333433313331333833313333333133393337333333343337333933323335333333373334333533353330333133323331333933323333333633383330333933363338333133313335333533373339333933373334333433363339333033353333333833353333333733353338333933343334333633373338333733383333333533313335333533363333333133363338333533373338333333363332333833333338333433353331333233313334333133343339333033333335333533363332333733373334333233323336333833313333333033313330333833383332333933383337333933373331333433333331333633303334333833303331333633353339333833393330333233373336333033303338333933343332333633353338333933313337333633373335333733353332333133373338333133343333333533383334333033353330333833323336333133373335333733343332333333323335333733383330333233393339333933393334333833343336333533333330333933383338333033383337333333343333333533363332333933323338333833313335333433363336333533313333333633323334333933333333333333313337333833343337333133343333333833313336333233353338333933373337333833353339333333343335333533333334333033323334333233343335333633383337333633343331333233363336333933383332333333363339333133353333333133313336333933333339333333303336333733303333333433313335333333363330333233393330333233303330333333333331333433383338333733373331333133373331333033393331333333303333333433393333333233363331333733393334333433343337333233333334333933323338333333343338333933333336333033373330333833353332333333353332333033323336333133313333333733373334333033323337333933383331333333333338333133323339333633393336333533393338333233373336333433353336333633303330333633343338333533393339333233363339333733383337333933333331333133323339333833303332333433333331333733363336333033333335333233333336333333383339333933353339333533313339333133313330333933393332333333363332333833303334333033303335333533383334333133353332333033323332333533363334333033383333333833383334333033343333333833323332333733343334333033333339333833333337333133333338333133393331333633393333333333393334333233373337333633303333333033343331333133373334333
83333333133333332333233303331333333323335333733393330333033343330333833343339333633373337333533383330333433333338333933373339333533303339333533363338333833303339333533363333333833353330333633303333333333333332333433393338333633343338333033373336333233303332333433323333333333313337333733363337333533363334333633343330333933373332333333383336333533323337333533303331333233373331333233373337333933363331333133363330333833393330333733313332333933333334333533363339333433393332333533343332333433303338333933373335333133383331333333373334333033373338333833313330333833313331333233383334333333333337333133343339333333353337333333363334333433383333333533383333333933343337333333313339333333313332333833333335333133313333333933343333333733333339333733313333333733393333333633383339333933373331333733343332333733303336333133383335333233393330333433363334333833333331333433373332333633303331333833393337333033363332333433363335333633353333333333343335333433313330333833333339333833313333333233383331333133373338333033323333333533383335333133323339333633383338333533383335333533393336333233323333333733393332333633363334333333303336333233353332333633363332333933393331333133313331333233333330333033323331333933343336333033393333333933333330333833373338333333373336333133393330333133353330333933353332333933303337333033393336333633343334333133373330333433353336333933393333333233323331333933343334333133383336333533393331333233303331333533313335333033353336333933323330333133323337333033323339333933313337333233373336333333333336333033373338333433313335333133363338333833313336333933343336333133303336333033363335333033343331333833393332333133393330333033333335333833333337333633373335333133333336333533363333333933373336333133343339333633353330333233373332333633393330333333383333333033393331333533323339333733353331333833373332333633303331333033343339333933313334333333363336333233383338333933303330333633353339333233323338333733353334333133363330333133393332333333343331333333343339333733333335333333323339333
63331333833333331333833313339333633313339333833393334333133383330333333373331333333373331333133373330333033343335333533313336333933303334333733383331333333373330333433323332333933353333333333303337333033303338333833303330333233343331333333343332333833393335333133373330333433333336333333343335333233393339333433393335333633343333333533383335333433393339333033333332333233333332333833363331333433303336333933323331333833343333333233303331333333393338333733373337333633393335333733373333333033373339333733393330333733353331333833303337333633383337333233353336333133343331333733343330333633303337333533393334333033373331333333353333333633353330333133373335333233363332333233333335333333333337333033363336333333363332333633303330333133383338333133333336333333363333333133393334333933383337333633333334333933363331333533333335333833363334333433303337333033353337333233383333333733393332333433363332333033353334333933333334333033323335333833303339333333323337333133333334333033313335333033393331333433373330333433393335333133303330333833373336333633363335333233353337333433323336333033323332333833303333333033353332333133333332333333363338333033303334333333313339333033303337333333323331333933323333333233333336333533393333333833343330333333313333333233373336333833343338333333383338333733303338333933393332333033383337333833323334333233373333333533343338333933313333333833383330333333363331333033363332333733373333333733363334333233363330333133303339333333353332333033343337333833303335333733313337333833363332333033363337333833383331333233313333333333323337333433383336333533353339333533363338333933313331333233323333333133333333333933343330333433373333333533363336333533313338333333313335333933323338333233383338333033363333333933393333333033393332333633353335333533323335333333383338333233353339333533393338333533363336333333373338333733353331333733323335333733343337333833353339333233303334333333343331333733363334333133383338333733373339333433393332333433353337333133303338333133383337333533313339333
53334333233393334333933363335333133343338333233373331333333383335333433373333333333393330333633373334333933323339333933373338333033323333333033313335333233393337333233363330333333343336333033353334333833383335333133353335333333313332333933323338333133363337333233363333333733353331333033373336333833313330333633373332333033373339333033383332333533343339333233383331333233353338333933353331333533383338333133343331333033313338333333383337333633393336333133353337333933343339333533353339333833323337333833373331333633343334333133343333333533373333333133383337333133353332333633313335333933323334333233363334333433353337333633303330333133363333333633323336333833383337333533343332333933303332333133373338333633353339333433313337333433313333333833333330333333373331333633373336333133353333333633353333333033383333333133343333333133323339333233343331333933373332333233323335333333363330333733333333333433393334333833393336333433323334333133303337333433323330333933383330333433393337333333363333333433383336333033363334333633303339333633313339333033333335333833363336333233313330333833363333333133303337333233363339333833383338333833383333333133383331333233333334333833383332333733323334333733393335333933303331333633333335333833373337333533313338333333373337333833303331333733353334333933383333333133373331333433373334333233303336333833393338333833393331333633343335333133303333333933363332333733373331333533323337333433393335333433343330333133323332333133323336333633333334333033303337333533353339333733383337333433393330333833363333333433373333333633303339333433333333333133373332333133333335333633313330333633363334333633353331333333323334333233353331333233363334333933333339333433383339333233343332333633373338333233383331333433323338333333373337333133373331333433393330333733333331333933353336333733313334333433373330333633343331333433313335333233333332333233383336333833343331333633303330333333313339333533323337333033343339333333333339333033353337333133353338333733343339333033323337333033353332333
33331333533333334333033323335333033393333333833353336333733353331333433383331333033343337333033373339333033303332333333313333333233393331333433333335333133373331333933373334333433393338333033353339333033393330333133373336333433393339333833373337333833393339333933303337333633363336333433323332333033353332333133363336333533303332333933313333333233313332333733393331333833343336333133323335333133383332333533333336333533313339333233363333333733303333333833303331333833373336333133313334333933333331333333393334333933363337333733303339333233313334333033353330333233383333333733383335333533393336333033343331333133393337333133303330333133303339333633353336333833303332333133353334333133393335333533363339333833363331333833353337333733343337333233343336333733343338333933383334333433383335333733313338333333323337333833363331333233383331333933343338333233393333333733333338333933303332333933373339333833343338333433323332333833343339333833343338333333393338333333343336333433363334333533363336333233383331333633373334333033313336333133363336333733393336333333343336333033393333333033323339333733363334333233363334333033303334333233333336333233353337333033343339333533323338333733363335333533363339333133313339333433363330333933363335333033343332333833373338333333333332333933343336333833313332333833323338333633363335333633393339333433383337333633383334333933363337333833343339333333303336333833373333333933333333333633343336333133323333333633353336333633363330333833333337333133353339333333323338333733303332333833353333333933323339333433333334333733343339333433333339333533383333333333343337333333363339333433353337333133393339333433353339333333383339333333303332333433353336333933303335333433343330333233303338333733383335333033363333333033363337333733303336333533323333333833303339333033313331333133383332333233343338333733313334333833313339333333303339333133303336333733313339333333333336333733353330333033373333333933303338333633373331333233383338333433353338333933383335333533393338333133363330333
73335333533303339333933303330333033363335333033393338333333323337333133323338333533373332333733313334333833373335333233333333333433363334333733373335333533373333333933343339333333373339333933343337333133383332333533393337333033363330333433353337333233343335333633313333333533393331333233383338333133353336333833333336333633353337333833363331333533333331333433383339333833313332333233363338333833343331333833373337333033323330333233323338333633333333333133383330333133383332333233363336333833353338333633333331333833333338333833393334333433323331333133333338333933333333333333393337333933373339333033333333333933303332333133323335333833333336333233333334333733393338333833313332333733393332333633353338333033323336333933353337333333383334333233313331333433323331333233323338333833313330333233303338333333353337333833393333333933373335333833323336333133313338333233353338333233323336333033303331333733323338333733343339333133303334333133373330333833373336333533393334333833313339333933323332333833333339333833373333333733333334333533393334333133393333333133393336333233373338333433303334333033373335333333373331333733343339333333313337333333303336333833313339333733373336333033323339333133323330333433373333333233353339333633383332333233383332333533333330333633313330333233383333333133333333333633353338333133323338333033373331333033383333333433303339333933343331333633343331333533313335333133383331333733333333333933353337333333393336333833333336333833303337333333353339333433343335333433333338333133303335333133323336333233323336333233363330333133333332333733363335333333303339333833383334333333323334333433383338333633363339333633303333333433303333333233343337333933363331333533343333333433353332333933353335333933333331333333373332333433373334333933303339333033393336333933303335333633393336333833353333333733313332333233323338333933363330333033393338333733303337333733313331333333313330333433323330333733353330333433393334333733333333333633353332333433313338333833363331333933313339333833333332333
23332333933353337333333323334333733333339333433333339333733323335333833333339333233343337333433393336333133363334333733393339333833323332333433303335333533373335333633313331333133373332333433333339333433323330333833323339333833393338333533383336333533353332333633363338333333343335333733313336333833323336333333303337333233343337333133383337333033383335333833343337333033363335333833383336333333313335333033383337333933393332333133393330333933363330333733313337333633373337333833303330333233363333333733353338333633333332333433393334333233333337333733323338333033373331333833353337333233393330333333353330333433373335333533363332333933343338333933363332333333313337333933393339333433373333333333323331333433313337333133343333333033393333333033363339333733393336333433303333333333343332333433313330333633313332333533373337333933333337333533333339333633303335333533373339333233343330333333343332333533353337333833373338333233313331333233333332333233343336333133383330333733343334333833323331333633353331333233383334333933313330333333373334333833343335333633303330333433393332333333313337333333353332333533303332333233383333333233343334333133333335333033323337333933373336333533343335333233363339333033353331333433303330333233313332333233323338333133303333333433393331333633323333333333333338333033303339333733373330333433303330333233373334333033343335333833383335333733373339333333333338333933363337333933343337333133393339333033363331333433333334333833363334333633313339333633383336333433373336333233393335333633373331333933333332333233323339333833353339333133373337333733383339333033343338333233303336333733383338333633343331333033343336333333363335333133313333333533313334333233343336333533383332333233313337333633363330333833343332333933303333333433353339333033333333333633313330333233393331333333313339333433383336333933323332333933333337333433333333333833393330333533353333333033313330333333363331333333393333333533393334333733313339333733363331333833363337333533323338333933333334333733313330333
2333433343330333833323331333033323334333233323337333933343338333033323339333633393338333633393333333733353331333933343335333133313330333533333332333833393338333733333339333633313336333633383330333733323335333233363330333733393339333733393332333533313333333533383336333833353331333333393332333933353339333333343330333833353333333133373335333633343339333333363331333033343330333833373335333833393335333533393331333233343335333033343330333533303335333033353338333033333336333933383333333533373330333733363331333933323335333433353332333133363337333633303334333133383338333933393332333233373332333333303335333633343337333533333337333333373335333033313335333933303336333933303330333233343337333933373332333533333339333233373337333933393333333133373333333433383330333433383335333733363333333637343543373833303338354337383330333035433738333033303543373833303330334336443646363437353643363533453543373833303334354337383330333035433738333033303543373833303330373335433738333033323543373833303330354337383330333035433738333033303543373833303633354337383330333132373239323929292F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7
830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F78303
82F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7
830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F783
0302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302
830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F78303
82F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7
830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F78305C7831625B31313B31326D302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302
F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7
830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F78303
02F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7
830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F78303
0322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302
F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F783
0322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302
F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7
830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F78303
82F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7
830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F7830302F7830302F7830302F7830382F7830322F7830302F7830302F783030322F783
# hpe_benchmark/data/transforms/__init__.py — anotherTK/human-pose-estimation.pytorch (MIT)
# keypoints
from .transforms import get_affine_transform
from .transforms import affine_transform
from .transforms import flip_joints
from .transforms import flip_back
from .build import build_transforms
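The transforms re-exported above are typically applied per joint. As a rough illustration, here is a minimal, self-contained sketch of what an `affine_transform`-style helper usually does in pose-estimation codebases (apply a 2x3 affine matrix to a 2-D point); the actual signatures in `hpe_benchmark/data/transforms` may differ.

```python
# Hypothetical sketch: apply a 2x3 affine matrix to a 2-D joint coordinate.
# This mirrors the common shape of affine_transform helpers in pose pipelines;
# it is NOT the verified implementation from this repository.
def affine_transform_point(pt, trans):
    """Map (x, y) through a 2x3 affine matrix given as nested lists."""
    x, y = pt
    new_x = trans[0][0] * x + trans[0][1] * y + trans[0][2]
    new_y = trans[1][0] * x + trans[1][1] * y + trans[1][2]
    return (new_x, new_y)


# A pure translation by (+5, -2): the linear part is the identity.
translate = [[1.0, 0.0, 5.0],
             [0.0, 1.0, -2.0]]
```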
# tests/models/event/weighted_site/tests_frontier_methods.py — EderVs/Voronoi-Diagrams (MIT)
"""Test frontier methods in WeightedSite."""
# Math
from decimal import Decimal
# Models
from voronoi_diagrams.models import WeightedSite, Point
# General Utils
from general_utils.numbers import are_close
class TestGetFrontierPointingToPoint:
"""Test x and y formula of the frontier."""
def test_up_right(self):
"""Test in the up right corner."""
p = WeightedSite(Decimal(1), Decimal(1), Decimal(2).sqrt())
point = Point(Decimal(3), Decimal(3))
x = p.get_x_frontier_pointing_to_point(point)
y = p.get_y_frontier_pointing_to_point(point)
epsilon = Decimal(0.00001)
assert are_close(x, Decimal(2), epsilon)
assert are_close(y, Decimal(2), epsilon)
def test_up_left(self):
"""Test in the up left corner."""
p = WeightedSite(Decimal(1), Decimal(1), Decimal(2).sqrt())
point = Point(Decimal(-1), Decimal(3))
x = p.get_x_frontier_pointing_to_point(point)
y = p.get_y_frontier_pointing_to_point(point)
epsilon = Decimal(0.00001)
assert are_close(x, Decimal(0), epsilon)
assert are_close(y, Decimal(2), epsilon)
def test_down_right(self):
"""Test in the down right corner."""
p = WeightedSite(Decimal(1), Decimal(1), Decimal(2).sqrt())
point = Point(Decimal(3), Decimal(-1))
x = p.get_x_frontier_pointing_to_point(point)
y = p.get_y_frontier_pointing_to_point(point)
epsilon = Decimal(0.00001)
assert are_close(x, Decimal(2), epsilon)
assert are_close(y, Decimal(0), epsilon)
def test_down_left(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(1), Decimal(1), Decimal(2).sqrt())
point = Point(Decimal(-1), Decimal(-1))
x = p.get_x_frontier_pointing_to_point(point)
y = p.get_y_frontier_pointing_to_point(point)
epsilon = Decimal(0.00001)
assert are_close(x, Decimal(0), epsilon)
assert are_close(y, Decimal(0), epsilon)
def test_right(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(1), Decimal(1), Decimal(1))
point = Point(Decimal(3), Decimal(1))
x = p.get_x_frontier_pointing_to_point(point)
y = p.get_y_frontier_pointing_to_point(point)
epsilon = Decimal(0.00001)
assert are_close(x, Decimal(2), epsilon)
assert are_close(y, Decimal(1), epsilon)
def test_left(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(1), Decimal(1), Decimal(1))
point = Point(Decimal(-1), Decimal(1))
x = p.get_x_frontier_pointing_to_point(point)
y = p.get_y_frontier_pointing_to_point(point)
epsilon = Decimal(0.00001)
assert are_close(x, Decimal(0), epsilon)
assert are_close(y, Decimal(1), epsilon)
def test_up(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(1), Decimal(1), Decimal(1))
point = Point(Decimal(1), Decimal(3))
x = p.get_x_frontier_pointing_to_point(point)
y = p.get_y_frontier_pointing_to_point(point)
epsilon = Decimal(0.00001)
assert are_close(x, Decimal(1), epsilon)
assert are_close(y, Decimal(2), epsilon)
def test_down(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(1), Decimal(1), Decimal(1))
point = Point(Decimal(1), Decimal(-1))
x = p.get_x_frontier_pointing_to_point(point)
y = p.get_y_frontier_pointing_to_point(point)
epsilon = Decimal(0.00001)
assert are_close(x, Decimal(1), epsilon)
assert are_close(y, Decimal(0), epsilon)
class TestGetFrontier:
"""Test get frontier point given a coordinate."""
def test_right_middle(self):
"""Test in the up right corner."""
p = WeightedSite(Decimal(0), Decimal(0), Decimal(2).sqrt())
fx = Decimal(1)
fy = Decimal(1)
x1, x2 = p.get_x_frontier_formula(fy)
y1, y2 = p.get_y_frontier_formula(fx)
epsilon = Decimal(0.00001)
assert are_close(x1, fx, epsilon)
assert are_close(x2, -fx, epsilon)
assert are_close(y1, fy, epsilon)
assert are_close(y2, -fy, epsilon)
def test_left_middle(self):
"""Test in the up left corner."""
p = WeightedSite(Decimal(0), Decimal(0), Decimal(2).sqrt())
fx = Decimal(-1)
fy = Decimal(1)
x1, x2 = p.get_x_frontier_formula(fy)
y1, y2 = p.get_y_frontier_formula(fx)
epsilon = Decimal(0.00001)
assert are_close(x1, -fx, epsilon)
assert are_close(x2, fx, epsilon)
assert are_close(y1, fy, epsilon)
assert are_close(y2, -fy, epsilon)
def test_right_one(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(0), Decimal(0), Decimal(1))
fx = Decimal(1)
fy = Decimal(0)
x1, x2 = p.get_x_frontier_formula(fy)
y1, y2 = p.get_y_frontier_formula(fx)
epsilon = Decimal(0.00001)
assert are_close(x1, fx, epsilon)
assert are_close(x2, -fx, epsilon)
assert are_close(y1, fy, epsilon)
assert are_close(y2, -fy, epsilon)
def test_sides(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(0), Decimal(0), Decimal(1))
fx = Decimal(-1)
fy = Decimal(0)
x1, x2 = p.get_x_frontier_formula(fy)
y1, y2 = p.get_y_frontier_formula(fx)
epsilon = Decimal(0.00001)
assert are_close(x1, -fx, epsilon)
assert are_close(x2, fx, epsilon)
assert are_close(y1, fy, epsilon)
assert are_close(y2, fy, epsilon)
def test_middle(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(0), Decimal(0), Decimal(1))
fx = Decimal(0)
fy = Decimal(-1)
x1, x2 = p.get_x_frontier_formula(fy)
y1, y2 = p.get_y_frontier_formula(fx)
epsilon = Decimal(0.00001)
assert are_close(x1, fx, epsilon)
assert are_close(x2, fx, epsilon)
assert are_close(y1, -fy, epsilon)
assert are_close(y2, fy, epsilon)
def test_not_in_limit(self):
"""Test in the down left corner."""
p = WeightedSite(Decimal(0), Decimal(0), Decimal(1))
fx = Decimal(2)
fy = Decimal(2)
result_x = p.get_x_frontier_formula(fy)
result_y = p.get_y_frontier_formula(fx)
assert result_x is None
assert result_y is None
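Every assertion above goes through `are_close` from `general_utils.numbers`. Its exact implementation is not shown here; a minimal sketch consistent with how the tests use it (absolute difference within a tolerance) would be:

```python
from decimal import Decimal


# Hypothetical sketch of general_utils.numbers.are_close as assumed by the
# tests: true when |a - b| is within epsilon. The real helper may differ.
def are_close(a, b, epsilon):
    return abs(a - b) <= epsilon
```

For example, `are_close(Decimal(2).sqrt() ** 2, Decimal(2), Decimal("0.00001"))` holds even though `Decimal(2).sqrt() ** 2` is not exactly `2`, which is why the tests compare with an epsilon rather than `==`.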
# isi_sdk_7_2/isi_sdk_7_2/api/statistics_api.py — mohitjain97/isilon_sdk_python (Unlicense)
] | 29 | 2018-06-19T00:14:04.000Z | 2022-02-08T17:51:19.000Z | # coding: utf-8
"""
Isilon SDK
Isilon SDK - Language bindings for the OneFS API # noqa: E501
OpenAPI spec version: 2
Contact: sdk@isilon.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from isi_sdk_7_2.api_client import ApiClient
class StatisticsApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def get_statistics_current(self, **kwargs): # noqa: E501
"""get_statistics_current # noqa: E501
Retrieve stats. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_current(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int timeout: Time in seconds to wait for results from remote nodes.
:param bool degraded: If true, try to continue even if some stats are unavailable. In this case, errors will be present in the per-key returned data.
:param list[str] devid: Node devid to query. Either an <integer> or \"all\". Can be used more than one time to query multiple nodes. \"all\" queries all up nodes. 0 means query the local node. For \"cluster\" scoped keys, in any devid including 0 can be used.
:param list[str] key: Key names. Can be used more than one time to query multiple keys.
:param bool expand_clientid: If true, use name resolution to expand client addresses and other IDs.
:return: StatisticsCurrent
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_statistics_current_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_statistics_current_with_http_info(**kwargs) # noqa: E501
return data
def get_statistics_current_with_http_info(self, **kwargs): # noqa: E501
"""get_statistics_current # noqa: E501
Retrieve stats. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_current_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int timeout: Time in seconds to wait for results from remote nodes.
:param bool degraded: If true, try to continue even if some stats are unavailable. In this case, errors will be present in the per-key returned data.
:param list[str] devid: Node devid to query. Either an <integer> or \"all\". Can be used more than one time to query multiple nodes. \"all\" queries all up nodes. 0 means query the local node. For \"cluster\" scoped keys, in any devid including 0 can be used.
:param list[str] key: Key names. Can be used more than one time to query multiple keys.
:param bool expand_clientid: If true, use name resolution to expand client addresses and other IDs.
:return: StatisticsCurrent
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['timeout', 'degraded', 'devid', 'key', 'expand_clientid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_statistics_current" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'timeout' in params:
query_params.append(('timeout', params['timeout'])) # noqa: E501
if 'degraded' in params:
query_params.append(('degraded', params['degraded'])) # noqa: E501
if 'devid' in params:
query_params.append(('devid', params['devid'])) # noqa: E501
collection_formats['devid'] = 'csv' # noqa: E501
if 'key' in params:
query_params.append(('key', params['key'])) # noqa: E501
collection_formats['key'] = 'csv' # noqa: E501
if 'expand_clientid' in params:
query_params.append(('expand_clientid', params['expand_clientid'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/1/statistics/current', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StatisticsCurrent', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
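The `collection_formats['devid'] = 'csv'` entries above tell the generated `ApiClient` how to flatten list-valued query parameters before the request is sent. As a rough, self-contained sketch of that behavior (the real `ApiClient.parameters_to_tuples` logic handles more formats):

```python
# Hypothetical sketch of csv collection-format handling: list-valued query
# params marked 'csv' are joined with commas; everything else passes through.
# This illustrates the idea, not the exact swagger-codegen implementation.
def serialize_csv_params(query_params, collection_formats):
    out = []
    for name, value in query_params:
        if collection_formats.get(name) == "csv" and isinstance(value, (list, tuple)):
            out.append((name, ",".join(str(v) for v in value)))
        else:
            out.append((name, value))
    return out
```

So a call with `devid=["1", "all"]` would be sent as a single `devid=1,all` query parameter rather than repeated keys.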
def get_statistics_history(self, **kwargs): # noqa: E501
"""get_statistics_history # noqa: E501
Retrieve stats. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_history(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int begin: Earliest time (Unix epoch seconds) of interest. Negative times are interpreted as relative (before) now.
:param int interval: Minimum sampling interval time in seconds. If native statistics are higher resolution, data will be down-sampled.
:param int end: Latest time (Unix epoch seconds) of interest. Negative times are interpreted as relative (before) now. If not supplied, use now as the end time.
:param int timeout: Time in seconds to wait for results from remote nodes.
:param list[str] devid: Node devid to query. Either an <integer> or \"all\". Can be used more than one time to query multiple nodes. \"all\" queries all up nodes. 0 means query the local node. For \"cluster\" scoped keys, in any devid including 0 can be used.
:param bool memory_only: Only use statistics sources that reside in memory (faster, but with less retention).
:param list[str] key: Key names. Can be used more than one time to query multiple keys.
:param bool degraded: If true, try to continue even if some stats are unavailable. In this case, errors will be present in the per-key returned data.
:param bool expand_clientid: If true, use name resolution to expand client addresses and other IDs.
:return: StatisticsHistory
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_statistics_history_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_statistics_history_with_http_info(**kwargs) # noqa: E501
return data
def get_statistics_history_with_http_info(self, **kwargs): # noqa: E501
"""get_statistics_history # noqa: E501
Retrieve stats. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_history_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int begin: Earliest time (Unix epoch seconds) of interest. Negative times are interpreted as relative (before) now.
:param int interval: Minimum sampling interval time in seconds. If native statistics are higher resolution, data will be down-sampled.
:param int end: Latest time (Unix epoch seconds) of interest. Negative times are interpreted as relative (before) now. If not supplied, use now as the end time.
:param int timeout: Time in seconds to wait for results from remote nodes.
:param list[str] devid: Node devid to query. Either an <integer> or \"all\". Can be used more than one time to query multiple nodes. \"all\" queries all up nodes. 0 means query the local node. For \"cluster\" scoped keys, in any devid including 0 can be used.
:param bool memory_only: Only use statistics sources that reside in memory (faster, but with less retention).
:param list[str] key: Key names. Can be used more than one time to query multiple keys.
:param bool degraded: If true, try to continue even if some stats are unavailable. In this case, errors will be present in the per-key returned data.
:param bool expand_clientid: If true, use name resolution to expand client addresses and other IDs.
:return: StatisticsHistory
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['begin', 'interval', 'end', 'timeout', 'devid', 'memory_only', 'key', 'degraded', 'expand_clientid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_statistics_history" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'begin' in params:
query_params.append(('begin', params['begin'])) # noqa: E501
if 'interval' in params:
query_params.append(('interval', params['interval'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'timeout' in params:
query_params.append(('timeout', params['timeout'])) # noqa: E501
if 'devid' in params:
query_params.append(('devid', params['devid'])) # noqa: E501
collection_formats['devid'] = 'csv' # noqa: E501
if 'memory_only' in params:
query_params.append(('memory_only', params['memory_only'])) # noqa: E501
if 'key' in params:
query_params.append(('key', params['key'])) # noqa: E501
collection_formats['key'] = 'csv' # noqa: E501
if 'degraded' in params:
query_params.append(('degraded', params['degraded'])) # noqa: E501
if 'expand_clientid' in params:
query_params.append(('expand_clientid', params['expand_clientid'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/1/statistics/history', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StatisticsHistory', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_statistics_key(self, statistics_key_id, **kwargs): # noqa: E501
"""get_statistics_key # noqa: E501
List key meta-data. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_key(statistics_key_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str statistics_key_id: List key meta-data. (required)
:return: StatisticsKeys
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_statistics_key_with_http_info(statistics_key_id, **kwargs) # noqa: E501
else:
(data) = self.get_statistics_key_with_http_info(statistics_key_id, **kwargs) # noqa: E501
return data
def get_statistics_key_with_http_info(self, statistics_key_id, **kwargs): # noqa: E501
"""get_statistics_key # noqa: E501
List key meta-data. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_key_with_http_info(statistics_key_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str statistics_key_id: List key meta-data. (required)
:return: StatisticsKeys
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['statistics_key_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_statistics_key" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'statistics_key_id' is set
if ('statistics_key_id' not in params or
params['statistics_key_id'] is None):
raise ValueError("Missing the required parameter `statistics_key_id` when calling `get_statistics_key`") # noqa: E501
collection_formats = {}
path_params = {}
if 'statistics_key_id' in params:
path_params['StatisticsKeyId'] = params['statistics_key_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/1/statistics/keys/{StatisticsKeyId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StatisticsKeys', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_statistics_keys(self, **kwargs): # noqa: E501
"""get_statistics_keys # noqa: E501
List meta-data for matching keys. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_keys(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool count: Only count matching keys, do not return meta-data.
:param int limit: Return no more than this many results at once (see resume).
        :param bool queryable: Only list keys that can/cannot be queried. Default is true.
:param str resume: Continue returning results from previous call using this token (token should come from the previous call, resume cannot be used with other options).
:return: StatisticsKeysExtended
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_statistics_keys_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_statistics_keys_with_http_info(**kwargs) # noqa: E501
return data
def get_statistics_keys_with_http_info(self, **kwargs): # noqa: E501
"""get_statistics_keys # noqa: E501
List meta-data for matching keys. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_keys_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool count: Only count matching keys, do not return meta-data.
:param int limit: Return no more than this many results at once (see resume).
        :param bool queryable: Only list keys that can/cannot be queried. Default is true.
:param str resume: Continue returning results from previous call using this token (token should come from the previous call, resume cannot be used with other options).
:return: StatisticsKeysExtended
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['count', 'limit', 'queryable', 'resume'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_statistics_keys" % key
)
params[key] = val
del params['kwargs']
if 'limit' in params and params['limit'] < 1: # noqa: E501
raise ValueError("Invalid value for parameter `limit` when calling `get_statistics_keys`, must be a value greater than or equal to `1`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'count' in params:
query_params.append(('count', params['count'])) # noqa: E501
if 'limit' in params:
query_params.append(('limit', params['limit'])) # noqa: E501
if 'queryable' in params:
query_params.append(('queryable', params['queryable'])) # noqa: E501
if 'resume' in params:
query_params.append(('resume', params['resume'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/1/statistics/keys', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StatisticsKeysExtended', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_statistics_protocols(self, **kwargs): # noqa: E501
"""get_statistics_protocols # noqa: E501
Retrieve protocol list. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_protocols(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: StatisticsProtocols
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_statistics_protocols_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_statistics_protocols_with_http_info(**kwargs) # noqa: E501
return data
def get_statistics_protocols_with_http_info(self, **kwargs): # noqa: E501
"""get_statistics_protocols # noqa: E501
Retrieve protocol list. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_statistics_protocols_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: StatisticsProtocols
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_statistics_protocols" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/platform/1/statistics/protocols', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StatisticsProtocols', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
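The `resume` token documented in `get_statistics_keys` lets a caller page through key metadata one batch at a time. A hypothetical convenience wrapper (not part of the generated SDK; it assumes the response object exposes `keys` and `resume` attributes, which is an assumption, not confirmed by the generated code):

```python
def iter_statistics_keys(api, limit=100):
    """Yield every key, following resume tokens until they run out.

    Hypothetical helper: `api` is assumed to be a configured client whose
    get_statistics_keys() returns an object with `keys` and `resume` fields.
    """
    page = api.get_statistics_keys(limit=limit)
    yield from page.keys
    # Per the docstring above, `resume` cannot be combined with other options,
    # so follow-up calls pass only the token.
    while getattr(page, 'resume', None):
        page = api.get_statistics_keys(resume=page.resume)
        yield from page.keys
```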
| 44.550347 | 270 | 0.629477 | 3,108 | 25,661 | 4.995174 | 0.085907 | 0.049984 | 0.029565 | 0.023188 | 0.916779 | 0.902866 | 0.890048 | 0.878712 | 0.876522 | 0.872657 | 0 | 0.017058 | 0.280387 | 25,661 | 575 | 271 | 44.627826 | 0.823676 | 0.411753 | 0 | 0.744186 | 1 | 0.003322 | 0.189802 | 0.044018 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036545 | false | 0 | 0.013289 | 0 | 0.10299 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
21534c51127f7cca63bb698869dabe0fd8ceca2e | 240 | py | Python | mantrap/solver/baselines/__init__.py | simon-schaefer/mantrap | 9a2b3f32a0005cc0cb79bb78924f09da5a94587d | [
"MIT"
] | 7 | 2020-05-11T18:13:27.000Z | 2022-03-09T02:52:48.000Z | mantrap/solver/baselines/__init__.py | StanfordASL/mantrap | 9a2b3f32a0005cc0cb79bb78924f09da5a94587d | [
"MIT"
] | null | null | null | mantrap/solver/baselines/__init__.py | StanfordASL/mantrap | 9a2b3f32a0005cc0cb79bb78924f09da5a94587d | [
"MIT"
] | 3 | 2020-12-09T00:03:26.000Z | 2022-03-03T10:39:03.000Z | from mantrap.solver.baselines.mcts import MonteCarloTreeSearch
from mantrap.solver.baselines.orca import ORCASolver
from mantrap.solver.baselines.random_search import RandomSearch
from mantrap.solver.baselines.rrt_star import RRTStarSolver
| 48 | 63 | 0.883333 | 30 | 240 | 7 | 0.5 | 0.209524 | 0.32381 | 0.495238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 240 | 4 | 64 | 60 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
0d046cbf439067ab3a0eca09ec6e96b88a09d9f9 | 6,089 | py | Python | app.py | silenteyesoncode/BigDATA_Analysis | e88c09b93fcaad42d2d93cb4fd6e7271658236e8 | [
"MIT"
] | null | null | null | app.py | silenteyesoncode/BigDATA_Analysis | e88c09b93fcaad42d2d93cb4fd6e7271658236e8 | [
"MIT"
] | null | null | null | app.py | silenteyesoncode/BigDATA_Analysis | e88c09b93fcaad42d2d93cb4fd6e7271658236e8 | [
"MIT"
] | null | null | null | from flask import Flask, make_response, request, render_template
import io
import os
import csv
import pickle
import pandas as pd
from sklearn.preprocessing import LabelEncoder
import numpy as np
app = Flask(__name__)
model = pickle.load(open(r'model.pkl', 'rb'))
# Base Route
@app.route('/')
def hello():
return render_template('bigmart.html')
# Prediction for dataset
@app.route('/predict_for_set', methods=['POST'])
def predict_for_set():
file = request.files.get('file')
df = pd.read_csv(file)
# check for categorical attributes
cat_col = []
for x in df.dtypes.index:
if df.dtypes[x] == 'object':
cat_col.append(x)
cat_col.remove('Item_Identifier')
cat_col.remove('Outlet_Identifier')
item_weight_mean = df.pivot_table( values="Item_Weight", index='Item_Identifier')
miss_bool = df['Item_Weight'].isnull()
for i, item in enumerate(df['Item_Identifier']):
if miss_bool[i]:
            if item in item_weight_mean.index:
                df.loc[i, 'Item_Weight'] = item_weight_mean.loc[item, 'Item_Weight']
            else:
                df.loc[i, 'Item_Weight'] = np.mean(df['Item_Weight'])
outlet_size_mode = df.pivot_table(values='Outlet_Size', columns='Outlet_Type', aggfunc=(lambda x: x.mode()[0]))
miss_bool = df['Outlet_Size'].isnull()
df.loc[miss_bool, 'Outlet_Size'] = df.loc[miss_bool,'Outlet_Type'].apply(lambda x: outlet_size_mode[x])
# replace zeros with mean
df.loc[:, 'Item_Visibility'].replace(
[0], [df['Item_Visibility'].mean()], inplace=True)
# combine item fat content
df['Item_Fat_Content'] = df['Item_Fat_Content'].replace({'LF': 'Low Fat', 'reg': 'Regular', 'low fat': 'Low Fat'})
df['Item_Fat_Content'].value_counts()
# Creation of New Attributes
df['New_Item_Type'] = df['Item_Identifier'].apply(lambda x: x[:2])
df['New_Item_Type'] = df['New_Item_Type'].map({'FD': 'Food', 'NC': 'Non-Consumable', 'DR': 'Drinks'})
df.loc[df['New_Item_Type'] == 'Non-Consumable','Item_Fat_Content'] = 'Non-Edible'
# create small values for establishment year
df['Outlet_Years'] = 2013 - df['Outlet_Establishment_Year']
le = LabelEncoder()
df['Outlet'] = le.fit_transform(df['Outlet_Identifier'])
cat_col = ['Item_Fat_Content', 'Item_Type', 'Outlet_Size','Outlet_Location_Type', 'Outlet_Type', 'New_Item_Type']
for col in cat_col:
df[col] = le.fit_transform(df[col])
# Input Split
X = df.drop(columns=['Outlet_Establishment_Year', 'Item_Identifier', 'Outlet_Identifier'])
print(X)
print(X.dtypes)
# Prediction
output = model.predict(X).tolist()
sales = sum(output)
df['Item_Outlet_Sales'] = output
df['Revenue'] = df['Item_Outlet_Sales']*df['Item_MRP']
revenue = sum(df['Revenue'])
print(revenue)
revenue /= 1000000
print(revenue)
    return render_template('bigmart.html', pred1="A total of {} items are expected to be sold across BigMart stores.".format(sales), pred2="The total revenue that should be generated is ${} million.".format(revenue))
# Prediction for single product
@app.route('/predict_for_one', methods=['POST'])
def predict_for_one():
    d = request.form.to_dict()
df = pd.DataFrame([d.values()], columns=d.keys())
df = df.infer_objects()
df[['Item_Weight','Item_Visibility','Item_MRP','Outlet_Establishment_Year']] = df[['Item_Weight','Item_Visibility','Item_MRP','Outlet_Establishment_Year']].apply(pd.to_numeric)
# Process dataframe as required
# check for categorical attributes
cat_col = []
for x in df.dtypes.index:
if df.dtypes[x] == 'object':
cat_col.append(x)
cat_col.remove('Item_Identifier')
cat_col.remove('Outlet_Identifier')
item_weight_mean = df.pivot_table( values="Item_Weight", index='Item_Identifier')
miss_bool = df['Item_Weight'].isnull()
for i, item in enumerate(df['Item_Identifier']):
if miss_bool[i]:
            if item in item_weight_mean.index:
                df.loc[i, 'Item_Weight'] = item_weight_mean.loc[item, 'Item_Weight']
            else:
                df.loc[i, 'Item_Weight'] = np.mean(df['Item_Weight'])
outlet_size_mode = df.pivot_table(values='Outlet_Size', columns='Outlet_Type', aggfunc=(lambda x: x.mode()[0]))
miss_bool = df['Outlet_Size'].isnull()
df.loc[miss_bool, 'Outlet_Size'] = df.loc[miss_bool,'Outlet_Type'].apply(lambda x: outlet_size_mode[x])
# replace zeros with mean
df.loc[:, 'Item_Visibility'].replace(
[0], [df['Item_Visibility'].mean()], inplace=True)
# combine item fat content
df['Item_Fat_Content'] = df['Item_Fat_Content'].replace({'LF': 'Low Fat', 'reg': 'Regular', 'low fat': 'Low Fat'})
df['Item_Fat_Content'].value_counts()
# Creation of New Attributes
df['New_Item_Type'] = df['Item_Identifier'].apply(lambda x: x[:2])
df['New_Item_Type'] = df['New_Item_Type'].map({'FD': 'Food', 'NC': 'Non-Consumable', 'DR': 'Drinks'})
df.loc[df['New_Item_Type'] == 'Non-Consumable','Item_Fat_Content'] = 'Non-Edible'
# create small values for establishment year
df['Outlet_Years'] = 2013 - df['Outlet_Establishment_Year']
le = LabelEncoder()
df['Outlet'] = le.fit_transform(df['Outlet_Identifier'])
cat_col = ['Item_Fat_Content', 'Item_Type', 'Outlet_Size','Outlet_Location_Type', 'Outlet_Type', 'New_Item_Type']
for col in cat_col:
df[col] = le.fit_transform(df[col])
# Input Split
X = df.drop(columns=['Outlet_Establishment_Year', 'Item_Identifier', 'Outlet_Identifier'])
print(X)
print(X.dtypes)
# Prediction
output = model.predict(X).tolist()
sales = sum(output)
df['Item_Outlet_Sales'] = output
df['Revenue'] = df['Item_Outlet_Sales']*df['Item_MRP']
revenue = sum(df['Revenue'])
    return render_template('bigmart.html', pred1='{} units of this product are expected to be sold.'.format(sales), pred2='${} is the revenue that should be generated by selling this many units of the selected product.'.format(revenue))
if __name__ == "__main__":
    app.run(debug=True)
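The item-type feature derived in both routes (the first two characters of `Item_Identifier` mapped to a coarse category) can be exercised on its own. A pure-Python sketch of that rule (hypothetical helper, not used by the app itself):

```python
# Mirrors df['New_Item_Type'] above: 'FD' -> Food, 'NC' -> Non-Consumable,
# 'DR' -> Drinks. Unknown prefixes fall through to None.
ITEM_TYPE_MAP = {'FD': 'Food', 'NC': 'Non-Consumable', 'DR': 'Drinks'}

def derive_item_type(item_identifier):
    """Return the coarse category for a BigMart item identifier."""
    return ITEM_TYPE_MAP.get(item_identifier[:2])
```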
| 46.128788 | 236 | 0.668418 | 859 | 6,089 | 4.501746 | 0.203725 | 0.043445 | 0.043445 | 0.026894 | 0.801138 | 0.770882 | 0.770882 | 0.770882 | 0.748901 | 0.748901 | 0 | 0.004961 | 0.172442 | 6,089 | 131 | 237 | 46.480916 | 0.762453 | 0.072754 | 0 | 0.728972 | 0 | 0 | 0.33517 | 0.026657 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028037 | false | 0 | 0.074766 | 0.009346 | 0.130841 | 0.056075 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0d13fca8e344c1cb7c5cb5872df1f9b74cfb492b | 9,614 | py | Python | accounts/test_manage.py | kylejuliandev/dev_blog_assignment | 272466cb591f9b45fb81c2a42e86b25bff3cd9ad | [
"MIT"
] | null | null | null | accounts/test_manage.py | kylejuliandev/dev_blog_assignment | 272466cb591f9b45fb81c2a42e86b25bff3cd9ad | [
"MIT"
] | null | null | null | accounts/test_manage.py | kylejuliandev/dev_blog_assignment | 272466cb591f9b45fb81c2a42e86b25bff3cd9ad | [
"MIT"
] | null | null | null | import random
import string
from django.test import TestCase
from accounts.models import User
class ManageTest(TestCase):
    def __init__(self, methodName: str = 'runTest') -> None:
        super().__init__(methodName)
def setUp(self) -> None:
"""Creates test data and adds a uest user"""
create_mock_user('TestUser', 'TestUser_P', False, True)
return super().setUp()
def tearDown(self) -> None:
"""Tears down test data and removes all test users"""
User.objects.all().delete()
return super().tearDown()
def test_manage_not_post(self):
"""
Given I am authenticated,
When I manage my account,
Then I am shown the manage page
"""
user = User.objects.get(username='testuser')
self.client.login(username='testuser', password='TestUser_P')
response = self.client.get('/manage')
form = response.context['form']
actual_firstname = str(form.fields['first_name'].initial)
actual_lastname = str(form.fields['last_name'].initial)
self.assertNotEqual(response.status_code, 302)
self.assertEqual(actual_firstname, user.first_name)
self.assertEqual(actual_lastname, user.last_name)
def test_manage_with_incorrect_password(self):
"""
Given I specified the wrong password,
When I manage my account,
Then I am shown the manage page and my account changes were not saved
"""
self.client.login(username='testuser', password='TestUser_P')
data = {
'first_name': 'NewFirstName',
'last_name': 'NewLastName',
'password': 'incorrect_password'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertNotEqual(response.status_code, 302)
self.assertNotEqual(user.first_name, 'NewFirstName')
self.assertNotEqual(user.last_name, 'NewLastName')
def test_manage_with_missing_password(self):
"""
Given I have not specified the password,
When I manage my account,
Then I am shown the manage page and my account changes were not saved
"""
self.client.login(username='testuser', password='TestUser_P')
data = {
'first_name': 'NewFirstName',
'last_name': 'NewLastName'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertNotEqual(response.status_code, 302)
self.assertNotEqual(user.first_name, 'NewFirstName')
self.assertNotEqual(user.last_name, 'NewLastName')
def test_manage_with_missing_first_name(self):
"""
Given I have not specified the first name,
When I manage my account,
Then I am shown the manage page and my account changes were not saved
"""
self.client.login(username='testuser', password='TestUser_P')
data = {
'last_name': 'NewLastName',
'password': 'TestUser_P'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertNotEqual(response.status_code, 302)
self.assertNotEqual(user.last_name, 'NewLastName')
def test_manage_with_missing_last_name(self):
"""
Given I have not specified the last name,
When I manage my account,
Then I am shown the manage page and my account changes were not saved
"""
self.client.login(username='testuser', password='TestUser_P')
data = {
'first_name': 'NewFirstName',
'password': 'TestUser_P'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertNotEqual(response.status_code, 302)
self.assertNotEqual(user.first_name, 'NewFirstName')
def test_manage_with_first_name_containing_numbers(self):
"""
Given I have specified a first name containing numbers,
When I manage my account,
Then I am shown the manage page and my account changes were not saved
"""
self.client.login(username='testuser', password='TestUser_P')
firstname = get_random_string_with_numbers(50)
data = {
'first_name': firstname,
'last_name': 'NewLastName',
'password': 'TestUser_P'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertNotEqual(response.status_code, 302)
self.assertNotEqual(user.first_name, firstname)
self.assertNotEqual(user.last_name, 'NewLastName')
def test_manage_with_last_name_containing_numbers(self):
"""
Given I have specified a last name containing numbers,
When I manage my account,
Then I am shown the manage page and my account changes were not saved
"""
self.client.login(username='testuser', password='TestUser_P')
lastname = get_random_string_with_numbers(50)
data = {
'first_name': 'NewFirstName',
'last_name': lastname,
'password': 'TestUser_P'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertNotEqual(response.status_code, 302)
self.assertNotEqual(user.first_name, 'NewFirstName')
self.assertNotEqual(user.last_name, lastname)
def test_manage_with_first_name_too_many_characters(self):
"""
Given I have specified a first name that is too long,
When I manage my account,
Then I am shown the manage page and my account changes were not saved
"""
self.client.login(username='testuser', password='TestUser_P')
firstname = get_random_string(51)
data = {
'first_name': firstname,
'last_name': 'NewLastName',
'password': 'TestUser_P'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertNotEqual(response.status_code, 302)
self.assertNotEqual(user.first_name, firstname)
self.assertNotEqual(user.last_name, 'NewLastName')
def test_manage_with_last_name_too_many_characters(self):
"""
Given I have specified a last name that is too long,
When I manage my account,
Then I am shown the manage page and my account changes were not saved
"""
self.client.login(username='testuser', password='TestUser_P')
lastname = get_random_string(51)
data = {
'first_name': 'NewFirstName',
'last_name': lastname,
'password': 'TestUser_P'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertNotEqual(response.status_code, 302)
self.assertNotEqual(user.first_name, 'NewFirstName')
self.assertNotEqual(user.last_name, lastname)
def test_manage_saves_changes(self):
"""
Given a valid first and last name, and valid password,
When I manage my account,
Then I am redirected to the home page and my changes were saved
"""
self.client.login(username='testuser', password='TestUser_P')
data = {
'first_name': 'NewFirstName',
'last_name': 'NewLastName',
'password': 'TestUser_P'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertEqual(response.status_code, 302)
self.assertEqual(user.first_name, 'NewFirstName')
self.assertEqual(user.last_name, 'NewLastName')
def test_manage_saves_partial_changes(self):
"""
Given that I only changed the last name,
When I manage my account,
Then I am redirected to the home page and my changes were saved
"""
self.client.login(username='testuser', password='TestUser_P')
data = {
'first_name': 'NewFirstName',
'last_name': 'User',
'password': 'TestUser_P'
}
response = self.client.post('/manage', data=data)
user = User.objects.get(username='testuser')
self.assertEqual(response.status_code, 302)
self.assertEqual(user.first_name, 'NewFirstName')
self.assertEqual(user.last_name, 'User')
def get_random_string(length:int) -> str:
"""Creates a random string from the ascii letter character set"""
return ''.join(random.choice(string.ascii_letters) for x in range(length))
def get_random_string_with_numbers(length:int) -> str:
"""Creates a random string from digits"""
return ''.join(random.choice(string.digits) for x in range(length))
def create_mock_user(username:str, password:str, isauthor:bool, isactive:bool):
"""Create a new user with the specified details"""
user = User()
user.username = username.lower()
user.first_name = 'Test'
user.last_name = 'User'
user.is_author = isauthor
user.is_active = isactive
user.is_admin = False
user.set_password(password)
user.save() | 36.142857 | 81 | 0.618161 | 1,106 | 9,614 | 5.222423 | 0.116637 | 0.040166 | 0.055921 | 0.024758 | 0.809037 | 0.794841 | 0.777528 | 0.754328 | 0.719529 | 0.698753 | 0 | 0.005913 | 0.27876 | 9,614 | 266 | 82 | 36.142857 | 0.827084 | 0.180674 | 0 | 0.596154 | 0 | 0 | 0.145522 | 0 | 0 | 0 | 0 | 0 | 0.198718 | 1 | 0.108974 | false | 0.153846 | 0.025641 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
b4c7543d07de9f9c113382e08ed870614daf45c8 | 1,972 | py | Python | test/converters/test_convert_suggest_settings_parameters.py | Skylion007/zivid-python | 28b16a2f260e5d060e4fb5a3436a3f1c7d659954 | [
"BSD-3-Clause"
] | 23 | 2019-07-01T09:50:04.000Z | 2022-03-06T23:54:28.000Z | test/converters/test_convert_suggest_settings_parameters.py | Skylion007/zivid-python | 28b16a2f260e5d060e4fb5a3436a3f1c7d659954 | [
"BSD-3-Clause"
] | 100 | 2019-07-02T07:49:13.000Z | 2022-02-16T21:05:39.000Z | test/converters/test_convert_suggest_settings_parameters.py | Skylion007/zivid-python | 28b16a2f260e5d060e4fb5a3436a3f1c7d659954 | [
"BSD-3-Clause"
] | 13 | 2019-10-01T07:26:05.000Z | 2022-02-16T20:21:56.000Z | def test_to_internal_suggest_settings_parameters_to_suggest_settings_parameters_modified():
from zivid.capture_assistant import SuggestSettingsParameters
from zivid._suggest_settings_parameters import (
_to_capture_assistant_suggest_settings_parameters,
_to_internal_capture_assistant_suggest_settings_parameters,
)
modified_suggest_settings_parameters = SuggestSettingsParameters(
ambient_light_frequency=SuggestSettingsParameters.AmbientLightFrequency.hz50
)
converted_suggest_settings_parameters = (
_to_capture_assistant_suggest_settings_parameters(
_to_internal_capture_assistant_suggest_settings_parameters(
modified_suggest_settings_parameters
)
)
)
assert modified_suggest_settings_parameters == converted_suggest_settings_parameters
assert isinstance(converted_suggest_settings_parameters, SuggestSettingsParameters)
assert isinstance(modified_suggest_settings_parameters, SuggestSettingsParameters)
def test_to_internal_suggest_settings_parameters_to_suggest_settings_parameters_default():
from zivid.capture_assistant import SuggestSettingsParameters
from zivid._suggest_settings_parameters import (
_to_capture_assistant_suggest_settings_parameters,
_to_internal_capture_assistant_suggest_settings_parameters,
)
default_suggest_settings_parameters = SuggestSettingsParameters()
converted_suggest_settings_parameters = (
_to_capture_assistant_suggest_settings_parameters(
_to_internal_capture_assistant_suggest_settings_parameters(
default_suggest_settings_parameters
)
)
)
assert default_suggest_settings_parameters == converted_suggest_settings_parameters
assert isinstance(converted_suggest_settings_parameters, SuggestSettingsParameters)
assert isinstance(default_suggest_settings_parameters, SuggestSettingsParameters)
#
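Both tests above follow the same round-trip pattern: convert to the internal representation and back, then check that value and type are preserved. A generic sketch of that pattern (hypothetical helper, not part of the zivid test suite):

```python
def assert_round_trip(value, to_internal, from_internal):
    """Check that from_internal(to_internal(value)) preserves value and type."""
    converted = from_internal(to_internal(value))
    assert converted == value
    assert isinstance(converted, type(value))
```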
| 43.822222 | 91 | 0.814402 | 178 | 1,972 | 8.325843 | 0.129213 | 0.283401 | 0.472335 | 0.145749 | 0.923752 | 0.812416 | 0.812416 | 0.812416 | 0.812416 | 0.812416 | 0 | 0.001202 | 0.156187 | 1,972 | 44 | 92 | 44.818182 | 0.889423 | 0 | 0 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.055556 | false | 0 | 0.111111 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
370b9b0e0930e47653776537a630fc312d10fe57 | 185 | py | Python | sphere/example/__init__.py | MehdiN/sphere | dec3b10ef31a99c01378ffd53c434c664ae43a6c | [
"MIT"
] | 15 | 2019-04-01T22:35:09.000Z | 2021-11-18T20:48:38.000Z | sphere/example/__init__.py | MehdiN/sphere | dec3b10ef31a99c01378ffd53c434c664ae43a6c | [
"MIT"
] | 3 | 2019-05-12T21:44:58.000Z | 2022-02-16T04:10:30.000Z | sphere/example/__init__.py | MehdiN/sphere | dec3b10ef31a99c01378ffd53c434c664ae43a6c | [
"MIT"
] | 6 | 2019-09-18T04:59:06.000Z | 2022-01-05T10:43:03.000Z | from .example import test_example_normalization
from .example import test_example_mle
from .example import calculate_bias_var_and_mse
from .example import test_example_mle2
del example
| 30.833333 | 47 | 0.881081 | 28 | 185 | 5.464286 | 0.464286 | 0.287582 | 0.444444 | 0.411765 | 0.54902 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005988 | 0.097297 | 185 | 5 | 48 | 37 | 0.91018 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.8 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
2ec87b9135481d4a600e63535d8e5ca423169ccc | 44 | py | Python | primer/primer/api/health.py | nikwek/buildkite-monorepo-example | 8e35f29614c7bb48d584d9e9fe9eca238185e739 | [
"MIT"
] | 8 | 2018-05-23T17:11:15.000Z | 2021-02-18T08:28:52.000Z | primer/primer/api/health.py | ksindi/buildpipe-monorepo-example | 3baf512d9278fb0a703b7108bda17966228d1eb2 | [
"MIT"
] | 1 | 2022-03-06T21:50:20.000Z | 2022-03-06T21:54:42.000Z | primer/primer/api/health.py | ksindi/buildpipe-monorepo-example | 3baf512d9278fb0a703b7108bda17966228d1eb2 | [
"MIT"
] | 3 | 2018-06-26T07:16:48.000Z | 2019-12-22T18:13:52.000Z | def search():
return {'msg': 'ok'}, 200
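Flask view functions may return a `(body, status)` tuple, and dict bodies are JSON-serialised automatically (Flask 1.1+), so the handler's contract can be checked without a test client. A minimal standalone sketch (the view is duplicated here only so the snippet is self-contained):

```python
def search():
    # Same contract as the route handler above: (body, status) tuple.
    return {'msg': 'ok'}, 200

# Calling the view directly is enough to verify the returned shape.
body, status = search()
```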
| 14.666667 | 29 | 0.522727 | 6 | 44 | 3.833333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088235 | 0.227273 | 44 | 2 | 30 | 22 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0.113636 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
2ecb8ffaeec635fe993de5c073d7737e67e6b85a | 50,624 | py | Python | src/models/gaussian_process.py | AndrewRLawrence/dp_gp_lvm | b0d4c776714f22e83de31127fbfbbd511f017dcd | [
"MIT"
] | 1 | 2021-01-17T11:44:36.000Z | 2021-01-17T11:44:36.000Z | src/models/gaussian_process.py | AndrewRLawrence/dp_gp_lvm | b0d4c776714f22e83de31127fbfbbd511f017dcd | [
"MIT"
] | 1 | 2020-07-19T20:47:02.000Z | 2020-07-19T20:47:02.000Z | src/models/gaussian_process.py | AndrewRLawrence/dp_gp_lvm | b0d4c776714f22e83de31127fbfbbd511f017dcd | [
"MIT"
] | 1 | 2020-07-21T07:13:13.000Z | 2020-07-21T07:13:13.000Z | """
This module implements a few different Gaussian process (GP) models, specifically GP regression, GP latent variable
model (GP-LVM), Bayesian GP-LVM (BGP-LVM), and Manifold Relevance Determination (MRD), which is a multi-view extension
to the BGP-LVM.
"""
from src.distributions.normal import mvn_log_pdf, mvn_conditional_mean_covar
from src.kernels.interfaces.kernel import KernelHyperparameters
from src.kernels.rbf_kernel import k_ard_rbf
from src.models.interfaces.trainable import Trainable
from src.models.expressions.gp_expressions import calculate_kl_divergence_standard_prior
from src.utils.constants import GP_INIT_GAMMA, GP_INIT_ALPHA, GP_INIT_BETA, GP_LVM_DEFAULT_LATENT_DIMENSIONS, \
GP_LVM_DEFAULT_NUM_INDUCING_POINTS, MAX_MC_SAMPLES
from src.utils.expressions import nearest_neighbour, empirical_mean, principal_component_analysis as pca
from src.utils.types import TF_DTYPE, create_positive_variable, validate_kernel
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp
def gp_regression(x_train, y_train, kernel=None):
"""
Build an exact Gaussian process regression model with a zero mean function.
:param x_train: The training input. Must be [N x Q].
:param y_train: The training output. Must be [N x D]. Can be 1D output so can be [N x 1].
:param kernel: Optional kernel; if None, an ARD-RBF kernel with default hyperparameters is created.
:return: A trainable GaussianProcess object exposing the kernel, log-likelihood, objective and predictions.
"""
# TODO: Validate input.
# Determine dimensions of tensors.
n, q = x_train.get_shape().as_list()
d = y_train.get_shape().as_list()[1]
# Validate kernel or define kernel, if necessary.
if kernel is not None:
validate_kernel(kernel)
# gamma = kernel._hyperparameter_dict[KernelHyperparameters.ARD_WEIGHTS]
else:
# TODO: May want to add some noise to gamma init values.
batch_size = 1
gamma = create_positive_variable(initial_value=GP_INIT_GAMMA, shape=(batch_size, q), is_trainable=True)
alpha = create_positive_variable(initial_value=GP_INIT_ALPHA, shape=(batch_size, 1), is_trainable=True)
beta = create_positive_variable(initial_value=GP_INIT_BETA, shape=(batch_size, 1), is_trainable=True)
kernel = k_ard_rbf(gamma=gamma, alpha=alpha, beta=beta)
# Define GP.
k_xx = kernel.covariance_matrix(input_0=x_train, input_1=None, include_noise=True, include_jitter=True)
log_likelihood = mvn_log_pdf(x=tf.transpose(y_train),
mean=tf.zeros(shape=(1, n), dtype=TF_DTYPE),
covariance=k_xx) + \
kernel.prior_log_likelihood
objective = tf.negative(tf.reduce_sum(log_likelihood))
class GaussianProcess(Trainable):
"""
This class defines a GaussianProcess object.
"""
@property
def kernel(self):
"""
:return: The kernel used by this GP.
"""
return kernel
@property
def log_likelihood(self):
"""
:return: The log marginal likelihood of the training data.
"""
return log_likelihood
@staticmethod
def predict_mean_covar(x_test):
"""
Predict the posterior mean and covariance at the given test inputs.
:param x_test: The test input. Must be [N* x Q].
:return: The predicted mean and covariance at the test inputs.
"""
num_test_points = x_test.get_shape().as_list()[0]
k_ss = kernel.covariance_matrix(input_0=x_test, input_1=None, include_noise=False, include_jitter=True)
k_xs = kernel.covariance_matrix(input_0=x_train, input_1=x_test, include_noise=False, include_jitter=False)
predicted_mean, predicted_covar = \
mvn_conditional_mean_covar(b=y_train,
mean_a=tf.zeros(shape=(num_test_points, 1), dtype=TF_DTYPE),
mean_b=tf.zeros(shape=(n, 1), dtype=TF_DTYPE),
covar_aa=k_ss,
covar_bb=k_xx,
covar_ab=k_xs)
return predicted_mean, predicted_covar
@property
def objective(self):
"""
:return: The objective (negative log marginal likelihood) to minimise.
"""
return objective
return GaussianProcess()
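The graph above collapses to standard zero-mean GP regression with Cholesky-based solves. As a sanity check, the same computation can be sketched in plain NumPy; this is a hypothetical single-lengthscale analogue (the names gamma, alpha and beta mirror the ARD-RBF hyperparameters initialised above), not the module's own API.

```python
import numpy as np

def gp_regression_predict(x_train, y_train, x_test, gamma=1.0, alpha=1.0, beta=1e6):
    """NumPy sketch of zero-mean GP regression with an RBF kernel.
    gamma: inverse squared lengthscale, alpha: signal variance, beta: noise precision."""
    def k(a, b):
        # Squared Euclidean distances between all pairs of rows.
        sq = np.sum(a ** 2, axis=1)[:, None] + np.sum(b ** 2, axis=1)[None, :] - 2.0 * a @ b.T
        return alpha * np.exp(-0.5 * gamma * sq)
    n = x_train.shape[0]
    k_xx = k(x_train, x_train) + np.eye(n) / beta              # noisy training covariance [N x N]
    l = np.linalg.cholesky(k_xx)                               # stable solves via Cholesky factor
    a_coef = np.linalg.solve(l.T, np.linalg.solve(l, y_train)) # K_xx^{-1} y
    k_xs = k(x_train, x_test)                                  # cross-covariance [N x N*]
    mean = k_xs.T @ a_coef                                     # predictive mean [N* x D]
    v = np.linalg.solve(l, k_xs)
    covar = k(x_test, x_test) - v.T @ v                        # predictive covariance [N* x N*]
    return mean, covar
```

With a small noise variance the posterior mean interpolates the training targets, matching what predict_mean_covar computes via mvn_conditional_mean_covar.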
def gp_lvm(y_train, kernel=None, num_latent_dims=GP_LVM_DEFAULT_LATENT_DIMENSIONS):
"""
Build a GP-LVM: GP regression whose input is a trainable latent variable initialised with PCA.
:param y_train: The training output. Must be [N x D].
:param kernel: Optional kernel; if None, an ARD-RBF kernel with default hyperparameters is created.
:param num_latent_dims: The dimensionality Q of the latent space.
:return: A trainable GaussianProcess object.
"""
# TODO: Validate input.
# Determine dimensions of tensors.
d = y_train.get_shape().as_list()[1]
assert 0 < num_latent_dims < d, \
'Number of latent dimensions must be positive and less than the dimensionality of the observed data.'
# TODO: PCA will not work if y_train is a tensor.
x_latent = tf.Variable(pca(y_train, num_latent_dimensions=num_latent_dims), dtype=TF_DTYPE, trainable=True)
return gp_regression(x_train=x_latent, y_train=y_train, kernel=kernel)
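gp_lvm seeds the latent variable with PCA of the observed data (the real helper is principal_component_analysis from src.utils.expressions). A minimal NumPy sketch of such an initialisation, assuming y_train is a plain array rather than a tensor:

```python
import numpy as np

def pca_init(y, num_latent_dims):
    """Project centred data onto its top principal directions to initialise X ([N x Q])."""
    y_centred = y - y.mean(axis=0)
    # SVD of the centred data; rows of vt are principal directions sorted by explained variance.
    _, _, vt = np.linalg.svd(y_centred, full_matrices=False)
    return y_centred @ vt[:num_latent_dims].T
```

As the TODO above notes, this only works on concrete arrays, which is why the module computes PCA before wrapping the result in a tf.Variable.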
def bayesian_gp_lvm(y_train, kernel=None, num_latent_dims=GP_LVM_DEFAULT_LATENT_DIMENSIONS,
num_inducing_points=GP_LVM_DEFAULT_NUM_INDUCING_POINTS, num_latent_samples=0):
"""
Build a Bayesian GP-LVM with a standard normal, fully factorised latent prior p(X).
:param y_train: The training output as a numpy array of size [N x D].
:param kernel: Optional kernel; if None, an ARD-RBF kernel with default hyperparameters is created.
:param num_latent_dims: The dimensionality Q of the latent space.
:param num_inducing_points: The number M of inducing points.
:param num_latent_samples: The number of Monte Carlo samples of the latent space; 0 uses closed-form psi-statistics.
:return: A trainable BayesianGPLVM object.
"""
# Determine tensor dimensions.
num_samples, num_dimensions = np.shape(y_train)
# num_samples, num_dimensions = y_train.get_shape().as_list()
# Validate input.
assert isinstance(num_latent_dims, int), 'Number of latent dimensions must be an integer.'
assert 0 < num_latent_dims < num_dimensions, \
'Number of latent dimensions must be positive and less than the dimensionality of the observed data.'
assert isinstance(num_inducing_points, int), 'Number of inducing points must be an integer.'
assert 0 < num_inducing_points < num_samples, \
'Number of inducing points must be positive and less than the number of observations in the observed data.'
assert isinstance(num_latent_samples, int), 'Number of latent space samples must be an integer.'
assert 0 <= num_latent_samples < MAX_MC_SAMPLES, \
'Number of latent space samples must be non-negative and less than {}.'.format(MAX_MC_SAMPLES)
# Use stochastic variational inference (SVI) if num_latent_samples is not zero; otherwise, use standard
# variational inference with closed-form psi-statistics.
use_svi = num_latent_samples > 0
# Validate kernel or define kernel, if necessary.
batch_size = 1 # This is B.
if kernel is not None:
validate_kernel(kernel)
# gamma = kernel._hyperparameter_dict[KernelHyperparameters.ARD_WEIGHTS]
else:
# TODO: May want to add some noise to gamma init values.
gamma = create_positive_variable(initial_value=GP_INIT_GAMMA,
shape=(batch_size, num_latent_dims),
is_trainable=True)
alpha = create_positive_variable(initial_value=GP_INIT_ALPHA,
shape=(batch_size, 1),
is_trainable=True)
beta = create_positive_variable(initial_value=GP_INIT_BETA,
shape=(batch_size, 1),
is_trainable=True)
kernel = k_ard_rbf(gamma=gamma, alpha=alpha, beta=beta)
# Fit latent means using PCA and the inducing inputs as a subset of those with a little noise.
x_init = pca(y_train, num_latent_dimensions=num_latent_dims)
x_mean = tf.Variable(initial_value=x_init, dtype=TF_DTYPE, trainable=True) # [N x Q].
# Initialise inducing inputs as a subset of the PCA values calculated for latent means plus some noise.
x_u_init = np.random.permutation(x_init)[:num_inducing_points] + \
np.random.normal(loc=0.0, scale=0.01, size=(num_inducing_points, num_latent_dims))
x_u = tf.Variable(initial_value=x_u_init, dtype=TF_DTYPE, trainable=True) # [M x Q].
# Define psi-statistics.
if use_svi:
x_var_diag = create_positive_variable(initial_value=0.5,
shape=(num_samples, num_latent_dims),
is_trainable=True) # [N x Q].
x_covar = tf.matrix_diag(x_var_diag)  # [N x Q x Q]. Required by the KL divergence term computed below.
# N, Q-length multivariate normal distributions with a diagonal covariance.
q_x = tfp.distributions.MultivariateNormalDiag(loc=x_mean, scale_diag=tf.sqrt(x_var_diag))
x_samples = q_x.sample([num_latent_samples]) # [num_latent_samples x N x Q].
# TODO: Should really be Q, N-length multivariate normals so it is easy to modify for dynamic prior.
# # Q, N-length multivariate normal distributions with a diagonal covariance.
# q_x = tfp.distributions.MultivariateNormalDiag(loc=tf.transpose(x_mean),
# scale_diag=tf.transpose(tf.sqrt(x_var_diag)))
# x_samples = q_x.sample([num_latent_samples]) # [num_latent_samples x Q x N].
kff_trace = tf.reduce_sum(kernel.covariance_diag(input_0=x_samples, include_noise=False, include_jitter=False),
axis=-1) # [num_latent_samples x B x 1].
k_fu = kernel.covariance_matrix(input_0=x_samples, input_1=x_u, include_noise=False,
include_jitter=False) # [num_latent_samples x B x N x M].
# Define psi-statistics.
psi_0 = empirical_mean(kff_trace) # [B x 1].
psi_1 = empirical_mean(k_fu) # [B x N x M].
psi_2 = empirical_mean(tf.matmul(k_fu, k_fu, transpose_a=True)) # [B x M x M].
else:
x_covar = tf.matrix_diag(create_positive_variable(initial_value=0.5,
shape=(num_samples, num_latent_dims),
is_trainable=True)) # [N x Q x Q].
# [B x 1].
psi_0 = kernel.psi_0(inducing_input=x_u, latent_input_mean=x_mean, latent_input_covariance=x_covar)
# [B x N x M].
psi_1 = kernel.psi_1(inducing_input=x_u, latent_input_mean=x_mean, latent_input_covariance=x_covar)
# [B x M x M].
psi_2 = kernel.psi_2(inducing_input=x_u, latent_input_mean=x_mean, latent_input_covariance=x_covar)
# Calculate f_hat term from the evidence lower bound (ELBO). Using stable calculation for f_hat.
beta = kernel.noise_precision
beta_b11 = tf.expand_dims(beta, axis=-1) # [B x 1 x 1].
k_uu = kernel.covariance_matrix(input_0=x_u, input_1=None, include_noise=False, include_jitter=True) # [B x M x M].
l_uu = tf.cholesky(k_uu) # [B x M x M].
l_uu_inv_psi_2 = tf.matrix_triangular_solve(l_uu, psi_2, lower=True) # [B x M x M].
l_uu_inv_psi_2_inv_transpose = tf.transpose(
tf.matrix_triangular_solve(l_uu, tf.transpose(l_uu_inv_psi_2, perm=[0, 2, 1]), lower=True),
perm=[0, 2, 1]) # [B x M x M].
a = beta_b11 * l_uu_inv_psi_2_inv_transpose + tf.eye(num_inducing_points, batch_shape=[batch_size], dtype=TF_DTYPE)
l_a = tf.cholesky(a) # [B x M x M].
log_det_l_a = tf.reduce_sum(tf.log(tf.matrix_diag_part(l_a))) # Scalar.
# [B x M x N].
l_uu_inv_psi_1_transpose = tf.matrix_triangular_solve(l_uu, tf.transpose(psi_1, perm=[0, 2, 1]), lower=True)
c = tf.matrix_triangular_solve(l_a, l_uu_inv_psi_1_transpose, lower=True) # [B x M x N].
c_transpose_c = tf.squeeze(tf.matmul(c, c, transpose_a=True), axis=0) # Squeeze since B=1 so cTc is [N x N].
yy_transpose = tf.matmul(y_train, y_train, transpose_b=True) # [N x N].
f_hat = 0.5 * num_samples * num_dimensions * (tf.reduce_sum(tf.log(beta)) - np.log(2.0 * np.pi)) - \
num_dimensions * log_det_l_a + \
0.5 * num_dimensions * tf.reduce_sum(beta * (tf.trace(l_uu_inv_psi_2_inv_transpose) - psi_0)) + \
0.5 * tf.reduce_sum(tf.square(beta) * tf.trace(tf.matmul(c_transpose_c, yy_transpose))) - \
0.5 * tf.reduce_sum(beta * tf.trace(yy_transpose))
# Define KL divergence between q(X) and p(X).
kl_q_x_p_x = calculate_kl_divergence_standard_prior(x_mean=x_mean, x_covar=x_covar)
# Define evidence lower bound (ELBO).
elbo = f_hat - kl_q_x_p_x
# Define objective function.
objective = tf.negative(elbo + kernel.prior_log_likelihood)
# Calculate some other intermediate values for prediction.
l_uu_inv = tf.matrix_triangular_solve(l_uu, tf.eye(num_inducing_points, batch_shape=[batch_size], dtype=TF_DTYPE))
l_a_inv = tf.matrix_triangular_solve(l_a, tf.eye(num_inducing_points, batch_shape=[batch_size], dtype=TF_DTYPE))
class BayesianGPLVM(Trainable):
"""
This class defines a BayesianGPLVM object.
"""
@property
def kernel(self):
"""
:return: The kernel used by this model.
"""
return kernel
@property
def ard_weights(self):
"""
:return: The ARD weights of the kernel.
"""
return kernel.hyperparameters[KernelHyperparameters.ARD_WEIGHTS]
@property
def signal_variance(self):
"""
:return: The signal variance of the kernel.
"""
return kernel.hyperparameters[KernelHyperparameters.SIGNAL_VARIANCE]
@property
def noise_precision(self):
"""
:return: The noise precision of the kernel.
"""
return kernel.noise_precision
@property
def inducing_input(self):
"""
:return: The inducing inputs. [M x Q].
"""
return x_u
@property
def q_x(self):
"""
:return: The mean and covariance of the variational posterior q(X).
"""
return x_mean, x_covar
@staticmethod
def predict_new_latent_variables(y_test, use_pca=False):
"""
Optimise the variational posterior q(X*) over the latent variables of new, fully observed test data.
:param y_test: The test output. Must be [N* x D].
:param use_pca: If True, initialise the test latent means with PCA; otherwise use nearest neighbours plus noise.
:return: The prediction lower bound, the test latent mean and covariance, and the test log-likelihood.
"""
# y_test is [N* x D].
num_test_points, test_dims = np.shape(y_test)
assert test_dims == num_dimensions, \
'Observed dimensionality for prediction must be equal to the dimensionality of the training data.'
# Define variables for q(X_star).
if use_pca:
# Initialise x_test_mean using PCA.
init_values = pca(y_test, num_latent_dimensions=num_latent_dims)
else:
# Initialise x_test_mean with mean from nearest neighbour between training and test sets
# plus some noise.
y_test_nn_indices = nearest_neighbour(y_train, y_test)
init_values = tf.map_fn(lambda index: x_mean[index], y_test_nn_indices, dtype=TF_DTYPE) + \
np.random.normal(scale=0.01, size=(num_test_points, num_latent_dims))
x_test_mean = tf.Variable(initial_value=init_values, dtype=TF_DTYPE, trainable=False) # [N* x Q].
x_test_covar = tf.matrix_diag(create_positive_variable(initial_value=1.0,
shape=(num_test_points, num_latent_dims),
is_trainable=False)) # [N* x Q x Q].
# Define extra f_hat terms for y_test.
psi_0_test = kernel.psi_0(inducing_input=x_u, latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar) # [B x 1].
psi_1_test = kernel.psi_1(inducing_input=x_u, latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar) # [B x N* x M].
psi_2_test = kernel.psi_2(inducing_input=x_u, latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar) # [B x M x M].
l_uu_inv_psi_2_test = tf.matrix_triangular_solve(l_uu, psi_2_test, lower=True) # [B x M x M].
l_uu_inv_psi_2_test_inv_transpose = tf.transpose(
tf.matrix_triangular_solve(l_uu, tf.transpose(l_uu_inv_psi_2_test, perm=[0, 2, 1]), lower=True),
perm=[0, 2, 1]) # [B x M x M].
a_test = beta_b11 * l_uu_inv_psi_2_test_inv_transpose + \
tf.eye(num_inducing_points, batch_shape=[batch_size], dtype=TF_DTYPE) # [B x M x M].
l_a_test = tf.cholesky(a_test) # [B x M x M].
log_det_l_a_test = tf.reduce_sum(tf.log(tf.matrix_diag_part(l_a_test))) # Scalar.
# [B x M x N*].
l_uu_inv_psi_1_test_transpose = tf.matrix_triangular_solve(l_uu,
tf.transpose(psi_1_test, perm=[0, 2, 1]),
lower=True)
c_test = tf.matrix_triangular_solve(l_a_test, l_uu_inv_psi_1_test_transpose, lower=True) # [B x M x N*].
c_transpose_c_test = tf.squeeze(tf.matmul(c_test, c_test, transpose_a=True),
axis=0) # Squeeze since B=1 so cTc is [N* x N*].
yy_test_transpose = tf.matmul(y_test, y_test, transpose_b=True) # [N* x N*].
f_hat_test = 0.5 * num_test_points * num_dimensions * (tf.reduce_sum(tf.log(beta)) -
np.log(2.0 * np.pi)) - \
num_dimensions * log_det_l_a_test + \
0.5 * num_dimensions * tf.reduce_sum(beta * (tf.trace(l_uu_inv_psi_2_test_inv_transpose) -
psi_0_test)) + \
0.5 * tf.reduce_sum(tf.square(beta) * tf.trace(tf.matmul(c_transpose_c_test, yy_test_transpose))) - \
0.5 * tf.reduce_sum(beta * tf.trace(yy_test_transpose))
# Define KL divergence between q(X_star) and p(X_star).
kl_q_x_test_p_x_test = calculate_kl_divergence_standard_prior(x_mean=x_test_mean, x_covar=x_test_covar)
# Define prediction lower bound.
prediction_lower_bound = f_hat + f_hat_test - kl_q_x_p_x - kl_q_x_test_p_x_test
# Define test log-likelihood (from equation 36 of BGP-LVM journal paper).
test_log_likelihood = f_hat_test - f_hat
return prediction_lower_bound, x_test_mean, x_test_covar, test_log_likelihood
@staticmethod
def predict_missing_data(y_test, use_pca=False):
"""
Predict the unobserved dimensions of partially observed test data.
:param y_test: The observed part of the test output. Must be [N* x Do] with Do < D.
:param use_pca: If True, initialise the test latent means with PCA; otherwise use nearest neighbours plus noise.
:return: The missing data lower bound, the test latent mean and covariance, and the predicted mean and covariance of the unobserved dimensions.
"""
# TODO: Currently assume y_test is first observed_dims of d. Update to be more generic.
num_test_points, num_observed_dims = np.shape(y_test)
assert num_observed_dims < num_dimensions, \
'Observed dimensionality for missing data scenario must be less than the total ' \
'dimensionality of the training data.'
# Slice y_train into observed and unobserved dimensions for the missing data. D = Do + Du.
y_train_observed = tf.slice(y_train, begin=[0, 0], size=[-1, num_observed_dims]) # [N x Do]
y_train_unobserved = tf.slice(y_train, begin=[0, num_observed_dims], size=[-1, -1]) # [N x Du]
# Define variables for q(X_star).
if use_pca:
# Initialise x_test_mean using PCA.
init_values = pca(y_test, num_latent_dimensions=num_latent_dims)
else:
# Initialise x_test_mean with mean from nearest neighbour between training and test sets
# plus some noise.
y_test_nn_indices = nearest_neighbour(y_train_observed, y_test)
init_values = tf.map_fn(lambda index: x_mean[index], y_test_nn_indices, dtype=TF_DTYPE) + \
np.random.normal(scale=0.01, size=(num_test_points, num_latent_dims))
x_test_mean = tf.Variable(initial_value=init_values, dtype=TF_DTYPE, trainable=False) # [N* x Q].
x_test_covar = tf.matrix_diag(create_positive_variable(initial_value=1.0,
shape=(num_test_points, num_latent_dims),
is_trainable=False)) # [N* x Q x Q].
# Define extra f_hat terms for y_test.
psi_0_test = kernel.psi_0(inducing_input=x_u, latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar) # [B x 1].
psi_1_test = kernel.psi_1(inducing_input=x_u, latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar) # [B x N* x M].
psi_2_test = kernel.psi_2(inducing_input=x_u, latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar) # [B x M x M].
l_uu_inv_psi_2_test = tf.matrix_triangular_solve(l_uu, psi_2_test, lower=True) # [B x M x M].
l_uu_inv_psi_2_test_inv_transpose = tf.transpose(
tf.matrix_triangular_solve(l_uu, tf.transpose(l_uu_inv_psi_2_test, perm=[0, 2, 1]), lower=True),
perm=[0, 2, 1]) # [B x M x M].
a_test = beta_b11 * l_uu_inv_psi_2_test_inv_transpose + \
tf.eye(num_inducing_points, batch_shape=[batch_size], dtype=TF_DTYPE) # [B x M x M].
l_a_test = tf.cholesky(a_test) # [B x M x M].
log_det_l_a_test = tf.reduce_sum(tf.log(tf.matrix_diag_part(l_a_test))) # Scalar.
# [B x M x N*].
l_uu_inv_psi_1_test_transpose = tf.matrix_triangular_solve(l_uu,
tf.transpose(psi_1_test, perm=[0, 2, 1]),
lower=True)
c_test = tf.matrix_triangular_solve(l_a_test, l_uu_inv_psi_1_test_transpose, lower=True) # [B x M x N*].
c_transpose_c_test = tf.squeeze(tf.matmul(c_test, c_test, transpose_a=True),
axis=0) # Squeeze since B=1 so cTc is [N* x N*].
yy_test_transpose = tf.matmul(y_test, y_test, transpose_b=True)
f_hat_test = 0.5 * num_test_points * num_observed_dims * (tf.reduce_sum(tf.log(beta)) -
np.log(2.0 * np.pi)) - \
num_observed_dims * log_det_l_a_test + \
0.5 * num_observed_dims * tf.reduce_sum(beta *
(tf.trace(l_uu_inv_psi_2_test_inv_transpose) - psi_0_test)) + \
0.5 * tf.reduce_sum(tf.square(beta) * tf.trace(tf.matmul(c_transpose_c_test, yy_test_transpose))) - \
0.5 * tf.reduce_sum(beta * tf.trace(yy_test_transpose))
# Define KL divergence between q(X_star) and p(X_star).
kl_q_x_test_p_x_test = calculate_kl_divergence_standard_prior(x_mean=x_test_mean, x_covar=x_test_covar)
# Define missing data lower bound.
missing_data_lower_bound = f_hat + f_hat_test - kl_q_x_p_x - kl_q_x_test_p_x_test
# Define extra terms for predicted mean.
c_predict = tf.matrix_triangular_solve(l_a, l_uu_inv_psi_1_test_transpose, lower=True) # [B x M x N*].
# Squeeze since B=1 so c_predict_transpose x c is [N* x N].
predicted_mean = beta * tf.matmul(tf.squeeze(tf.matmul(c_predict, c, transpose_a=True), axis=0),
y_train_unobserved) # [N* x Du].
# Define extra terms for predicted covariance.
g = psi_2_test - tf.matmul(psi_1_test, psi_1_test, transpose_a=True) # [B x M x M].
scale_du = tf.matmul(
tf.matmul(
tf.matmul(
l_uu_inv,
tf.matmul(
l_a_inv,
l_a_inv,
transpose_a=True
),
transpose_a=True
),
l_uu_inv
),
psi_1,
transpose_b=True
) # [B x M x N].
scale_du_yu = tf.matmul(tf.squeeze(scale_du, axis=0), y_train_unobserved) # [M x Du].
# Du-length vector.
yu_variance = tf.square(tf.squeeze(beta, axis=0)) * tf.diag_part(tf.matmul(
scale_du_yu,
tf.matmul(tf.squeeze(g, axis=0), scale_du_yu),
transpose_a=True))
# [Du x N* x N*] so there is a specific covariance for each unobserved dimension.
predicted_covar = tf.expand_dims(tf.expand_dims(yu_variance, axis=-1), axis=-1) + \
(
(psi_0_test + tf.reciprocal(beta) -
tf.trace(
tf.matmul(
tf.matmul(
tf.matmul(
l_uu_inv,
(
tf.eye(num_inducing_points, batch_shape=[batch_size], dtype=TF_DTYPE) -
tf.matmul(l_a_inv, l_a_inv, transpose_a=True)
),
transpose_a=True
),
l_uu_inv
),
psi_2_test
)
)
) *
tf.eye(num_test_points, batch_shape=[batch_size], dtype=TF_DTYPE)
) # [Du x N* x N*].
return missing_data_lower_bound, x_test_mean, x_test_covar, predicted_mean, predicted_covar
@property
def objective(self):
"""
:return: The objective (negative ELBO) to minimise.
"""
return objective
return BayesianGPLVM()
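Throughout f_hat above, explicit inverses of Kuu are avoided by solving twice against its Cholesky factor: l_uu_inv_psi_2_inv_transpose computes the symmetric sandwich L^{-1} Psi_2 L^{-T}, whose trace equals tr(Kuu^{-1} Psi_2). A small NumPy sketch of that identity, with np.linalg.solve standing in for tf.matrix_triangular_solve:

```python
import numpy as np

def cholesky_sandwich(k_uu, psi_2):
    """Compute L^{-1} Psi_2 L^{-T} for k_uu = L L^T without forming an explicit inverse."""
    l = np.linalg.cholesky(k_uu)
    t = np.linalg.solve(l, psi_2)      # L^{-1} Psi_2
    return np.linalg.solve(l, t.T).T   # (L^{-1} (L^{-1} Psi_2)^T)^T = L^{-1} Psi_2 L^{-T}
```

Since tr(L^{-1} Psi_2 L^{-T}) = tr(Psi_2 L^{-T} L^{-1}) = tr(Kuu^{-1} Psi_2), the sandwich can replace the inverse in all the trace terms of the bound while staying numerically stable.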
def manifold_relevance_determination(views_train, num_latent_dims=GP_LVM_DEFAULT_LATENT_DIMENSIONS,
num_inducing_points=GP_LVM_DEFAULT_NUM_INDUCING_POINTS):
"""
Build a Manifold Relevance Determination (MRD) model: a multi-view Bayesian GP-LVM with a shared latent
space and a separate ARD-RBF kernel per view.
:param views_train: A list of V training views, each a numpy array of size [N x Dv].
:param num_latent_dims: The dimensionality Q of the shared latent space.
:param num_inducing_points: The number M of inducing points per view.
:return: A trainable ManifoldRelevanceDetermination object.
"""
# TODO: Validate input.
# Determine tensor dimensions.
num_views = len(views_train)
shapes = np.array([np.shape(v) for v in views_train])
num_samples = [shapes[v][0] for v in range(num_views)]
num_dimensions = [shapes[v][1] for v in range(num_views)]
assert np.size(np.unique(num_samples)) == 1, 'Each view must have the same number of observations.'
num_samples = num_samples[0]
assert 0 < num_latent_dims < np.sum(num_dimensions), \
'Number of latent dimensions must be positive and less than the dimensionality of the observed data.'
assert 0 < num_inducing_points < num_samples, \
'Number of inducing points must be positive and less than the number of observations in the observed data.'
# Create kernels, one per view.
batch_size = 1
gammas = [create_positive_variable(initial_value=GP_INIT_GAMMA,
shape=(batch_size, num_latent_dims),
is_trainable=True)
for _ in range(num_views)] # [V x B x Q].
alphas = [create_positive_variable(initial_value=GP_INIT_ALPHA,
shape=(batch_size, 1),
is_trainable=True)
for _ in range(num_views)] # [V x B x 1].
betas = [create_positive_variable(initial_value=GP_INIT_BETA,
shape=(batch_size, 1),
is_trainable=True)
for _ in range(num_views)] # [V x B x 1].
kernels = [k_ard_rbf(gamma=gammas[v], alpha=alphas[v], beta=betas[v]) for v in range(num_views)]
# Fit latent means using PCA and the inducing inputs as a subset of those with a little noise.
x_init = pca(np.hstack(views_train), num_latent_dimensions=num_latent_dims)
x_mean = tf.Variable(initial_value=x_init, dtype=TF_DTYPE, trainable=True) # [N x Q].
x_covar = tf.matrix_diag(create_positive_variable(initial_value=1.0,
shape=(num_samples, num_latent_dims),
is_trainable=True)) # [N x Q x Q].
# Initialise inducing inputs as a subset of the PCA values calculated for latent means plus some noise.
# There is a set of inducing inputs per view.
x_u_inits = [np.random.permutation(x_init)[:num_inducing_points] +
np.random.normal(loc=0.0, scale=0.01, size=(num_inducing_points, num_latent_dims))
for _ in range(num_views)]
x_us = [tf.Variable(initial_value=x_u_inits[v], dtype=TF_DTYPE, trainable=True)
for v in range(num_views)] # [V x M x Q].
# Define psi-statistics.
psi_0s = [kernels[v].psi_0(inducing_input=x_us[v], latent_input_mean=x_mean, latent_input_covariance=x_covar)
for v in range(num_views)] # [V x B x 1].
psi_1s = [kernels[v].psi_1(inducing_input=x_us[v], latent_input_mean=x_mean, latent_input_covariance=x_covar)
for v in range(num_views)] # [V x B x N x M].
psi_2s = [kernels[v].psi_2(inducing_input=x_us[v], latent_input_mean=x_mean, latent_input_covariance=x_covar)
for v in range(num_views)] # [V x B x M x M].
# Define Kuu and its cholesky decomposition.
k_uus = [kernels[v].covariance_matrix(input_0=x_us[v], input_1=None, include_noise=False, include_jitter=True)
for v in range(num_views)] # [V x B x M x M].
l_uus = [tf.cholesky(k_uus[v]) for v in range(num_views)] # [V x B x M x M].
# Calculate f_hat terms from the evidence lower bound (ELBO). Using stable calculation for f_hat.
f_hat = 0.0
l_as = []
for v in range(num_views):
num_dims = num_dimensions[v]
beta = kernels[v].noise_precision # [B x 1].
beta_b11 = tf.expand_dims(beta, axis=-1) # [B x 1 x 1].
l_uu = l_uus[v] # [B x M x M].
l_uu_inv_psi_2 = tf.matrix_triangular_solve(l_uu, psi_2s[v], lower=True) # [B x M x M].
l_uu_inv_psi_2_inv_transpose = tf.transpose(
tf.matrix_triangular_solve(l_uu, tf.transpose(l_uu_inv_psi_2, perm=[0, 2, 1]), lower=True),
perm=[0, 2, 1]) # [B x M x M].
a = beta_b11 * l_uu_inv_psi_2_inv_transpose + tf.eye(num_inducing_points,
batch_shape=[batch_size],
dtype=TF_DTYPE)
l_a = tf.cholesky(a) # [B x M x M].
l_as.append(l_a)
log_det_l_a = tf.reduce_sum(tf.log(tf.matrix_diag_part(l_a))) # Scalar.
# [B x M x N].
l_uu_inv_psi_1_transpose = tf.matrix_triangular_solve(l_uu,
tf.transpose(psi_1s[v], perm=[0, 2, 1]),
lower=True)
c = tf.matrix_triangular_solve(l_a, l_uu_inv_psi_1_transpose, lower=True) # [B x M x N].
c_transpose_c = tf.squeeze(tf.matmul(c, c, transpose_a=True), axis=0) # Squeeze since B=1 so cTc is [N x N].
yy_transpose = tf.matmul(views_train[v], views_train[v], transpose_b=True) # [N x N].
f_hat += 0.5 * num_samples * num_dims * (tf.reduce_sum(tf.log(beta)) - np.log(2.0 * np.pi)) - \
num_dims * log_det_l_a + \
0.5 * num_dims * tf.reduce_sum(beta * (tf.trace(l_uu_inv_psi_2_inv_transpose) - psi_0s[v])) + \
0.5 * tf.reduce_sum(tf.square(beta) * tf.trace(tf.matmul(c_transpose_c, yy_transpose))) - \
0.5 * tf.reduce_sum(beta * tf.trace(yy_transpose))
# Define KL divergence between q(X) and p(X).
kl_q_x_p_x = calculate_kl_divergence_standard_prior(x_mean=x_mean, x_covar=x_covar)
# Define evidence lower bound (ELBO).
elbo = f_hat - kl_q_x_p_x
# Define objective function.
objective = tf.negative(elbo + tf.reduce_sum([kernels[v].prior_log_likelihood for v in range(num_views)]))
class ManifoldRelevanceDetermination(Trainable):
"""
This class defines a Manifold Relevance Determination object.
"""
@property
def number_of_views(self):
"""
:return: The number of views V.
"""
return num_views
@property
def kernels(self):
"""
:return: The list of per-view kernels.
"""
return kernels
@property
def ard_weights(self):
"""
:return: The ARD weights of each view's kernel.
"""
return [kernel.hyperparameters[KernelHyperparameters.ARD_WEIGHTS] for kernel in kernels]
@property
def signal_variance(self):
"""
:return: The signal variance of each view's kernel.
"""
return [kernel.hyperparameters[KernelHyperparameters.SIGNAL_VARIANCE] for kernel in kernels]
@property
def noise_precision(self):
"""
:return: The noise precision of each view's kernel.
"""
return [kernel.noise_precision for kernel in kernels]
@property
def inducing_input(self):
"""
:return: The per-view inducing inputs. [V x M x Q].
"""
return x_us
@property
def q_x(self):
"""
:return: The mean and covariance of the shared variational posterior q(X).
"""
return x_mean, x_covar
@staticmethod
def predict_new_latent_variables(views_test, use_pca=False):
"""
Optimise the variational posterior q(X*) over the latent variables of new, fully observed test views.
:param views_test: A list of V test views, each of size [N* x Dv].
:param use_pca: If True, initialise the test latent means with PCA; otherwise use nearest neighbours plus noise.
:return: The prediction lower bound and the test latent mean and covariance.
"""
# Determine tensor dimensions.
assert num_views == len(views_test), \
'The number of test views must be the same as the number of training views.'
test_shapes = np.array([np.shape(v) for v in views_test])
num_test_points = [test_shapes[v][0] for v in range(num_views)]
num_test_dimensions = [test_shapes[v][1] for v in range(num_views)]
assert np.size(np.unique(num_test_points)) == 1, 'Each view must have the same number of test points.'
num_test_points = num_test_points[0]
assert num_dimensions == num_test_dimensions, \
'Observed dimensionality for prediction must be equal to the dimensionality of the training data ' \
'for each view.'
# Define variables for q(X_star).
if use_pca:
# Initialise x_test_mean using PCA.
init_values = pca(np.hstack(views_test), num_latent_dimensions=num_latent_dims)
else:
# Initialise x_test_mean with mean from nearest neighbour between training and test sets
# plus some noise.
y_test_nn_indices = nearest_neighbour(np.hstack(views_train), np.hstack(views_test))
init_values = tf.map_fn(lambda index: x_mean[index], y_test_nn_indices, dtype=TF_DTYPE) + \
np.random.normal(scale=0.01, size=(num_test_points, num_latent_dims))
x_test_mean = tf.Variable(initial_value=init_values, dtype=TF_DTYPE, trainable=False) # [N* x Q].
x_test_covar = tf.matrix_diag(create_positive_variable(initial_value=1.0,
shape=(num_test_points, num_latent_dims),
is_trainable=False)) # [N* x Q x Q].
# Define psi-statistics for test values.
psi_0s_test = [kernels[v].psi_0(inducing_input=x_us[v],
latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar)
for v in range(num_views)] # [V x B x 1].
psi_1s_test = [kernels[v].psi_1(inducing_input=x_us[v],
latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar)
for v in range(num_views)] # [V x B x N* x M].
psi_2s_test = [kernels[v].psi_2(inducing_input=x_us[v],
latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar)
for v in range(num_views)] # [V x B x M x M].
# Calculate f_hat terms for prediction.
f_hat_test = 0.0
for v in range(num_views):
num_dims = num_dimensions[v]
beta = kernels[v].noise_precision # [B x 1].
beta_b11 = tf.expand_dims(beta, axis=-1) # [B x 1 x 1].
l_uu = l_uus[v] # [B x M x M].
l_uu_inv_psi_2_test = tf.matrix_triangular_solve(l_uu, psi_2s_test[v], lower=True) # [B x M x M].
l_uu_inv_psi_2_test_inv_transpose = tf.transpose(
tf.matrix_triangular_solve(l_uu, tf.transpose(l_uu_inv_psi_2_test, perm=[0, 2, 1]), lower=True),
perm=[0, 2, 1]) # [B x M x M].
a_test = beta_b11 * l_uu_inv_psi_2_test_inv_transpose + tf.eye(num_inducing_points,
batch_shape=[batch_size],
dtype=TF_DTYPE) # [B x M x M].
l_a_test = tf.cholesky(a_test) # [B x M x M].
log_det_l_a_test = tf.reduce_sum(tf.log(tf.matrix_diag_part(l_a_test))) # Scalar.
# [B x M x N*].
l_uu_inv_psi_1_test_transpose = tf.matrix_triangular_solve(l_uu,
tf.transpose(psi_1s_test[v], perm=[0, 2, 1]),
lower=True)
# [B x M x N*].
c_test = tf.matrix_triangular_solve(l_a_test, l_uu_inv_psi_1_test_transpose, lower=True)
c_transpose_c_test = tf.squeeze(tf.matmul(c_test, c_test, transpose_a=True),
axis=0) # Squeeze since B=1 so cTc is [N* x N*].
yy_test_transpose = tf.matmul(views_test[v], views_test[v], transpose_b=True) # [N* x N*].
f_hat_test += 0.5 * num_test_points * num_dims * (tf.reduce_sum(tf.log(beta)) - np.log(2.0 * np.pi)) - \
num_dims * log_det_l_a_test + \
0.5 * num_dims * tf.reduce_sum(beta * (tf.trace(l_uu_inv_psi_2_test_inv_transpose) -
psi_0s_test[v])) + \
0.5 * tf.reduce_sum(tf.square(beta) * tf.trace(tf.matmul(c_transpose_c_test, yy_test_transpose))) -\
0.5 * tf.reduce_sum(beta * tf.trace(yy_test_transpose))
# Define KL divergence between q(X_star) and p(X_star).
kl_q_x_test_p_x_test = calculate_kl_divergence_standard_prior(x_mean=x_test_mean, x_covar=x_test_covar)
# Define prediction lower bound.
prediction_lower_bound = f_hat + f_hat_test - kl_q_x_p_x - kl_q_x_test_p_x_test
return prediction_lower_bound, x_test_mean, x_test_covar
@staticmethod
def predict_missing_data(views_test, use_pca=False):
"""
Predict the unobserved views of partially observed test data.
:param views_test: A list of the first Vo observed test views (Vo < V), each of size [N* x Dv].
:param use_pca: If True, initialise the test latent means with PCA; otherwise use nearest neighbours plus noise.
:return: The prediction lower bound, the test latent mean and covariance, and the per-view predicted means and covariances for the unobserved views.
"""
# TODO: Currently assume views_test is first views with later ones missing. Update to be more generic.
# Determine tensor dimensions.
num_test_views = len(views_test)
assert 0 < num_test_views < num_views, \
'The number of test views for the missing data scenario must be less than the number of training views.'
test_shapes = np.array([np.shape(v) for v in views_test])
num_test_points = [test_shapes[v][0] for v in range(num_test_views)]
num_test_dimensions = [test_shapes[v][1] for v in range(num_test_views)]
assert np.size(np.unique(num_test_points)) == 1, 'Each view must have the same number of test points.'
num_test_points = num_test_points[0]
assert num_dimensions[:num_test_views] == num_test_dimensions, \
'Observed dimensionality for prediction must be equal to the dimensionality of the training data ' \
'for each observed view.'
# Get slice of views_train for observed and unobserved views. V = Vo + Vu.
views_train_observed = views_train[:num_test_views] # [Vo x N x Dv].
views_train_unobserved = views_train[num_test_views:] # [Vu x N x Dv]
# Define variables for q(X_star).
if use_pca:
# Initialise x_test_mean using PCA.
init_values = pca(np.hstack(views_test), num_latent_dimensions=num_latent_dims)
else:
# Initialise x_test_mean with mean from nearest neighbour between training and test sets
# plus some noise.
y_test_nn_indices = nearest_neighbour(np.hstack(views_train_observed), np.hstack(views_test))
init_values = tf.map_fn(lambda index: x_mean[index], y_test_nn_indices, dtype=TF_DTYPE) + \
np.random.normal(scale=0.01, size=(num_test_points, num_latent_dims))
x_test_mean = tf.Variable(initial_value=init_values, dtype=TF_DTYPE, trainable=False) # [N* x Q].
x_test_covar = tf.matrix_diag(create_positive_variable(initial_value=1.0,
shape=(num_test_points, num_latent_dims),
is_trainable=False)) # [N* x Q x Q].
# Define psi-statistics for test values.
psi_0s_test = [kernels[v].psi_0(inducing_input=x_us[v],
latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar)
for v in range(num_views)] # [V x B x 1].
psi_1s_test = [kernels[v].psi_1(inducing_input=x_us[v],
latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar)
for v in range(num_views)] # [V x B x N* x M].
psi_2s_test = [kernels[v].psi_2(inducing_input=x_us[v],
latent_input_mean=x_test_mean,
latent_input_covariance=x_test_covar)
for v in range(num_views)] # [V x B x M x M].
# Calculate f_hat terms for prediction.
f_hat_test = 0.0
for v in range(num_test_views):
num_dims = num_dimensions[v]
beta = kernels[v].noise_precision # [B x 1].
beta_b11 = tf.expand_dims(beta, axis=-1) # [B x 1 x 1].
l_uu = l_uus[v] # [B x M x M].
l_uu_inv_psi_2_test = tf.matrix_triangular_solve(l_uu, psi_2s_test[v], lower=True) # [B x M x M].
l_uu_inv_psi_2_test_inv_transpose = tf.transpose(
tf.matrix_triangular_solve(l_uu, tf.transpose(l_uu_inv_psi_2_test, perm=[0, 2, 1]), lower=True),
perm=[0, 2, 1]) # [B x M x M].
a_test = beta_b11 * l_uu_inv_psi_2_test_inv_transpose + tf.eye(num_inducing_points,
batch_shape=[batch_size],
dtype=TF_DTYPE) # [B x M x M].
l_a_test = tf.cholesky(a_test) # [B x M x M].
log_det_l_a_test = tf.reduce_sum(tf.log(tf.matrix_diag_part(l_a_test))) # Scalar.
# [B x M x N*].
l_uu_inv_psi_1_test_transpose = tf.matrix_triangular_solve(l_uu,
tf.transpose(psi_1s_test[v], perm=[0, 2, 1]),
lower=True)
# [B x M x N*].
c_test = tf.matrix_triangular_solve(l_a_test, l_uu_inv_psi_1_test_transpose, lower=True)
c_transpose_c_test = tf.squeeze(tf.matmul(c_test, c_test, transpose_a=True),
axis=0) # Squeeze since B=1 so cTc is [N* x N*].
yy_test_transpose = tf.matmul(views_test[v], views_test[v], transpose_b=True) # [N* x N*].
f_hat_test += 0.5 * num_test_points * num_dims * (tf.reduce_sum(tf.log(beta)) - np.log(2.0 * np.pi)) - \
num_dims * log_det_l_a_test + \
0.5 * num_dims * tf.reduce_sum(beta * (tf.trace(l_uu_inv_psi_2_test_inv_transpose) -
psi_0s_test[v])) + \
0.5 * tf.reduce_sum(tf.square(beta) * tf.trace(tf.matmul(c_transpose_c_test, yy_test_transpose))) -\
0.5 * tf.reduce_sum(beta * tf.trace(yy_test_transpose))
# Define KL divergence between q(X_star) and p(X_star).
kl_q_x_test_p_x_test = calculate_kl_divergence_standard_prior(x_mean=x_test_mean, x_covar=x_test_covar)
# Define prediction lower bound.
prediction_lower_bound = f_hat + f_hat_test - kl_q_x_p_x - kl_q_x_test_p_x_test
# Define extra terms for predicted mean and covariance.
predicted_means = []
predicted_covars = []
for v in range(num_test_views, num_views):
l_uu_inv_psi_1_test_transpose = tf.matrix_triangular_solve(l_uus[v],
tf.transpose(psi_1s_test[v], perm=[0, 2, 1]),
lower=True)
c_predict = tf.matrix_triangular_solve(l_as[v], l_uu_inv_psi_1_test_transpose, lower=True) # [B x M x N*].
predicted_mean = kernels[v].noise_precision * tf.matmul(tf.squeeze(
tf.matmul(c_predict, c, transpose_a=True), axis=0), views_train[v]) # [N* x Dv].
predicted_means.append(predicted_mean) # [N* x Dv].
l_uu_inv = tf.matrix_triangular_solve(l_uus[v], tf.eye(num_inducing_points,
batch_shape=[batch_size],
dtype=TF_DTYPE))
l_a_inv = tf.matrix_triangular_solve(l_as[v], tf.eye(num_inducing_points,
batch_shape=[batch_size],
dtype=TF_DTYPE))
g = psi_2s_test[v] - tf.matmul(psi_1s_test[v], psi_1s_test[v], transpose_a=True) # [B x M x M].
scale_du = tf.matmul(
tf.matmul(
tf.matmul(
l_uu_inv,
tf.matmul(
l_a_inv,
l_a_inv,
transpose_a=True
),
transpose_a=True
),
l_uu_inv
),
psi_1s[v],
transpose_b=True
) # [B x M x N].
scale_du_yu = tf.matmul(tf.squeeze(scale_du, axis=0), views_train[v]) # [M x Dv].
# Dv-length vector.
yu_variance = tf.square(tf.squeeze(kernels[v].noise_precision, axis=0)) * tf.diag_part(
tf.matmul(scale_du_yu, tf.matmul(tf.squeeze(g, axis=0), scale_du_yu), transpose_a=True))
# [Dv x N* x N*] so there is a specific covariance for each unobserved dimension.
predicted_covar = tf.expand_dims(tf.expand_dims(yu_variance, axis=-1), axis=-1) + \
((psi_0s_test[v] + tf.reciprocal(kernels[v].noise_precision) -
tf.trace(
tf.matmul(
tf.matmul(
tf.matmul(l_uu_inv,
(tf.eye(num_inducing_points,
batch_shape=[batch_size],
dtype=TF_DTYPE) -
tf.matmul(l_a_inv, l_a_inv, transpose_a=True)),
transpose_a=True),
l_uu_inv),
psi_2s_test[v])
)
) * tf.eye(num_test_points, batch_shape=[batch_size], dtype=TF_DTYPE))
predicted_covars.append(predicted_covar)
return prediction_lower_bound, x_test_mean, x_test_covar, predicted_means, predicted_covars
@property
def objective(self):
"""
The optimisation objective for the model.

:return: A scalar tensor holding the objective value.
"""
return objective
return ManifoldRelevanceDetermination()
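The prediction code above repeatedly forms terms such as A = beta * L_uu^{-1} Psi_2 L_uu^{-T} + I through `tf.cholesky` and `tf.matrix_triangular_solve` rather than explicit matrix inverses. The fragment below is an illustrative NumPy/SciPy sketch of that pattern on toy inputs, not part of the model; `build_a`, `k_uu`, and `psi_2` are hypothetical names.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular


def build_a(k_uu, psi_2, beta):
    """Form A = beta * L^{-1} Psi_2 L^{-T} + I without explicit inverses."""
    l_uu = cholesky(k_uu, lower=True)                   # K_uu = L L^T.
    tmp = solve_triangular(l_uu, psi_2, lower=True)     # L^{-1} Psi_2.
    core = solve_triangular(l_uu, tmp.T, lower=True).T  # L^{-1} Psi_2 L^{-T} (Psi_2 symmetric).
    return beta * core + np.eye(k_uu.shape[0])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m = 4
    q = rng.standard_normal((m, m))
    k_uu = q @ q.T + m * np.eye(m)   # SPD stand-in for the inducing-point kernel matrix.
    p = rng.standard_normal((m, m))
    psi_2 = p @ p.T                  # SPD stand-in for the Psi_2 statistic.
    beta = 2.0
    a = build_a(k_uu, psi_2, beta)
    # Cross-check against the naive formulation with an explicit inverse.
    l_inv = np.linalg.inv(cholesky(k_uu, lower=True))
    a_direct = beta * l_inv @ psi_2 @ l_inv.T + np.eye(m)
    assert np.allclose(a, a_direct)
```

Triangular solves are both cheaper and numerically better conditioned than inverting K_uu directly, which is why the TensorFlow code keeps only the Cholesky factors `l_uus`/`l_as` around.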